Analysis of the Cloud Computing Algorithm with Cryptography-Based Identity Management

Enhancing Data Security in Cloud Computing

by Sonia Narang*, Dr. Kalpana Midha,

- Published in Journal of Advances and Scholarly Researches in Allied Education, E-ISSN: 2230-7540

Volume 14, Issue No. 2, Jan 2018, Pages 1356 - 1363 (8)

Published by: Ignited Minds Journals


ABSTRACT

Cloud computing is an approach to delivering IT services on demand and on a pay-per-use basis, and it can store enormous amounts of data. Users are chiefly concerned with how to protect their own sensitive data from being stolen, and when they use encryption algorithms such as the RSA Encryption Algorithm or the Triple DES Encryption Algorithm, the time needed to encrypt or decrypt data becomes longer. RSA is a cryptographic technique used to encrypt and decrypt data. It is an asymmetric cryptographic algorithm, which means that two separate keys are used: one is a public key and the other is a secret (private) key. RSA is generally a slow encryption algorithm. In contrast, Data Centric Security (DCS) is an emerging approach that aims to give data owners full control of their data security from within the data itself, throughout the data's lifecycle in the cloud.

KEYWORD

cloud computing, algorithm, cryptography, identity management, data-centric security, IT services, encryption algorithms, RSA encryption algorithm, Triple DES encryption algorithm, data storage

1.1 INTRODUCTION

Cloud computing offers a pool of shared resources, including additional storage space, infrastructure, computing resources, and integrated business and client applications. Because of these benefits, organizations are increasingly moving their data and services to the cloud. There is therefore a need to protect that data against unauthorized access, modification, denial of service, and so on. The security objectives for information cover three points in particular: availability, confidentiality, and integrity. The security of data in the cloud is maintained through cryptography. Cryptography today is viewed as a combination of three kinds of algorithms: (1) symmetric-key algorithms, (2) asymmetric-key algorithms, and (3) hashing. Integrity of information is guaranteed by hashing algorithms. In essence, data cryptography scrambles the content of data, such as text documents, pictures, photographs, video and audio recordings, so as to render the data unreadable, invisible or meaningless during transmission or storage. Cryptography's fundamental aim is to keep information secure from intruders. The reverse process of recovering the original information from encrypted data is decryption. Both symmetric-key and asymmetric-key algorithms can be used to encrypt data in cloud storage. Intrusion Detection Systems (IDS) also play a significant role in information security: an IDS distinguishes intrusive activity from normal behaviour, and monitors network packets and traffic to discover attacks and intrusions. The concept of Identity Management (IdM) systems was introduced to cope with data and user security, where the data and the user reside within the same network. Services are provided according to the consumer's requirements by the Cloud Service Provider (CSP). Authentication is carried out when the consumer provides sensitive information to the CSP. Data can leak or fall into the wrong hands through an unauthorized consumer, so IdM is of the utmost importance for ensuring the integrity of user information. Additional security is implemented by using cryptographic methods to encrypt and decrypt the user's credential details. In this research, security has been enhanced by using various cryptographic techniques. These techniques work with federation standard protocols such as OpenID, OAuth and Security Assertion Markup Language (SAML) to establish security, privacy and data confidentiality, and also to maintain trust between the consumer and the service providers (SPs).
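To make the hashing role described above concrete, the short sketch below (Python standard library only; the payload and variable names are illustrative, not from the paper) computes a SHA-256 digest before data is uploaded and re-checks it after retrieval to detect tampering.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the hexadecimal SHA-256 digest of the given data."""
    return hashlib.sha256(data).hexdigest()

# The data owner records a digest before placing the data in cloud storage.
payload = b"sensitive customer report"
stored_digest = sha256_digest(payload)

# After retrieval, the consumer recomputes the digest to verify integrity.
retrieved = b"sensitive customer report"
if sha256_digest(retrieved) != stored_digest:
    raise ValueError("integrity check failed: data was modified in transit or storage")
print("Integrity verified:", stored_digest)
```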

1.2 IDENTITY MANAGEMENT

In the cloud, services are accessed using user identities. These user identities are often compromised through unauthorized access, which reduces security in the cloud [1] and creates an issue of trust between the user and the SP. This can be overcome with the help of FIM, which uses OpenID, OAuth and SAML technology [1] to establish authorization and authentication during the service-request process. In order to access a service, the user must first be authenticated; only then does access to the cloud resources and services take place. An entity can imitate a legitimate user and access a cloud service; this can lead to a number of entities acquiring the cloud resources, making a service unavailable to the actual user. The user may also tend to exceed his permitted boundary when using services in the cloud.

Figure 1.1 Federated Identity Management

Federated Identity Management manages user identities with the help of authentication and access tokens. When a user wants to access resources from another organization, their details and identities need to be verified for the resource request, and the user information must be authenticated. The federation members analyze user attributes such as name, title and role, which are used to determine the policies and the related access-permission details. The user credentials are thus authorized in the home organization, and the federated members provide the details about the user to the other organization for accessing the resources. In this manner, privacy and security are maintained in a user-centric fashion.

1.3 IMPORTANCE OF CRYPTOGRAPHY-BASED IDENTITY MANAGEMENT IN CLOUD SECURITY

Some of the challenges faced in the cloud include interoperability, matters related to security, feasibility, and so on. Some suggestions and solutions to these challenges have been proposed through the use of encryption. One of the major issues of concern to those who wish to keep their data and processes in an external source is security. Ten obstacles to the improvement of cloud computing have been highlighted, along with ways to overcome them. Confidentiality of data is one of these hindrances, and the suggestion for it was data encryption as a possible resolution. Identity management plays a significant role in cloud computing because FIM:
• eliminates the need to maintain user credentials separately for each resource;
• provides the IT industry with appropriate authorization and authentication mechanisms;
• provides a reliable resource-access mechanism across different locations;
• eliminates replication of user credential databases;
• increases security, confidentiality and privacy while accessing the resources.

SECURITY ALGORITHMS

The major function of these security algorithms is to use cryptography to secure any data or information that is transmitted over the network. Cloud data security is classified into three different categories; one of these, privacy preservation, treats the privacy of personal and important cloud information as crucial because cloud servers are not trusted [3]. The term cryptography refers to the process of transforming plaintext into ciphertext, where plaintext is ordinary language that is easily readable by people and ciphertext is data that has been encrypted and cannot be read by a human, or even a computer, unless it is decrypted using the correct cipher and key. These cloud storage security algorithms can be broadly classified into symmetric and asymmetric algorithms, as shown in Figure 1.2 below.

Figure 1.2 Classification of Cloud Storage Security Algorithms

1.3.1 SYMMETRIC ALGORITHMS

Symmetric algorithms are otherwise known as secret-key or symmetric-key encryption. Here, encryption and decryption of data are carried out using the same key, which must therefore be kept secret. In block ciphers, data is encrypted one fixed-size block at a time.

A. DES

DES stands for Data Encryption Standard. It is a block cipher that was developed in the 1970s and adopted in 1977 as the first cipher standard published by NIST (the National Institute of Standards and Technology, then the National Bureau of Standards). DES is a symmetric algorithm that operates on 64-bit blocks with a 64-bit key, of which 56 bits are effective (the remaining 8 bits are parity bits). When 64 bits of plaintext are fed to DES, 64 bits of ciphertext are produced as output. The algorithm used for encryption and decryption is almost the same, with only minor variations.

B. Blowfish

This widely used method was created by Bruce Schneier in 1993. Blowfish uses a variable-length key of 32 to 448 bits to encrypt 64-bit blocks; its block size is 64 bits. No attack against this algorithm has yet been reported as successful. The method offers higher efficiency than comparable ciphers. Blowfish is not ideal for systems where keys change frequently, such as packet switching; it is more appropriate where the key is not modified regularly, such as encrypting a communications link.
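A minimal sketch of Blowfish in CBC mode is given below. It assumes the PyCryptodome package and a 256-bit key (both assumptions; the paper names no implementation), and pads the message to the 64-bit block size described above.

```python
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(32)          # 256 bits, within Blowfish's 32-448 bit range
plaintext = b"cloud record to protect"

# Encrypt: a random IV is generated automatically and must travel with the ciphertext.
enc = Blowfish.new(key, Blowfish.MODE_CBC)
ciphertext = enc.encrypt(pad(plaintext, Blowfish.block_size))   # block size is 8 bytes (64 bits)

# Decrypt with the same secret key and the IV used during encryption.
dec = Blowfish.new(key, Blowfish.MODE_CBC, enc.iv)
assert unpad(dec.decrypt(ciphertext), Blowfish.block_size) == plaintext
```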

C. RC2,4,5,6

These are block ciphers with variable block sizes and key sizes created by Ronald Rivest (RSA Laboratories). Without knowing the original size of the plaintext, attackers find it difficult to decode captured data. These algorithms have commonly been used in the SSL protocol to secure websites.

D. 3DES

Triple DES is a chained encryption method that essentially increases the key strength of DES. It applies the DES encryption algorithm three times in sequence using three different keys, giving an aggregate key size of 168 bits (3 × 56).
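The following sketch shows the three-key (168-bit) Triple DES usage described above, again assuming PyCryptodome; `adjust_key_parity` simply sets the DES parity bits and rejects degenerate keys.

```python
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

# Three independent 56-bit keys packed into 24 bytes (3 x 64-bit key words).
while True:
    try:
        key = DES3.adjust_key_parity(get_random_bytes(24))
        break
    except ValueError:          # rejected if the three sub-keys coincide
        continue

enc = DES3.new(key, DES3.MODE_CBC)
ciphertext = enc.encrypt(pad(b"credentials for the CSP", DES3.block_size))

dec = DES3.new(key, DES3.MODE_CBC, enc.iv)
print(unpad(dec.decrypt(ciphertext), DES3.block_size))
```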

E. AES

AES is the Advanced Encryption Standard. It is the encryption standard proposed by NIST to be used in place of DES. The only practical attack that can break this cipher is a brute-force attack, in which an attacker tries to defeat the encryption by checking all possible key combinations. Like DES, AES is a block cipher. The key length is 128, 192 or 256 bits, and 128-bit data blocks are encrypted in 10, 12 or 14 rounds depending on the key size. The advantage of this algorithm is that it is simple and scalable, and it can be deployed on many platforms, including small devices.

1.3.2 ASYMMETRIC ALGORITHMS

Asymmetric algorithms are also referred to as public key cryptography. Two different keys, a public key and a private key, are used in this encryption approach. The public key is used for data encryption and is known to everyone, while the private key is used for decryption and is known only to the owner. The principal advantage of public key cryptography is that it does not face the key-distribution problem [4]. The drawback, however, is that these algorithms are somewhat slower than symmetric algorithms because of the large amount of computation their operation requires.
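Before turning to the individual asymmetric schemes, a brief sketch of the symmetric AES usage just described (128-bit blocks, key of 128/192/256 bits) may be helpful; it assumes PyCryptodome and a 256-bit key, neither of which the paper specifies.

```python
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(32)        # 256-bit key -> 14 internal rounds

enc = AES.new(key, AES.MODE_CBC)  # random IV generated automatically
ciphertext = enc.encrypt(pad(b"pay-per-use billing data", AES.block_size))

dec = AES.new(key, AES.MODE_CBC, enc.iv)
print(unpad(dec.decrypt(ciphertext), AES.block_size))
```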

A. RSA

The RSA algorithm was created by Ron Rivest, Adi Shamir and Leonard Adleman in 1977. In this scheme, the public key is shared with anyone, who can use it to encrypt the message to be sent, while the private key is kept secret by the owner and not shared with anybody. In this algorithm, plaintext and ciphertext are represented as integers between 0 and n-1 for some modulus n. The plaintext is encrypted in blocks using the formula C = M^e mod n, where C is the ciphertext and M is the plaintext. Likewise, the plaintext is recovered by applying the formula M = C^d mod n, where d is the private key.
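A textbook-sized worked example of these two formulas is shown below (pure Python, illustrative primes only, no padding, so it is not secure for real use).

```python
# Toy RSA mirroring C = M^e mod n and M = C^d mod n.
p, q = 61, 53
n = p * q                   # modulus: 3233
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent, coprime with phi
d = pow(e, -1, phi)         # private exponent: 2753 (modular inverse, Python 3.8+)

M = 65                      # plaintext as an integer in [0, n-1]
C = pow(M, e, n)            # encryption -> 2790
assert pow(C, d, n) == M    # decryption recovers the plaintext
print("ciphertext:", C)
```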

B. DSA

The Digital Signature Algorithm (DSA) was recommended by NIST for use in its Digital Signature Standard (DSS) in 1991 and was adopted as FIPS 186 (Federal Information Processing Standard) in 1993. The latest expanded edition was published as FIPS 186-4 in 2013. Key generation has two steps. The first step is the choice of the parameter set, which can be shared by different users of the system. The second and final step is the calculation of the private and public keys. It is essential that the per-signature random value k has sufficient entropy and remains secret and unique; a failure of any of these three conditions can reveal the entire private key to the attacker.
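A minimal signing and verification sketch using PyCryptodome's FIPS 186-3 DSS implementation is given below (library choice and message are assumptions); note that the library generates the per-signature value k internally, which addresses the secrecy and uniqueness requirement mentioned above.

```python
from Crypto.PublicKey import DSA
from Crypto.Signature import DSS
from Crypto.Hash import SHA256

key = DSA.generate(2048)                    # domain parameters + private/public key pair
message = SHA256.new(b"cloud access request")

signer = DSS.new(key, 'fips-186-3')         # per-signature k is drawn internally
signature = signer.sign(message)

verifier = DSS.new(key.publickey(), 'fips-186-3')
verifier.verify(message, signature)         # raises ValueError if the signature is invalid
print("signature verified")
```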

C. Diffie-Hellman Key Exchange (D-H)

This method was developed by Whitfield Diffie and Martin Hellman. It is used to share cryptographic keys safely over a public (insecure) network: two users can derive a common secret key without ever transmitting that key. It needs two large numbers, one of which is a prime P and the other a primitive root G of P.
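A toy numeric walk-through of the exchange is shown below; the tiny P and G are purely for illustration, since a real deployment would use a prime of 2048 bits or more.

```python
P, G = 23, 5                 # public: prime modulus and a primitive root of P

a, b = 6, 15                 # private values chosen by the two parties
A = pow(G, a, P)             # first party publishes A = G^a mod P  -> 8
B = pow(G, b, P)             # second party publishes B = G^b mod P -> 19

# Each side combines its own private value with the other's public value.
shared_1 = pow(B, a, P)      # (G^b)^a mod P
shared_2 = pow(A, b, P)      # (G^a)^b mod P
assert shared_1 == shared_2 == 2   # identical shared secret, never transmitted
```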

D. ElGamal Encryption

ElGamal is an asymmetric encryption algorithm that builds on the Diffie-Hellman key exchange. It was proposed in 1985 by Taher Elgamal. ElGamal encryption consists of three components: the key generator, the encryption algorithm, and the decryption algorithm. This procedure can add an additional layer of security by asymmetrically encrypting keys that were previously used for symmetric encryption.
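The three components can be illustrated with the toy sketch below (tiny parameters, no padding; all values are made up for demonstration).

```python
import random

# Key generation: public (p, g, y), private x.
p, g = 467, 2
x = 127
y = pow(g, x, p)

# Encryption of an integer message m with a fresh ephemeral key k.
m = 100
k = random.randrange(1, p - 1)
c1 = pow(g, k, p)
c2 = (m * pow(y, k, p)) % p

# Decryption: m = c2 * (c1^x)^(-1) mod p  (modular inverse via Python 3.8+ pow).
s = pow(c1, x, p)
assert (c2 * pow(s, -1, p)) % p == m
```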

1.4 CRYPTOGRAPHY

Cryptography is defined as the art of encoding data using a key so that only authorized users can interpret and access the data [5]. That is, plain data should be encrypted into cipher data before being transferred, in order to meet security requirements. On the receiving side, the data is decrypted so that it is ready to be processed by the target application. In cryptography, encryption is the process of transforming data using an algorithm to make it unreadable to anyone except those possessing special knowledge, usually referred to as a key. The result of this process is encrypted data. In many contexts, the word encryption also implicitly refers to the reverse process, decryption, which makes the encrypted data readable again.

Figure 1.3 Basic cryptographic system

1.5 TYPES OF CRYPTOGRAPHY

Cryptography can be broadly divided into two groups: private (secret) key cryptography and public key cryptography. The main topics are:
1. Private Key Cryptography
2. Key Distribution and Management
3. Public Key Cryptography Techniques
a. Rivest, Shamir, Adleman (RSA)
b. Diffie-Hellman Key Exchange
c. Elliptic Curve Cryptography (ECC)
4. Digital Signature

1. NETWORK INTRUSION DETECTION SYSTEMS

NIDS is an intrusion detection system that aims to identify malicious activity, such as DoS attacks, device scans, or attempts to hack machines, by monitoring network traffic. The NIDS inspects incoming packets and helps to detect unusual patterns. Many attacks may also originate from inside the network or network segment being monitored and are therefore not seen as incoming traffic. Network intrusion detection systems also often work with other systems; for example, they may update the blacklists of some firewalls with the IP addresses of devices used by crackers [6].

2. HOST INTRUSION DETECTION SYSTEMS

HIDS is designed to examine only one host or computer at a time; this is the main difference that distinguishes it from NIDS. The HIDS is a program or agent that is installed like any other software package on a host computer, except that it runs in the background alongside whatever other applications or processes are active at the time. Like an NIDS, a HIDS can be configured to examine all activities on a computer, from failed log-in attempts to individual keystrokes, in order to build a comprehensive real-time picture of the activity on that host [7].

2.2.1 Cryptographic Algorithms Considered

Some of the cryptographic algorithms considered for the proposed systems were mainly TDES, HBE and Fuzzy Based Blowfish algorithm. These are explained below.

2.2.1.1 Triple DES

The Triple Data Encryption Standard (Triple DES) is a symmetric-key block cipher that applies the DES encryption three times to each data block. This cryptographic technique uses three different keys of 56 bits each, which are varied during the process to obtain additional security.

2.2.1.2 Hash Based Encryption Algorithm

Hash based Encryption (HBE) is a block-cipher encryption algorithm used to ensure security in a cloud environment [8]. The HBE algorithm uses 128-bit plaintext blocks and keys of different lengths, namely 128, 192 and 256 bits. The algorithm utilizes a key-encryption protocol with a poly message during the authorization and authentication process in the public key environment. In addition, the algorithm has four different steps per round, namely Replace Bytes (SubBytes), Shift Rows, Mix Columns, and AddRoundKey. An initial AddRoundKey step is followed by nine rounds, and every round executes these four steps, which assist with the authorization and authentication phase. The Hash based Encryption algorithm can thus be used to encrypt user credential details using these four stages together with the S-box operation. This process can be repeated for all the user identities, which improves the security, privacy and confidentiality of the user credential values. The encryption algorithm thereby enhances the authentication and authorization process [9].
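The paper does not give a complete specification of HBE, so the sketch below is only one plausible interpretation: a key is derived by hashing a shared secret (the "hash-based" part) and the credential is then encrypted with AES-128, whose SubBytes/ShiftRows/MixColumns/AddRoundKey rounds match the four steps named above. The library, function names and inputs are all assumptions.

```python
import hashlib
from Crypto.Cipher import AES
from Crypto.Util.Padding import pad, unpad

def hbe_encrypt(credential: bytes, secret: bytes):
    """Hash-then-encrypt sketch: SHA-256 of the secret yields a 128-bit key,
    then AES-128 (SubBytes/ShiftRows/MixColumns/AddRoundKey rounds) runs in CBC mode."""
    key = hashlib.sha256(secret).digest()[:16]
    cipher = AES.new(key, AES.MODE_CBC)
    return cipher.iv, cipher.encrypt(pad(credential, AES.block_size))

def hbe_decrypt(iv: bytes, ciphertext: bytes, secret: bytes) -> bytes:
    key = hashlib.sha256(secret).digest()[:16]
    cipher = AES.new(key, AES.MODE_CBC, iv)
    return unpad(cipher.decrypt(ciphertext), AES.block_size)

iv, ct = hbe_encrypt(b"user:alice;role:consumer", b"federation shared secret")
print(hbe_decrypt(iv, ct, b"federation shared secret"))
```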

2.2.1.3 Fuzzy Based Blowfish Encryption Algorithm

The Fuzzy based Blowfish Algorithm (FBFE) is a highly secure cryptographic technique that uses linguistic variables and a fuzzy function. The FBFE algorithm has three different stages, namely setup, encryption and decryption, operating on 64-bit plaintext blocks with a variable key length. The plaintext and key are combined in an XOR operation through the S-box function, and the fuzzy variables and function are used to identify the encryption and decryption code. This process is repeated for the next 16 rounds, which are used to establish security in the cloud environment.

2.3 IDENTITY MANAGEMENT MODELS

Identity management is of utmost importance in the present technological scenario. People are very concerned about how their personal data will be used whenever they log on to, or create identities for, various applications. Identity management can follow different types of models: it can be organised in a centralised manner, in a federated manner, or in a user-centric manner [10].

2.3.1 Central Model

In this model, the data related to the identity is stored centrally, i.e. it is held in a central database which could be either at the SP or the IDP site. Before the user can be allowed to use a service or resource, the user must first register, which can be done at an IDP. After this has been done, the user identity data is managed and kept in central databases at the IDP's site. In order to use a particular resource or application from an SP, the user must first be successfully authenticated at the IDP; only then will the IDP send the user identity data to the SP. In the central model, the user has absolutely no control over the data that is stored or over what is transmitted to the site requesting information about the user [11].

2.3.2 User Centric Model

In a user-centric model, the user's identity data are stored in the user's domain. The user has exclusive rights over his/her data, and the data can be transferred only if the user gives consent. Here, the user thus retains control over the identity data. This can be managed in several ways: for example, the user can hold the data on a smart card, and it is transferred only if the user allows it to be passed to the SP. The communication is of a one-to-one kind in which the user and the SP communicate directly, and no third parties are involved [12].

2.3.3 Federated Model

Here, the user identity data can be found in different IDPs which share a common trust. The data are not stored in a central database but rather in different databases that are linked, which makes it easy for the data to be verified and exchanged. The identity information is not controlled by a single entity, and a common identifier helps in locating a particular user's identity data. A common trust relationship exists among the IDPs and SPs, and this aids authentication and identification across the various parties. The federated model particularly supports identification and authentication across different domains. The identity data can be encrypted and stored at the IDPs, or it can be encrypted by the user [13].

2.4 FEDERATED STANDARD PROTOCOLS

Although a number of federated standard protocols are available, the protocols used for this research work are OpenID, OAuth and SAML. This section describes these protocols and explains how they work in IdM systems.

2.4.1 OpenID

OpenID is an open standard protocol that provides a centralized identity framework for user authentication, which is done with the help of third-party services [15]. It allows users to be authenticated by cooperating sites, known as relying parties (RPs), using third-party identity providers (IdPs). OpenID has four different layers: a user identifier, a discovery mechanism, an authentication server and a messaging service. Some of the security concerns with OpenID are as follows:

► Eavesdropping attack: It is possible for an eavesdropper to capture a successful authentication assertion. If the cryptographic hash is not checked, the attacker may reuse (replay) that authentication assertion.

► Denial of Service (DoS) attack: A DoS attack is another issue that crops up in OpenID. Because of a security lapse in the OpenID protocol, a rogue RP can launch a DoS attack against an OIP. Since the OpenID protocol places no restriction on messages that request repeated associations, signature verification or authentication, such a situation can arise [16].

► Man in the Middle (MITM) attack: With the help of associations, it is possible to protect the signed fields and thus prevent MITM attacks. However, attacks can still occur during association sessions, discovery and direct verification. In such cases, a compromised DNS can allow an attacker to act like an OIP and make assertions or issue associations. Impersonation is not even needed if, during the discovery process, the attacker can cause an arbitrary OIP to be specified.

► Phishing attack: This is another prominent issue in OpenID. A phisher creates a fake RP that looks the same as the real RP and directs the user to the fake page. The user enters his OpenID and is then taken to a fake page imitating the OIP, which asks for the user's password. By providing this information, the user has given his password to the phisher. The phisher now has all the information needed to access the services, i.e., the OpenID URL and the password [17].

2.4.2 OAuth

OAuth is an Open Authorization protocol which permits an application or website to access the user's information without requiring the user to share login credentials. It differs from the OpenID and SAML standards [18]: OpenID and SAML support authentication, whereas OAuth is only for authorization purposes. The OAuth standard works over the HTTP protocol and enables one application to access another application's data. This is achieved via an API and is used by Plaxo, Google, Facebook and Twitter. Some of the security concerns with OAuth are as follows:

► Brute force attacks against the server: An intruder with network access may be able to listen to traffic and obtain request parameters and attributes such as the oauth_signature, consumer key, oauth_token, signature method (HMAC-SHA1), timestamp, or custom parameter information. Creating tokens and shared secrets that are long, random and resistant to these kinds of attacks can decrease this threat [95].

► Cross-Site Request Forgery (CSRF): A CSRF attack forces an end user to execute unwanted actions on a web application in which they are currently authenticated. CSRF attacks principally target state-changing requests, not theft of data, since the attacker has no way to see the response to the forged request. With the help of social engineering (for example, sending a link via email or chat), an attacker can trick the users of a web application into executing actions of the attacker's choosing. If the victim is an ordinary user, a successful CSRF attack can force the user to perform state-changing requests such as transferring funds or changing their email address. If the victim is an administrative account, CSRF can compromise the entire web application.
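Since the brute-force discussion above mentions the oauth_signature, consumer key, token and HMAC-SHA1 parameters, the following sketch shows how an OAuth 1.0a-style request signature is typically computed; the endpoint, secrets and parameter values are illustrative assumptions, and percent-encoding subtleties are simplified.

```python
import base64, hashlib, hmac, time
from urllib.parse import quote, urlencode

consumer_secret = "kd94hf93k423kf44"        # example value, not from the paper
token_secret = "pfkkdhi9sl3r4s00"           # example value, not from the paper

params = {
    "oauth_consumer_key": "dpf43f3p2l4k3l03",
    "oauth_token": "nnch734d00sl2jdk",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_timestamp": str(int(time.time())),
    "oauth_nonce": "kllo9940pd9333jh",
    "oauth_version": "1.0",
}

# Signature base string: HTTP method, encoded URL, and encoded sorted parameters.
base_string = "&".join([
    "GET",
    quote("https://api.example.com/resource", safe=""),
    quote(urlencode(sorted(params.items())), safe=""),
])

# Signing key: consumer secret and token secret joined by '&'.
signing_key = consumer_secret + "&" + token_secret
digest = hmac.new(signing_key.encode(), base_string.encode(), hashlib.sha1).digest()
params["oauth_signature"] = base64.b64encode(digest).decode()
print(params["oauth_signature"])
```

Long, random secrets of the kind recommended above make brute-forcing such signatures impractical for an eavesdropper.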

2.4.3 SAML

SAML is an Extensible Markup Language (XML)-based standard established by the Security Services Technical Committee of the Organization for the Advancement of Structured Information Standards (OASIS). It is used to establish authorization and authentication between the user and the IDP. SAML is derived from XML and was mainly created to enable single sign-on (SSO) for web applications. The IDP is the party that verifies the end user's identity. SAML can be utilized in organizations and needs to be set up with the help of a public key and a SAML URL.

Some of the security concerns with SAML are as follows:

► Denial of Service (DoS) attacks: SAML is vulnerable to DoS attacks. Because of the processing cost of SAML requests (including parsing of the request and construction of the assertion), the responder can potentially be flooded with requests, which do not take nearly as much effort to construct as they do to process [19].

► Eavesdropping: It is feasible that an eavesdropping party could capture both the SOAP message containing a request and the SOAP message containing the corresponding response. This exposes both the nature of the request and the details of the response, very likely including one or more assertions. A possible countermeasure is to provide some form of in-transit confidentiality.

In this research, user identities are managed by using cloud federation standards together with cryptographic techniques, which improves security, privacy, confidentiality and trust in the cloud environment. Security, privacy and trust are implemented in three different phases, namely OpenID with the Triple DES encryption algorithm, OAuth with the Fuzzy based Blowfish algorithm, and SAML with the Hash based Encryption algorithm, followed by a comparison phase in which the better federation standard for use in the cloud environment is identified. The research work is thus divided into four phases, and each phase briefly explains how the federation standards achieve security in the cloud.

REFERENCES

[1]. Ranjith, D., & Srinivasan, J. (2013). Identity Security Using Authentication and Authorization in Cloud Computing. International Journal of Computer & Organization Trends, 3(4), pp. 122-129.

[2]. Singla, S., & Singh, J. (2013). Cloud Data Security using Authentication and Encryption Technique. International Journal of Advanced Research in Computer Engineering & Technology, 2(7), pp. 2232-2235.

[3]. Banyal, R. K., Jain, P., & Jain, V. K. (2013). Multi-factor Authentication Framework for Cloud Computing. International Conference on Computational Intelligence, Modelling and Simulation, IEEE, pp. 105-110. Seoul.

[4]. F, K. L., Moghaddam, F., Moghaddam, S. G., Rouzbeh, S., Araghi, S. K., & Moghaddam, N. F. A scalable and efficient user authentication scheme for cloud computing environments. Region 10 Symposium, IEEE, pp. 508-513. Kuala Lumpur.

[5]. Uruena, M., & Busquiel, C. (2014). Analysis of a Privacy Vulnerability in the OpenID Authentication Protocol. Multimedia Tools and Applications, 68(1), pp. 159-176.

[6]. Venigalla, S. P., Babu, M. N., Boddu, S., & Vemana, G. S. (2012). Implementation of the Triple-DES Block Cipher Using VHDL. International Journal of Advances in Engineering & Technology, 3(1), pp. 117-128.

[7]. Tassanaviboon, A., & Gong, G. (2011). OAuth and ABE based Authorization in Semi-Trusted Cloud Computing. DataCloud-SC: Proceedings of the Second International Workshop on Data Intensive Computing in the Clouds, ACM, pp. 41-50.

[8]. Trust in Workflows and Relevant Contexts. IEEE Communications Surveys & Tutorials, 14(3), pp. 911-940.

[9]. Vossaert, J., Lapon, J., Decker, B. D., & Naessens, V. (2013). User-Centric Identity Management using Trusted Modules. Mathematical and Computer Modelling, 57(7-8), pp. 1592-1605.

[10]. Nepal, S., Ranjan, R., & Choo, K.-K. R. (2015). Trustworthy Processing of Healthcare Big Data in Hybrid Clouds. IEEE Cloud Computing, 2(2), pp. 78-84.

[11]. E, S., & M., U. (2013). User-Centric Trust based Identity as a Service for Federated Cloud Environment. Fourth International Conference on Computing, Communications and Networking Technologies (ICCCNT). Tiruchengode.

[12]. Cao, N., Wang, C., Li, M., Ren, K., & Lou, W. (2014). Privacy-Preserving Multi-keyword Ranked Search over Encrypted Cloud Data. IEEE Transactions on Parallel and Distributed Systems, 25(1), pp. 222-233.

[13]. Celesti, A., Tusa, F., & Villari, M. (2010). Security and Cloud Computing: InterCloud Identity Management Infrastructure. International Workshop on Enabling Technologies: Infrastructures for Collaborative Enterprises, IEEE, pp. 263-265.

[14]. Artigas, S. (2013). Toward Efficient Data Access Privacy in the Cloud. IEEE Communications Magazine, 55(11), pp. 39-45.

[15]. Axaykumar, Patel, A., Patel, H., & Chauhan, M. (2013). Review on OpenID Authentication Framework. International Journal for Scientific Research & Development, 1(2), pp. 2321-0613.

[16]. Baldoni, R. (2009). Federated Identity Management Systems in e-Government: the Case of Italy. Electronic Government, An International Journal, pp. 1-20.

[17]. Lomte, S., & Dudhani, S. (2015). Secure Key for Authentication and Secret Sharing in Cloud Computing. International Journal of Advanced Research in Computer Science and Software Engineering, 5(6), pp. 1008-1011.

[18]. Lonea, A. M., Tianfield, H., & Popescu, D. E. (2013). Identity Management for Cloud

[19]. Madsen, P., Koga, Y., & Takahashi, K. (n.d.). Federated Identity Management for Protecting Users from ID Theft. DIM‘05. Fairfax, Virginia, USA.

Corresponding Author Sonia Narang*

Research Scholar of OPJS University, Churu, Rajasthan