
MODULE-4

7(a) Explain operating system security and virtual machine security.


1. Operating System Security
The Operating System in cloud computing manages hardware resources and provides foundational services
for applications. Ensuring OS security involves protecting applications and data from malicious attacks,
unauthorized access, and tampering.
Key Points:
1. Access Control:
o Permission enforcement: The OS enforces mandatory policies that define access permissions for
different system objects.
o User-defined policies: Discretionary policies are avoided because they rely on user-defined rules,
which are prone to error.
2. Authentication and Cryptography:
o The OS uses robust mechanisms to authenticate users and encrypt data to ensure
confidentiality.
3. Isolation and Trusted Applications:
o Isolation: Applications run in separate security domains to keep an attack on one application
from spreading to others.
o Least privilege: Trusted applications operate with the least privileges necessary to perform their
tasks (a minimal sketch of this idea follows the example below).
4. Security Challenges:
o Commodity operating systems often lack multilayered security, leading to vulnerabilities.
o Malware and malicious code introduced via Java applets or browser-imported data can exploit
system weaknesses.
5. Trusted Path Mechanisms:
o Protect user interactions with trusted software, ensuring no impersonation or tampering
occurs.
Example:
 A financial application relies on the OS to prevent malicious software from intercepting or altering
sensitive transactions.
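
As a small, hedged illustration of the least-privilege point above: a Python sketch of a service that drops root privileges once its privileged start-up work is done (Unix-only; the uid/gid values are placeholder assumptions, not from any particular system):

```python
import os

def drop_privileges(uid: int = 1000, gid: int = 1000) -> None:
    """Reduce a root-started process to an unprivileged user (least privilege)."""
    os.setgid(gid)   # drop the group first, while the process is still root
    os.setuid(uid)   # then drop the user; this is irreversible for the process

if os.getuid() == 0:   # only meaningful when the process was started as root
    # ...do the work that genuinely needs root here (bind port 443, read keys)...
    drop_privileges()
```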

2. Virtual Machine Security


Virtual Machines (VMs) are critical components in cloud infrastructure, enabling the abstraction of physical
hardware for multiple tenants.
Key Points:
1. Role of the Virtual Machine Monitor (VMM):
o The VMM enforces resource allocation and isolation between VMs.
o It provides virtual security services, such as memory isolation and secure inter-VM
communication.
2. Security Services:
o VMM-based: Handles resource allocation, prevents side-channel attacks, and ensures
isolation.
o Dedicated Security VM: Manages cryptographic keys and monitors VM behavior to detect
anomalies.
3. Threats:
o VMM-Based Threats:
 Resource starvation due to misconfigured limits.
 Side-channel attacks through shared hardware.
o VM-Based Threats:
 Deployment of rogue or insecure VMs.
 Tampered VM images due to lack of integrity checks.
4. Intrusion Detection and Prevention:
o Techniques like isolation, inspection, and interposition help detect and mitigate attacks.
o VM cloning allows testing of potentially malicious applications in isolated environments.
Example:
 A cloned VM can be used to test a suspicious application’s behavior without risking the primary
environment.

7(b) Explain the security risks posed by shared images and the management OS.
Security Risks Posed by Shared Images:
1. Backdoors and Leftover Credentials: Malicious AMI creators may leave backdoors, such as unused
public SSH keys or passwords, allowing them to access instances remotely. These leftover credentials
can be exploited to gain unauthorized access.
2. Omission of Cloud-init Scripts: Without regenerating SSH host keys using the cloud-init script,
multiple systems may share identical keys, making them vulnerable to Man-in-the-Middle (MITM)
attacks.
3. Unsolicited Connections: Modified software in shared images can establish unauthorized outgoing
connections, leaking sensitive information like login events or administrative data.
4. Malware: Shared AMIs may contain malicious software like trojans, keyloggers, or spyware, which
can steal credentials, monitor processes, or perform unauthorized actions on the system.
5. Privacy Risks to AMI Creators: Sensitive information such as private keys, shell history, and API
keys left in shared AMIs can be recovered, allowing attackers to exploit cloud resources or redirect
traffic for malicious purposes (a simple audit sketch follows this list).
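
To make the leftover-credentials risk concrete, here is a hedged Python sketch that audits a mounted image for files that commonly leak secrets before the image is shared. The path list is an illustrative assumption, not an official checklist:

```python
import os

# Paths that commonly leak credentials in shared machine images
# (illustrative and non-exhaustive).
SUSPECT_PATHS = [
    "root/.ssh/authorized_keys",   # leftover public keys can act as backdoors
    "root/.bash_history",          # shell history may contain secrets
    "home",                        # per-user .ssh directories and history files
    "etc/ssh",                     # baked-in host keys enable MITM attacks
]

def audit_image(mount_point: str) -> list[str]:
    """Return suspect files found under a mounted image root."""
    findings = []
    for rel in SUSPECT_PATHS:
        path = os.path.join(mount_point, rel)
        if os.path.isdir(path):
            for dirpath, _dirs, files in os.walk(path):
                findings.extend(os.path.join(dirpath, f) for f in files)
        elif os.path.exists(path):
            findings.append(path)
    return findings

if __name__ == "__main__":
    for f in audit_image("/mnt/ami"):       # hypothetical mount point
        print("review before sharing:", f)
```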
Security Risks Posed by the Management OS (Dom0)
1. Denial-of-Service (DoS) Attacks: The management OS (Dom0) can refuse to execute critical steps
necessary to start or manage a guest VM (DomU), effectively denying service to the guest VM.
2. Kernel Modification: Dom0 can maliciously alter the guest VM's kernel during creation, allowing
unauthorized monitoring or control of applications running within the VM.
3. Memory and Register Manipulation: Dom0 can undermine the VM's integrity by setting incorrect
page tables, modifying virtual CPU registers, or accessing the VM's memory without permission.
4. Run-Time Interception: During communication between Dom0 and DomU, Dom0 can extract
cryptographic keys or sensitive data from shared memory, jeopardizing confidentiality despite
encryption protocols such as TLS.
5. XenStore Exploitation: The entire system state is maintained by XenStore, which Dom0 manages.
A malicious Dom0 or VM can deny other VMs access to XenStore, or exploit it to read a DomU's
sensitive memory, compromising both confidentiality and availability.

8(a) Explain the concept of privacy impact assessment and its importance in cloud computing.
A Privacy Impact Assessment (PIA) is a process designed to evaluate the potential privacy risks of an
information system or project. It involves identifying, assessing, and mitigating privacy concerns during the
lifecycle of the project to ensure compliance with privacy regulations and the protection of personal or
sensitive data.
Key Features of PIA:
1. Proactive Approach: PIA emphasizes a proactive stance toward identifying privacy risks before they
materialize, embedding privacy rules from the beginning.
2. Analysis Scope: It assesses legal, technical, and operational implications related to privacy, ensuring
compliance with standards like the European Directive 95/46/EC on personal data protection.
3. Output: The assessment generates a report, including findings, risk summaries, and suggestions for
security, transparency, and handling cross-border data flows.
Steps in PIA Implementation:
1. Inputs Collection: Inputs such as project information, risk outlines, and stakeholder involvement are
gathered.
2. Knowledge Base Utilization: Domain experts maintain a knowledge base (KB) used to inform the
assessment process.
3. Questionnaire: Users fill in detailed questionnaires to provide relevant information.
4. Rule Execution: An expert system evaluates the inputs and applies the highest-priority privacy rules
to generate actionable insights (a toy sketch of this step follows the list).
5. Report Generation: The system produces a PIA report summarizing privacy risks and mitigation
strategies.
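
Steps 2-4 amount to a small rule-based expert system. A minimal Python sketch, assuming a toy rule base and questionnaire fields (all names here are invented for illustration and come from no real PIA tool):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    priority: int                      # lower number = higher priority
    condition: Callable[[dict], bool]  # predicate over questionnaire answers
    finding: str                       # suggested mitigation for the report

# A toy knowledge base; in practice this is maintained by domain experts.
RULES = [
    Rule(1, lambda a: a["cross_border_transfer"],
         "Assess the legal basis for cross-border data flows."),
    Rule(2, lambda a: a["stores_personal_data"] and not a["encrypted_at_rest"],
         "Encrypt personal data at rest."),
]

def run_rules(answers: dict) -> list[str]:
    """Apply matching rules in priority order; the result feeds the PIA report."""
    hits = [r for r in RULES if r.condition(answers)]
    return [r.finding for r in sorted(hits, key=lambda r: r.priority)]

print(run_rules({"cross_border_transfer": True,
                 "stores_personal_data": True,
                 "encrypted_at_rest": False}))
```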
Importance of PIA in Cloud Computing
1. Ensuring Data Privacy in the Cloud:
o Cloud service providers (CSPs) often manage sensitive personal data. A PIA ensures
compliance with privacy laws and mitigates risks such as unauthorized access or misuse of
data stored on CSP servers.
2. Addressing Privacy Challenges in the Digital Age:
o Dynamic provisioning, data proliferation, and lack of user control over data in the cloud make
privacy assessments critical for identifying vulnerabilities.
3. Building Trust with Users:
o Conducting PIAs demonstrates a commitment to transparency and accountability, building
user trust in cloud services.
4. Legal Compliance:
o PIAs help organizations comply with regional privacy regulations, such as the GDPR in the EU
or the U.S.-EU Safe Harbor framework.
5. Managing Cross-Border Data Flows:
o PIAs assess risks associated with transferring data across jurisdictions with varying privacy
laws, ensuring compliance and reducing legal liabilities.
6. Supporting SaaS-Based Privacy Tools:
o PIAs enable cloud-based tools to integrate privacy assessments, ensuring that emerging
services prioritize privacy protection.
8(b) Explain the following, associated with cloud computing:
i) Cloud security risks ii) Security: the top concern for cloud users.
Cloud computing introduces numerous security risks that can compromise the confidentiality, integrity, and
availability of data and services. These risks are broadly categorized into three main areas: traditional security
threats, system availability threats, and third-party data control threats.
1. Traditional Security Threats
These are risks faced by any system connected to the Internet, with amplified impact due to the scale of cloud
environments:
 DDoS Attacks: Distributed Denial-of-Service attacks overwhelm cloud servers, disrupting legitimate
user access.
 Phishing and SQL Injection: Cloud applications are susceptible to phishing attacks and SQL
injection, allowing attackers to manipulate or steal sensitive data.
 Cross-Site Scripting (XSS): Exploits vulnerabilities in cloud-hosted web applications to execute
malicious scripts.
 Insufficient Authentication: Weak or misconfigured authentication processes can lead to
unauthorized access.
2. System Availability Threats
Cloud services can face disruptions due to:
 Service Outages: Power failures, hardware malfunctions, or catastrophic events can halt cloud
services, impacting business operations.
 Resource Exhaustion: Overconsumption of cloud resources by malicious or accidental means can
degrade performance for legitimate users.
 Phase Transition Phenomena: Complex system behaviors in cloud environments may lead to
unexpected failures.
3. Third-Party Data Control Threats
The reliance on external cloud providers introduces risks related to lack of transparency and control:
 Malicious Insiders: Employees or contractors of cloud providers may exploit their access to
compromise user data.
 Insecure APIs: Weak or flawed APIs can expose data during authentication, access control, or
monitoring activities.
 Subcontracting Risks: Providers may use untrusted third parties, leading to data leaks or breaches.
 Cloud Provider Espionage: Proprietary or sensitive data stored in the cloud can be accessed or
misused without user knowledge.
Security: The Top Concern for Cloud Users
Security remains the foremost concern for cloud users, driven by the need to relinquish control over sensitive
information to cloud service providers (CSPs). While users traditionally operate within secure perimeters
protected by corporate firewalls, the shift to cloud computing introduces trust-related challenges. This
transition is critical for leveraging the cost-effectiveness of cloud services but comes with significant security
risks.
Key Concerns:
1. Unauthorized Access and Data Theft:
o Data stored in the cloud is vulnerable, especially during storage, which poses a longer
exposure risk than during processing.
o Threats can arise from flaws in Virtual Machine Managers (VMM), rogue virtual machines
(VMs), or Virtual Machine-Based Rootkits (VMBR).
o Insider threats from rogue CSP employees are a pressing concern due to opaque hiring and
screening policies.
2. Data Lifecycle Control:
o Users lack control over the lifecycle of their data. For instance, ensuring data deletion or
verifying that storage media are wiped clean is nearly impossible.
o Seamless backups by CSPs, done without user consent, increase the risk of data leakage or
unauthorized access.
3. Lack of Standardization:
o The absence of interoperability standards complicates switching between CSPs or accessing
data during service interruptions.
o Users are concerned about the financial and operational costs associated with migrating to
different CSPs.
4. Auditing and Legal Challenges:
o Full audit trails are challenging to maintain in a cloud environment.
o Ambiguity in jurisdiction over data stored across multiple countries complicates enforcing
security and privacy laws.
o CSPs may outsource data handling, creating a chain of third-party dependencies that users
cannot control.
5. Mitigation Strategies:
o Users are advised to evaluate CSP security policies, analyze the sensitivity of their data, and
ensure clear contractual obligations.
o Contracts should address secure data handling, CSP liabilities, data ownership rules, and
geographic storage restrictions.
o Encryption of sensitive data, despite its limitations for indexing and searching, is
recommended to reduce risk (a client-side encryption sketch follows).
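
As a hedged illustration of that last recommendation, a minimal Python sketch of client-side encryption before data ever reaches a CSP, using the third-party cryptography package (pip install cryptography); key management is deliberately left out of scope:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # keep this key outside the cloud
cipher = Fernet(key)

plaintext = b"sensitive customer record"
ciphertext = cipher.encrypt(plaintext)   # this is what gets handed to the CSP

# ...later, after downloading the object back from the CSP...
assert cipher.decrypt(ciphertext) == plaintext
```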
MODULE-5
9(a) Explain the core components of Google App Engine.
Google App Engine (GAE) is a Platform as a Service (PaaS) that allows developers to build and deploy
applications on Google's infrastructure. Its core components include:
1. Web App: The application code that runs on GAE, which can be built using various programming
languages like Python, Java, Go, etc.
2. Sandboxed Runtime Environment: This environment isolates applications to ensure security and
stability. Each web app runs in its own secure environment, preventing it from affecting other apps or
the system.
3. Google App Engine Infrastructure: This is the backbone of GAE, providing the necessary resources
and services to host and manage web applications, including automatic scaling, load balancing, and
application monitoring.
4. Development: Developers use local machines and SDKs (Software Development Kits), such as the
Python SDK, to develop and test their applications before deploying them to GAE.
5. Runtime: This is the execution environment provided by GAE where the applications run. It manages
application instances, scales resources according to demand, and handles request routing.
6. Services: GAE offers various built-in services that facilitate application development, such as:
o Data Store: A NoSQL database to store and query application data (see the sketch after this list).
o URL Fetch: A service that allows applications to make HTTP requests to external URLs.
o Image Manipulation: A service for resizing, cropping, and manipulating images.
o Task Queue: A service for executing tasks asynchronously or on a schedule outside of a user
request.
o Cron Jobs: A service that triggers tasks at specific times or intervals.
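
A minimal sketch of a GAE web app that exercises the Data Store service, written against the legacy Python SDK (webapp2 plus ndb); the model, field, and route names are illustrative assumptions:

```python
import webapp2                          # web framework bundled with the old SDK
from google.appengine.ext import ndb    # Datastore client in the old SDK

class Greeting(ndb.Model):
    content = ndb.StringProperty()       # a single stored field, for illustration

class MainPage(webapp2.RequestHandler):
    def get(self):
        Greeting(content="hello").put()       # write an entity to the Data Store
        count = Greeting.query().count()      # query it back
        self.response.write("greetings stored: %d" % count)

# GAE locates this WSGI application through app.yaml (not shown here).
app = webapp2.WSGIApplication([("/", MainPage)])
```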

9(b) Discuss in detail the following media applications of cloud computing technologies.
i) Animoto
Animoto is a web-based video creation application that allows users to create professional-looking videos
with minimal effort. Key points:
 Easy to Use: Animoto provides a user-friendly interface for creating videos. Users simply select a
theme, upload media (photos, videos, and music), and arrange them in the desired order.
 AI-powered effects: Animoto uses artificial intelligence to automatically select animations and
transition effects, creating a polished video without extensive editing.
 Free and Paid options: Animoto offers a free tier that allows users to create 30-second videos. Paid
subscriptions allow for longer videos and more customization options.
 Scalable cloud infrastructure: Animoto leverages Amazon Web Services (AWS) for its
infrastructure. This allows Animoto to scale up or down based on user demand, ensuring high
availability and reliability.
ii) Maya Rendering with Aneka
Maya rendering with Aneka, as implemented by GoFront, uses a private cloud to accelerate 3D model
rendering. Key points:
 Design and Rendering Workflow: GoFront engineers design train models using Maya, a 3D
modeling software. High-quality 3D renderings are crucial for design analysis and improvement.
However, rendering these complex models is computationally intensive and time-consuming.
 Aneka Private Cloud: GoFront utilizes Aneka, a platform for building private clouds, to transform
their existing network of desktop computers into a distributed rendering system. This creates a private
cloud environment within their local network.
 Task Distribution: Engineers use a specialized client interface to define rendering parameters
(number of frames, cameras, etc.) and submit rendering tasks to the Aneka cloud. Aneka then
distributes these tasks across the available desktop computers in the network.
 Parallel Processing: Each desktop in the network acts as a worker node, running Maya's batch
renderer to process a portion of the overall rendering task. This parallel processing significantly
reduces the total rendering time.
 Result Aggregation and Time Savings: Once the individual rendering tasks are complete, Aneka
collects the results from each worker node and combines them into the final rendered output (a
simplified sketch of this distribute-and-aggregate pattern follows).
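
Aneka itself is a .NET-based platform, so the following is only a language-neutral illustration of the same distribute-and-aggregate pattern: a hedged Python sketch in which a process pool stands in for the desktop worker nodes and a stub function stands in for Maya's batch renderer:

```python
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame: int) -> str:
    """Stand-in for invoking Maya's batch renderer on a single frame."""
    return "frame_%04d.png" % frame      # a real worker would write an image file

if __name__ == "__main__":
    frames = range(1, 101)                   # rendering parameters: 100 frames
    with ProcessPoolExecutor() as pool:      # the pool plays the worker nodes
        outputs = list(pool.map(render_frame, frames))   # distribute the tasks
    print("aggregated %d rendered frames" % len(outputs))
```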
iii) Video encoding in the cloud: encoding.com
Cloud video encoding is the process of converting raw video data into a more compact and streamable digital
format using cloud-based services. This process is crucial for making videos compatible across various
devices, web browsers, and streaming platforms. Here are the key aspects of cloud video encoding:
 Process of Encoding:
o Compression: Reduces file size while maintaining quality for efficient storage and transmission.
o Format Conversion: Converts raw video to formats like MP4 or WebM for compatibility (a
minimal ffmpeg sketch follows this list).
 Benefits of Cloud Video Encoding:
o Scalability: Handles workloads of all sizes without significant hardware investment.
o Speed: Optimized cloud infrastructure enables faster encoding.
o Accessibility: Access encoded videos from anywhere for easy management and distribution.
o Cost-Effectiveness: Pay-as-you-go models reduce costs by avoiding expensive hardware.
 Cloud Video Encoding Services:
o Mux: Uses machine learning to optimize video quality for on-demand and live streaming.
o Google Cloud Transcoder API: Manages transcoding jobs with adaptive bitrate and encryption.
o Vidispine: Provides scalable, pay-as-you-use video encoding solutions.
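
A minimal sketch of the compression and format-conversion step, driving the ffmpeg command-line tool from Python (ffmpeg must be installed separately; the file names and codec settings are illustrative assumptions):

```python
import subprocess

# Transcode a raw source clip into a web- and device-friendly MP4.
subprocess.run(
    ["ffmpeg",
     "-i", "raw_input.mov",            # source video (placeholder name)
     "-c:v", "libx264", "-crf", "23",  # compress the video track with H.264
     "-c:a", "aac",                    # re-encode the audio track
     "output.mp4"],                    # MP4 container for broad compatibility
    check=True,                        # raise an error if ffmpeg fails
)
```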
10(a) Explain in detail the application of cloud computing in:
i) Healthcare: ECG analysis in the cloud

The online health monitoring system described here uses cloud computing technologies to monitor
and analyze health data, specifically ECG (electrocardiogram) data. The system is organized as
follows:

1. ECG Sensor Module (User Side)

 Components:
o The system begins with the user wearing an ECG sensor module, which includes an embedded
Bluetooth-enabled data communication & processor module.
 Functionality:
o The ECG sensor collects real-time health data (like heart rate patterns) from the user's body.
o Data is transmitted via Bluetooth connectivity to a mobile device or gateway.

2. Mobile Device
 The collected ECG data is sent to the user's mobile device, which acts as an intermediate layer for further
communication.
 The mobile device forwards this data over a wireless or 3G mobile network (a toy upload sketch
appears at the end of this section).

3. Cloud-Based Architecture

 IaaS (Infrastructure as a Service):
o Includes cloud infrastructure from providers such as Amazon Web Services (AWS), e.g., S3
(Simple Storage Service), for storage and processing.
o Provides scalability and reliability for storing large volumes of user data.
 PaaS (Platform as a Service):
o The dynamically scalable runtime ensures that the system can handle multiple user requests
efficiently.
o Includes a security runtime to ensure data confidentiality and privacy during transmission and
storage.
 SaaS (Software as a Service):
o Provides ECG Data Analysis as a Service, where health data is processed and analyzed to generate
insights.
o Offers visualization and analytics to doctors or healthcare providers.

4. User Requests & Results

 A large number of users can simultaneously send data to the cloud platform.
 Healthcare providers, doctors, or patients can access the analyzed data via user-friendly applications or
dashboards.

5. ECG Analysis & Output

 The system processes ECG data and generates visualizations, such as graphs and alerts, for abnormal patterns
or critical conditions.
 The results are used for proactive health monitoring and medical interventions.
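
A hedged Python sketch of the forwarding step from section 2, posting buffered ECG samples to the SaaS analysis layer. The endpoint URL and payload shape are invented for illustration, and requests is a third-party package:

```python
import requests

samples = [0.12, 0.15, 0.98, 0.14]   # mock buffered ECG voltage readings

resp = requests.post(
    "https://ecg-analysis.example.com/api/v1/readings",  # hypothetical endpoint
    json={"patient_id": "demo-001", "samples": samples},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())   # e.g., an analysis result or an alert flag
```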

ii) Geoscience: satellite image processing

The cloud environment for satellite data processing combines local storage, private and public
clouds, and a software framework for distributing and visualizing satellite data. It works as
follows:

1. Data Collection (Satellite and Local Storage)

 Satellite:
o A satellite captures Earth data, which is transmitted to a ground station.
 Local Storage:
o The received data is temporarily stored in local storage systems for initial processing and archiving.

2. Archiving

 The data from local storage is archived for long-term storage and processing. This ensures that raw satellite
data is preserved for further use in various applications.

3. Aneka Framework

 Role:
o Aneka is a middleware framework used for managing the processing and distribution of satellite data.
o It facilitates resource provisioning and task scheduling across the cloud infrastructure.
 Integration:
o Aneka acts as a bridge between local storage and both private and public cloud environments.

4. Cloud Infrastructure

 Private Cloud:
o Satellite data processing begins in a private cloud (e.g., using Xen Server). The private cloud
provides a secure environment for processing sensitive or proprietary data.
 Public Cloud:
o The processed data can also be transferred to a public cloud, such as Amazon Web Services (AWS),
for scalability, cost-effectiveness, and global distribution.

5. Portal (SaaS)

 The final processed data is visualized and made accessible to users via a Portal (SaaS).
 This portal allows users to interact with satellite data, view maps, analyze geographical information, and
extract meaningful insights.

6. Distribution

 Processed satellite data is distributed globally to authorized users through the SaaS portal. This ensures
seamless access for research, development, or decision-making.

Key Benefits

 Scalability: Public cloud (AWS) ensures scalable resources for storing and distributing large amounts of
satellite data.
 Security: Private cloud ensures secure processing of sensitive data.
 User Accessibility: SaaS-based portal provides user-friendly access to processed data.
 Cost Efficiency: Leveraging both private and public clouds optimizes costs while maintaining efficiency.
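
As a hedged illustration of moving a processed product from the private cloud to the public cloud for distribution, a minimal Python sketch using the boto3 AWS SDK (third-party; the bucket and file names are placeholder assumptions):

```python
import boto3

s3 = boto3.client("s3")            # credentials are taken from the environment
s3.upload_file(
    "processed/scene_2024_001.tif",       # locally processed product (placeholder)
    "satellite-distribution-bucket",      # hypothetical public S3 bucket
    "scenes/2024/scene_001.tif",          # object key under which it is served
)
```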

10(b) Explain Amazon Web Services (AWS) in detail.

1. Foundational Concept: Cloud Computing and AWS

 AWS Platform: Provides on-demand IT resources, offering scalability, cost-efficiency, and flexibility.
 Compute Services:
o EC2: Customizable virtual machines for running applications.
o EMR: Managed Hadoop for big data processing.
o Elastic Beanstalk: Simplifies app deployment and management.
o CloudFormation: Enables infrastructure as code.
o Autoscaling: Adjusts EC2 instances based on demand.
 Storage Services:
o S3: Scalable object storage for any data type.
o EBS: Persistent block storage for EC2 instances.
o ElastiCache: In-memory caching for improved performance.
o SimpleDB: NoSQL database for structured data.
o RDS: Managed relational databases supporting various engines.
o CloudFront: CDN for fast content delivery.
o Import/Export: Physical data transfer services.
 Communication Services:
o SQS: Asynchronous message queuing.
o SNS: Publish/subscribe messaging service.
o SES: Cloud-based email sending.
o Route 53: Scalable DNS web service.
o VPC: Isolated virtual networks for a secure cloud environment.
o Direct Connect: Dedicated network connection to AWS.
o Elastic Load Balancing: Distributes traffic across EC2 instances.
 Additional Services:
o GovCloud: AWS region for US government agencies.
o CloudWatch: Monitors AWS resources and applications.
o FPS: Online payment service for applications.
o DevPay: Billing and account management for app developers.
o FWS: Integrates with Amazon's fulfillment service for orders.
o Mechanical Turk: Marketplace for human intelligence tasks.
o Alexa Services: Provides website data and analytics.
 Pay-as-you-go Model: Charges based on resource usage, with no upfront hardware investment.
 Scalability & Elasticity: Easily scale resources to match demand with autoscaling.
 Global Infrastructure: Deploy applications globally for better performance and reduced latency (a
short boto3 sketch follows).
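
To ground a couple of the services listed above, a hedged Python sketch using the boto3 SDK that launches one EC2 instance and sends one SQS message; the AMI ID, region, and queue name are placeholder assumptions:

```python
import boto3

# Compute: launch a single small EC2 instance.
ec2 = boto3.resource("ec2", region_name="us-east-1")
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical AMI ID
    InstanceType="t2.micro",
    MinCount=1, MaxCount=1,
)
print("launched:", instances[0].id)

# Communication: asynchronous messaging through SQS.
sqs = boto3.resource("sqs", region_name="us-east-1")
queue = sqs.create_queue(QueueName="demo-queue")
queue.send_message(MessageBody="job-42 is ready")
```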
