Mobile Operating System Unit 5
1. Android OS
Android is the most widely used mobile operating system. Google is its developer, and it is a free and open-source OS based on the Linux kernel. For years, every new version was named after a dessert, for example Cupcake, Donut, Eclair, KitKat, and Oreo.
2. Bada
Samsung launched this operating system in 2010. It includes features such as 3-D graphics, application installation, and multipoint touch.
3. Blackberry OS
The developer of this operating system is Research In Motion (RIM). It was designed specifically for BlackBerry devices and is particularly useful for corporate users.
4. Apple iOS
After Android, it is one of the most popular mobile operating systems. It is designed to run on Apple devices such as iPhones and iPad tablets. Just as Android devices download apps from the Play Store, iOS devices use the App Store. iOS also has very strong security features.
5. Windows Mobile OS
The developer of this OS is Microsoft. It was designed for pocket PCs and smartphones, combining the features of the desktop Windows OS with additional features for mobile phones.
6. Symbian OS
Symbian Ltd. is the developer of this OS, and Nokia was the first to use it on its mobile phones. It provides a high level of integration with communication features. Applications for this OS were written mainly in C++, with additional support for Java.
7. Harmony OS
It is one of the newest mobile operating systems. Huawei is its developer, and it is designed specifically for use in IoT devices.
8. Palm OS
Its other name is Garnet OS. Palm Ltd. developed it for use in Personal Digital Assistants (PDAs).
9. WebOS
Palm Ltd. is its developer. It is based on the Linux kernel, and HP has used it in its mobile devices and TouchPads.
Qualities of a Good Mobile Operating System
1. Easy to use
A mobile OS should be simple and intuitive, so that users can operate the device without special training.
2. Data and network management
An operating system should focus on controlling data and network usage. It should keep the user's limits and requirements in focus.
3. Organized data
The organization of data related to to-do lists, calendars, alarms, reminders, etc. is very important. A good OS keeps this data in an organized and safe manner, and makes it readily and easily available.
Components of a Mobile Operating System
The components of a mobile OS are the same as those of a basic OS. The components are as follows:
1. Kernel
A kernel is the core of an OS. It contains all the functions and operations needed to manage the working of the OS.
2. Process Execution
The OS executes various processes so that program statements run and the application program is connected to the hardware. Whenever a process executes, it uses memory, storage space, and other resources.
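As a small illustration of process execution, the Python sketch below (a minimal example, assuming only the standard library) launches a child process, waits for it to finish, and reads its output; while the child runs, the OS gives it memory and other resources of its own.

import subprocess
import sys

# Launch a child process (another Python interpreter) and wait for it to exit.
# The OS allocates memory and other resources to the child while it runs.
result = subprocess.run(
    [sys.executable, "-c", "print('hello from a child process')"],
    capture_output=True,
    text=True,
)
print("child said:", result.stdout.strip())
print("exit status:", result.returncode)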
3. Interrupt
Interrupts are used by hardware devices to communicate with the CPU. An interrupt is a signal that a device generates to request the CPU's attention. Whenever an interrupt occurs, the CPU temporarily stops executing its current process and handles the interrupt instead.
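Hardware interrupts happen below the OS, but the same pattern can be sketched in user space with signals. The minimal Python example below is an analogy rather than a real hardware interrupt: it registers a handler for SIGINT (raised by Ctrl+C), and when that signal arrives, the normal flow of the program is temporarily suspended while the handler runs.

import signal
import time

# Handler that runs when the "interrupt" (SIGINT, i.e. Ctrl+C) arrives.
def on_interrupt(signum, frame):
    print("interrupt received: normal flow paused while the handler runs")

signal.signal(signal.SIGINT, on_interrupt)

print("working... press Ctrl+C to raise an interrupt (exits after 10 seconds)")
for _ in range(10):
    time.sleep(1)  # the handler preempts this loop whenever SIGINT occurs
print("done")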
4. Memory Management
It is the management of the main or primary memory. Whatever program is executed has to be present in main memory, and more than one program can be present at a time. Hence, the memory must be managed among them.
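A toy sketch of the bookkeeping this involves, assuming a simplified model in which main memory is divided into fixed-size frames (real memory managers add paging, virtual addresses, and swapping):

# Toy memory manager: main memory modeled as a list of fixed-size frames.
FRAME_COUNT = 8
frames = [None] * FRAME_COUNT  # None means the frame is free

def load_program(name, frames_needed):
    """Place a program into free frames, or fail if memory is full."""
    free = [i for i, owner in enumerate(frames) if owner is None]
    if len(free) < frames_needed:
        return None  # not enough main memory for this program
    for i in free[:frames_needed]:
        frames[i] = name
    return free[:frames_needed]

print(load_program("editor", 3))   # e.g. frames [0, 1, 2]
print(load_program("browser", 4))  # e.g. frames [3, 4, 5, 6]
print(load_program("game", 4))     # None: memory must be managed or freed first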
5. Multitasking
It is performing more than one task at a time. The OS allows the user to work with more than one process at a time without any problem.
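A minimal sketch of multitasking using Python threads; the scheduler interleaves the two tasks so that both appear to run at the same time:

import threading
import time

def task(name, delay):
    for step in range(3):
        print(f"{name}: step {step}")
        time.sleep(delay)  # yields the CPU so the other task can make progress

# Two tasks "at the same time": the OS scheduler switches between them.
t1 = threading.Thread(target=task, args=("download", 0.1))
t2 = threading.Thread(target=task, args=("playback", 0.1))
t1.start()
t2.start()
t1.join()
t2.join()
print("both tasks finished")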
6. Security
The OS keeps the system and programs safe and secure through authentication. A user ID and password determine the authenticity of the user.
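A minimal sketch of password-based authentication, using made-up credentials; note that a careful implementation stores only a salted hash of the password, never the password itself:

import hashlib
import hmac
import os

def make_record(password):
    """Store a salted hash of the password, never the plain text."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def authenticate(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = make_record("s3cret")         # made-up password, for illustration
print(authenticate("s3cret", salt, digest))  # True: user is authentic
print(authenticate("guess", salt, digest))   # False: access denied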
7. User Interface
GUI stands for Graphical User Interface. As the name suggests, it provides a graphical interface for the user to interact with the
computer. It uses icons, menus, etc. to interact with the user. Moreover, the user can easily interact by just clicking these items.
Therefore, it is very user-friendly and there is no need to remember any commands.
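A minimal GUI sketch using Python's built-in tkinter toolkit: the user interacts by clicking a button instead of remembering commands (requires a desktop environment to display the window).

import tkinter as tk

# One window, one label, one clickable button: no commands to remember.
root = tk.Tk()
root.title("GUI example")

label = tk.Label(root, text="Click the button")
label.pack(padx=20, pady=10)

button = tk.Button(root, text="Press me",
                   command=lambda: label.config(text="Button clicked!"))
button.pack(padx=20, pady=10)

root.mainloop()  # hand control to the event loop until the window is closed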
Wearable Technology
The most sophisticated examples of wearable technology include artificial intelligence (AI) hearing aids, Google Glass and
Microsoft's HoloLens, and a holographic computer in the form of a virtual reality (VR) headset. An example of a less complex
form of wearable technology is a disposable skin patch with sensors that transmit patient data wirelessly to a control device in a
healthcare facility.
Wearables are embedded with built-in sensors that keep track of bodily movements, provide biometric identification or assist
with location tracking. For example, activity trackers or smartwatches -- the most common types of wearables -- come with a
strap that wraps around the user's wrist to monitor their physical activities or vitals throughout the day.
While most wearables are either worn on the body or are attached to clothing, some function without any physical contact with
the user. Cell phones, smart tags or computers can still be carried around and track user movements. Other wearables use
remote smart sensors and accelerometers to track movements and speed, and some use optical sensors for measuring heart rate
or glucose levels. A common factor among these technology wearables is the fact they all monitor data in real time.
The following are the most popular current and next-generation applications of wearable technology:
Epidermal skin technology. According to ScienceDaily, the Terasaki Institute for Biomedical Innovation invented wearable
electronic skin for monitoring health. A next-generation wearable, this ultra-thin e-skin patch can be attached to the wearer's chest area along with a small wireless transmitter by using water spray, and can be worn for up to a week. It is sensitive enough to pick up and record electrical signals, such as heartbeats and muscle movements, which can be sent to healthcare providers via
the cloud so they can monitor the user's vitals remotely. This powerful wearable is a steppingstone for monitoring chronic
illnesses such as heart failure and diabetes.
Health monitoring. People use wearable technology to track and receive notifications for their heart rate and blood pressure,
watch their calorie intake or manage their training regimens. The COVID-19 pandemic boosted the use of wearable technology,
as consumers gained a broader awareness of personal hygiene and taking precautions to prevent the spread of infections. Apple,
for instance, updated its Cardiogram app by introducing a new sleeping beats-per-minute feature that monitors heart rate
fluctuations for COVID-19 patients.
Entertainment and gaming. The gaming and entertainment industries were the first to adopt VR headsets, smart glasses and
controllers. Popular VR head-mounted displays, such as Oculus Quest, Meta Quest and Sony PlayStation VR, are used for all
types of entertainment purposes, including gaming, watching movies and virtual traveling.
Fashion and smart clothing. Smart clothing, also known as intelligent fashion, has been gaining wide popularity over the past few years. Smart jackets, such as Levi's jacket made with Google's Project Jacquard technology, whose threads are composed of electrical fibers, enable the wearer to answer calls, play music or take photos right from their sleeves.
Smartwatches, wristbands, smart shoes and smart jewelry are also popular examples of wearable technology.
Military. These wearables include technology that tracks soldiers' vitals, VR-based simulation exercises and sustainability
technology, such as boot inserts that estimate how well the soldiers are holding their equipment weight and how terrain factors
can affect their performance.
Sports and fitness. Sports use wearable athletic devices that are either built into the fabric of the sports apparel or are
incorporated into sports equipment, such as bats and balls. The GPS and Bluetooth-linked devices relay real-time data to
coaches for analysis through connected electronic devices such as laptops. Besides wearable athletic devices, familiar wearable
technology such as Fitbit, Apple Watch, Garmin, Samsung Galaxy Watch, and Polar are used extensively to track various areas
of the player's health and performance metrics.
The following is a brief history showcasing the various turns wearable technology has taken over time:
1960s. In 1961, Edward Thorp and Claude Shannon created wearable technology in the form of a tiny four-button computer
that could fit into a shoe or be strapped around the user's waist. It was created to help gamblers in casinos cheat at roulette
games, as the computer acted as a timing device to predict where the ball would land.
1970s. Wearable tech gained popularity during this decade. The first calculator wristwatch was released in 1975 by Pulsar
and quickly became a fashion statement, as many celebrities, including The Police's lead singer Sting, were seen wearing it. Other
companies, including Casio, released watches well into the 80s and Marty McFly was seen wearing the Casio CA53W
calculator watch in the movie Back to the Future.
1980s. Sony released the Walkman in 1979 and it became the most popular wearable music device throughout the 80s. The
healthcare industry was also transformed during this decade with the release of the first digital hearing aids in 1987.
1990s. Steve Mann, a Canadian researcher, invented the wearable wireless webcam in 1994. This bulky webcam facilitated
the use of future IoT technologies. Smart clothing expos and wearable technology conferences also spiked in popularity
during the 90s.
2000s. This decade saw an explosion in wearable technology with the introduction of Bluetooth headsets, Fitbits and the
Nike plus iPod Sport Kit.
2010s. This period was the tipping point for wearable technology. Google Glass entered the scene in 2013, while the Apple
Watch debuted in 2015 and was followed by the Oculus Rift headset in 2016.
2020s. The gaming industry continues to add newer AR and VR headsets, while clothing designers are rapidly bringing
smart clothing to the mainstream.
The future of wearable technology
Wearable technology is becoming increasingly popular and is all set to revolutionize the future. While fitness trackers, smart
devices, intelligent clothing and VR and AR headsets have gained widespread approval, they are only the tip of the iceberg.
The following are some futuristic products and concepts predicted by tech experts and how they will shape wearable technology
going forward:
Apple Glasses. Initial reports from Bloomberg and The Information suggest that Apple Glasses could be released by 2023.
These AR smart glasses are designed to transfer information from a user's phone to their face. These glasses will be able to
synchronize with a wearer's iPhone to display texts, emails, games and other items over the user's field of vision.
Energy harvesting. One drawback of using wearable technology is that it must be taken off for regular charging. Energy
harvesting is being researched and could prolong battery life by converting body heat, movement or solar energy into power.
Piezoelectricity is one example of energy harvesting where piezoelectric ceramic can be used to convert the body vibrations
produced during movement into energy.
Smart contact lenses. Nothing short of a sci-fi movie, smart contact lenses that can deliver real-time information to the human
eye will be available to consumers soon. Tech giants, including Google, Mojo Vision, Samsung and Sony, are working on
developing these soft electronic smart contact lenses that can sync up with smartphones or other external devices to provide
real-time, hands-free information along with vision correction.
AI for the human brain. AI-integrated non-invasive sensors that help with performing functions associated with thinking are
currently being developed. Facebook is developing a brain-computer interface that could enable people to type Facebook
status updates by using their minds instead of typing. Elon Musk's company Neuralink is also working on an interface that could
help people who suffer from traumatic brain injuries.
Cloud Computing
Cloud computing service models are based on the concept of sharing on-demand computing resources, software, and
information over the internet. Companies or individuals pay to access a virtual pool of shared resources, including compute,
storage, and networking services, which are located on remote servers that are owned and managed by service providers.
One of the many advantages of cloud computing is that you only pay for what you use. This allows organizations to scale faster
and more efficiently without the burden of having to buy and maintain their own physical data centers and servers.
In simpler terms, cloud computing uses a network (most often, the internet) to connect users to a cloud platform where they
request and access rented computing services. A central server handles all the communication between client devices and
servers to facilitate the exchange of data. Security and privacy features are common components to keep this information secure
and safe.
When adopting cloud computing architecture, there is no one-size-fits-all. What works for another company may not suit you
and your business needs. In fact, this flexibility and versatility is one of the hallmarks of cloud, allowing enterprises to quickly
adapt to changing markets or metrics.
There are three different cloud computing deployment models: public cloud, private cloud, and hybrid cloud.
Types of cloud computing deployment models
Public cloud
Public clouds are run by third-party cloud service providers. They offer compute, storage, and network resources over the
internet, enabling companies to access shared on-demand resources based on their unique requirements and business goals.
Private cloud
Private clouds are built, managed, and owned by a single organization and privately hosted in their own data centers, commonly
known as “on-premises” or “on-prem.” They provide greater control, security, and management of data while still enabling
internal users to benefit from a shared pool of compute, storage, and network resources.
Hybrid cloud
Hybrid clouds combine public and private cloud models, allowing companies to leverage public cloud services and maintain the
security and compliance capabilities commonly found in private cloud architectures.
What are the types of cloud computing services?
There are three main types of cloud computing service models that you can select based on the level of control, flexibility, and
management your business needs:
Infrastructure as a service (IaaS) offers on-demand access to IT infrastructure services, including compute, storage, networking,
and virtualization. It provides the highest level of control over your IT resources and most closely resembles traditional on-
premises IT resources.
Platform as a service (PaaS) offers all the hardware and software resources needed for cloud application development. With
PaaS, companies can focus fully on application development without the burden of managing and maintaining the underlying
infrastructure.
Software as a service (SaaS) delivers a full application stack as a service, from underlying infrastructure to maintenance and
updates to the app software itself. A SaaS solution is often an end-user application, where both the service and the infrastructure
is managed and maintained by the cloud service provider.
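To make the IaaS model concrete, the Python sketch below provisions a virtual machine through a hypothetical REST API; the endpoint, token, and payload fields are illustrative assumptions, not any particular provider's actual API. With PaaS and SaaS, progressively more of this request would disappear, because the provider rather than the customer chooses and manages the underlying machines.

import requests  # third-party HTTP library (pip install requests)

# Hypothetical IaaS endpoint and credentials, for illustration only.
API = "https://cloud.example.com/v1"
HEADERS = {"Authorization": "Bearer <your-api-token>"}

# Ask the provider for a small VM: this is the "infrastructure" being rented.
spec = {"name": "web-server-1", "cpus": 2, "memory_gb": 4, "disk_gb": 50}
resp = requests.post(f"{API}/instances", json=spec, headers=HEADERS)
resp.raise_for_status()
print("provisioned instance:", resp.json().get("id"))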
What are the benefits of cloud computing?
It’s flexible
Due to the architecture of cloud computing, enterprises and their users can access cloud services from anywhere with an
internet connection, scaling services up or down as needed.
It’s efficient
Enterprises can develop new applications and rapidly get them into production—without worrying about the underlying
infrastructure.
It offers strategic value
Because cloud providers stay on top of the latest innovations and offer them as services to customers, enterprises can get
more competitive advantages—and a higher return on investment—than if they’d invested in soon-to-be obsolete
technologies.
It’s secure
Enterprises often ask, "What are the security risks of cloud computing?" They are considered relatively low. Cloud computing
security is generally recognized as stronger than that in enterprise data centers, because of the depth and breadth of the
security mechanisms cloud providers put into place. Plus, cloud providers’ security teams are known as top experts in the
field.
It’s cost-effective
Whatever cloud computing service model is used, enterprises only pay for the computing resources they use. They don’t need
to overbuild data center capacity to handle unexpected spikes in demand or business growth, and they can deploy IT staff to
work on more strategic initiatives.
Virtualization
Virtualization is a process that allows for more efficient utilization of physical computer hardware and is the foundation of
cloud computing.
Virtualization uses software to create an abstraction layer over computer hardware that allows the hardware elements of a single
computer—processors, memory, storage and more—to be divided into multiple virtual computers, commonly called virtual
machines (VMs). Each VM runs its own operating system (OS) and behaves like an independent computer, even though it is
running on just a portion of the actual underlying computer hardware.
It follows that virtualization enables more efficient utilization of physical computer hardware and allows a greater return on an
organization’s hardware investment.
Today, virtualization is a standard practice in enterprise IT architecture. It is also the technology that drives cloud
computing economics. Virtualization enables cloud providers to serve users with their existing physical computer hardware; it
enables cloud users to purchase only the computing resources they need when they need it, and to scale those resources cost-
effectively as their workloads grow.
Benefits of virtualization
Virtualization brings several benefits to data center operators and service providers:
Resource efficiency: Before virtualization, each application server required its own dedicated physical CPU—IT staff
would purchase and configure a separate server for each application they wanted to run. (IT preferred one application and
one operating system (OS) per computer for reliability reasons.) Invariably, each physical server would be underused. In
contrast, server virtualization lets you run several applications—each on its own VM with its own OS—on a single
physical computer (typically an x86 server) without sacrificing reliability. This enables maximum utilization of the
physical hardware’s computing capacity.
Easier management: Replacing physical computers with software-defined VMs makes it easier to use and manage
policies written in software. This allows you to create automated IT service management workflows. For example,
automated deployment and configuration tools enable administrators to define collections of virtual machines and
applications as services, in software templates. This means that they can install those services repeatedly and consistently
without cumbersome, time-consuming, and error-prone manual setup. Admins can use virtualization security policies to
mandate certain security configurations based on the role of the virtual machine. Policies can even increase resource
efficiency by retiring unused virtual machines to save on space and computing power.
Minimal downtime: OS and application crashes can cause downtime and disrupt user productivity. Admins can run
multiple redundant virtual machines alongside each other and failover between them when problems arise. Running
multiple redundant physical servers is more expensive.
Faster provisioning: Buying, installing, and configuring hardware for each application is time-consuming. Provided that
the hardware is already in place, provisioning virtual machines to run all your applications is significantly faster. You can
even automate it using management software and build it into existing workflows.
Types of virtualization
Desktop virtualization
Network virtualization
Storage virtualization
Data virtualization
Application virtualization
Data center virtualization
CPU virtualization
GPU virtualization
Linux virtualization
Cloud virtualization
Desktop virtualization
Desktop virtualization lets you run multiple desktop operating systems, each in its own VM on the same computer.
Virtual desktop infrastructure (VDI) runs multiple desktops in VMs on a central server and streams them to users who
log in on thin client devices. In this way, VDI lets an organization provide its users access to a variety of OSs from any device, without installing those OSs on any device.
Local desktop virtualization runs a hypervisor on a local computer, enabling the user to run one or more additional OSs on that computer and switch from one OS to another as needed, without changing anything about the primary OS.
Network virtualization
Network virtualization uses software to create a “view” of the network that an administrator can use to manage the network
from a single console. It abstracts hardware elements and functions (e.g., connections, switches, and routers) into software running on a hypervisor. The network administrator can modify and control these elements without touching the
underlying physical components, which dramatically simplifies network management.
Types of network virtualization include software-defined networking (SDN), which virtualizes hardware that controls network
traffic routing (called the “control plane”), and network function virtualization (NFV), which virtualizes one or more
hardware appliances that provide a specific network function (e.g., a firewall, load balancer, or traffic analyzer), making those
appliances easier to configure, provision, and manage.
Storage virtualization
Storage virtualization enables all the storage devices on the network— whether they’re installed on individual servers or
standalone storage units—to be accessed and managed as a single storage device. Specifically, storage virtualization amasses all
blocks of storage into a single shared pool from which they can be assigned to any VM on the network as needed. Storage
virtualization makes it easier to provision storage for VMs and makes maximum use of all available storage on the network.
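A toy sketch of the pooling idea: blocks from several made-up devices are amassed into one pool and handed out to VMs, which never see which physical device a block came from.

# Toy storage pool: blocks from several devices amassed into one shared pool.
pool = []
for device, block_count in [("disk-a", 4), ("disk-b", 4), ("nas-1", 4)]:
    pool.extend((device, block) for block in range(block_count))

def assign_blocks(vm_name, count):
    """Hand out blocks to a VM; it never sees which device they came from."""
    blocks = [pool.pop(0) for _ in range(min(count, len(pool)))]
    print(vm_name, "got:", blocks)
    return blocks

assign_blocks("vm-1", 5)  # spans disk-a and disk-b transparently
assign_blocks("vm-2", 3)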
Data virtualization
Modern enterprises store data from multiple applications, using multiple file formats, in multiple locations, ranging from the
cloud to on-premises hardware and software systems. Data virtualization lets any application access all of that data—irrespective
of source, format, or location.
Data virtualization tools create a software layer between the applications accessing the data and the systems storing it. The layer
translates an application’s data request or query as needed and returns results that can span multiple systems. Data virtualization
can help break down data silos when other types of integration aren’t feasible, desirable, or affordable.
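A minimal sketch of such a layer, assuming two made-up sources (a list standing in for a database and an in-memory CSV standing in for a file); the application issues one query and never sees where the rows live.

import csv
import io

# Two "systems" holding customer data in different formats (made-up data).
database_rows = [{"name": "Ada", "city": "London"}]
csv_file = io.StringIO("name,city\nGrace,Arlington\n")

def query_customers(city=None):
    """The virtualization layer: one query that spans both sources."""
    csv_file.seek(0)  # rewind the file-like source before each query
    rows = list(database_rows) + list(csv.DictReader(csv_file))
    return [r for r in rows if city is None or r["city"] == city]

print(query_customers())               # rows from both sources
print(query_customers(city="London"))  # filtered across both sources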
Application virtualization
Application virtualization runs application software without installing it directly on the user’s OS. This differs from complete
desktop virtualization (mentioned above) because only the application runs in a virtual environment—the OS on the end user’s
device runs as usual. There are three types of application virtualization:
Local application virtualization: The entire application runs on the endpoint device but runs in a runtime environment
instead of on the native hardware.
Application streaming: The application lives on a server which sends small components of the software to run on the
end user's device when needed.
Server-based application virtualization: The application runs entirely on a server that sends only its user interface to the
client device.
Data center virtualization
Data center virtualization abstracts most of a data center’s hardware into software, effectively enabling an administrator to
divide a single physical data center into multiple virtual data centers for different clients.
Each client can access its own infrastructure as a service (IaaS), which would run on the same underlying physical hardware.
Virtual data centers offer an easy on-ramp into cloud-based computing, letting a company quickly set up a complete data center
environment without purchasing infrastructure hardware.
CPU virtualization
CPU (central processing unit) virtualization is the fundamental technology that makes hypervisors, virtual machines, and
operating systems possible. It allows a single CPU to be divided into multiple virtual CPUs for use by multiple VMs.
At first, CPU virtualization was entirely software-defined, but many of today’s processors include extended instruction sets that
support CPU virtualization, which improves VM performance.
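On Linux, you can check whether the processor advertises these extended instruction sets; the short sketch below (Linux-only, since it reads /proc/cpuinfo) looks for Intel's vmx flag or AMD's svm flag.

# Linux-only sketch: detect hardware CPU virtualization support.
with open("/proc/cpuinfo") as f:
    cpuinfo = f.read()

if "vmx" in cpuinfo:
    print("Intel VT-x (vmx) detected: hardware-assisted CPU virtualization")
elif "svm" in cpuinfo:
    print("AMD-V (svm) detected: hardware-assisted CPU virtualization")
else:
    print("no virtualization flags found (or disabled in firmware)")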
GPU virtualization
A GPU (graphics processing unit) is a special multi-core processor that improves overall computing performance by taking
over heavy-duty graphic or mathematical processing. GPU virtualization lets multiple VMs use all or some of a single GPU’s
processing power for faster video, artificial intelligence (AI), and other graphic- or math-intensive applications.
Pass-through GPUs make the entire GPU available to a single guest OS.
Shared vGPUs divide physical GPU cores among several virtual GPUs (vGPUs) for use by server-based VMs.
Linux virtualization
Linux includes its own hypervisor, called the kernel-based virtual machine (KVM), which supports Intel and AMD’s
virtualization processor extensions so you can create x86-based VMs from within a Linux host OS.
As an open source OS, Linux is highly customizable. You can create VMs running versions of Linux tailored for specific
workloads or security-hardened versions for more sensitive applications.
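A minimal sketch using the libvirt Python bindings (the libvirt-python package), assuming a Linux host with KVM and the libvirt daemon running; it connects to the local hypervisor and lists the virtual machines it manages.

import libvirt  # Python bindings for libvirt (pip install libvirt-python)

# Connect to the local QEMU/KVM hypervisor through the libvirt daemon.
conn = libvirt.open("qemu:///system")
print("hypervisor type:", conn.getType())  # e.g. "QEMU"

for dom in conn.listAllDomains():
    state, _reason = dom.state()
    running = state == libvirt.VIR_DOMAIN_RUNNING
    print("VM:", dom.name(), "(running)" if running else "(not running)")

conn.close()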
Cloud virtualization
As noted above, the cloud computing model depends on virtualization. By virtualizing servers, storage, and other physical data
center resources, cloud computing providers can offer a range of services to customers, including the following:
Infrastructure as a service (IaaS): Virtualized server, storage, and network resources you can configure based on your requirements.
Platform as a service (PaaS): Virtualized development tools, databases, and other cloud-based services you can use to
build your own cloud-based applications and solutions.
Software as a service (SaaS): Software applications you use on the cloud. SaaS is the cloud-based service most
abstracted from the hardware.