AI Analyst Question And Answers
Short questions:-
1. What is AI? Who proposed the concept of AI?
A.) The term AI was introduced by Prof. John McCarthy at a conference at Dartmouth College
in 1956. McCarthy defined AI as the “science and engineering of making intelligent machines,
especially intelligent computer programs”.
Strong AI examples: DeepMind, the Human Brain Project (an academic project based in
Lausanne, Switzerland), and OpenAI.
Weak AI example: chatbots.
11. Name some of the computer chip architectures developed for AI?
A.) In recent years, big data and the ability to process a large amount of data at high speeds have
enabled researchers and developers to access and work with massive sets of data. Processing
speeds and new computer chip architectures contribute to the rapid evolution of AI applications.
Large manufacturers of computer chips, such as IBM and Intel, are prototyping “brain-like” chips
whose architecture is configured to mimic the biological brain’s network of neurons and the
connections between them, called synapses.
IBM AI Research
Intel’s New Self-Learning Chip Promises to Accelerate Artificial Intelligence
12. Name some of the cloud computing platforms used for AI research purposes?
A.) All the significant companies in the AI services market deliver their services and tools on the
internet through APIs over cloud platforms, for example:
IBM delivers Watson AI services over IBM Cloud.
Amazon AI services are delivered over Amazon Web Services (AWS).
Microsoft AI tools are available over the MS Azure cloud.
Google AI services are available in the Google Cloud Platform.
These services benefit from cloud platform capabilities, such as availability, scalability,
accessibility, rapid deployment, flexible billing options, and simpler operations and
management.
Long Questions:-
Traditional Analytics
This example is for large, structured implementations that support high demand, as shown
in the following figure.
This architecture is rigid, so getting responses quickly can be difficult. The data must be of
assured quality and kept secure, and controls and governance must be in place to ensure
that the data is seen only by people who are authorized to access it.
In this architecture, data is pulled from transactional systems, such as a supply chain, points
of sale, or finance systems. The data is extracted, transformed, and loaded (by using ETL
tools) into a data warehouse, where the reporting data is stored. Then, reporting and
analytics tools point to the reporting data to visualize the data through dashboards and
graphs.
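The extract-transform-load flow described above can be sketched in a few lines. This is a minimal illustration only; the record fields, the cleaning rule, and the in-memory "warehouse" are hypothetical stand-ins for what real ETL tools do against transactional databases and warehouse tables.

```python
# Minimal sketch of the extract-transform-load (ETL) flow described above.
# Record fields and the in-memory "warehouse" are hypothetical examples.

def extract(transactional_rows):
    """Extract: pull raw rows from a transactional system (e.g., point of sale)."""
    return list(transactional_rows)

def transform(rows):
    """Transform: clean and reshape rows into the reporting schema."""
    return [
        {"store": r["store"].strip().upper(), "revenue": round(r["qty"] * r["price"], 2)}
        for r in rows
        if r["qty"] > 0  # discard invalid transactions during cleaning
    ]

def load(warehouse, rows):
    """Load: append the transformed rows into the reporting store."""
    warehouse.extend(rows)
    return warehouse

warehouse = []
raw = [
    {"store": " berlin ", "qty": 3, "price": 9.99},
    {"store": "paris", "qty": 0, "price": 4.50},  # filtered out by cleaning
]
load(warehouse, transform(extract(raw)))
print(warehouse)  # one cleaned reporting row for BERLIN
```

Reporting and analytics tools would then point at the loaded rows, as the text describes.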
In this example, data is pulled from Twitter, Facebook, messaging applications, and
websites into a Hadoop cluster. Hadoop can ingest large volumes of unstructured data. A
subset of the data (after cleaning), which now might be structured, is pulled into a data
warehouse like the one described in Example 1. The reporting data stored in the data
warehouse is then accessed by reporting tools to visualize the data through dashboards
and graphs.
2. What is IBM Watson? Explain the Use cases of IBM Watson AI in detail?
A.) IBM Watson is a cognitive system that enables a new partnership between people and
computers. It is the AI offering from IBM. Watson combines five core capabilities:
Use cases:-
1. OmniEarth : OmniEarth Inc. builds scalable solutions for processing, clarifying, and fusing
large amounts of satellite and aerial imagery with other data sets.
Implementation:-
OmniEarth uses IBM Watson to identify topographical features in satellite images, giving
water districts insight into dynamic patterns of water consumption and weather. OmniEarth
is using the solution to develop water conservation strategies for drought-stricken areas.
OmniEarth is helping water utilities within the State of California to analyze aerial images to
monitor water consumption on each parcel of land across the state.
2. Woodside Energy: Woodside Energy is Australia’s largest publicly traded oil and gas
exploration and production company and one of the nation’s most successful explorers,
developers, and producers of oil and gas.
Implementation: Working with Watson, Woodside Energy built a customized tool that
allows its employees to find detailed answers to highly specific questions, even at remote
oil and gas facilities (Figure 4). Watson ingested the equivalent of 38,000 Woodside
documents, which would take a human over five years to read.
• Over 38,000 Woodside documents were loaded into Watson on IBM Cloud, the
equivalent of 30 years of practical engineering experience.
• With this data, Watson considers historical context and procedural information on
operations, equipment, weather, tidal currents, and more.
This section discusses key capabilities that had to be developed to make question-answering
systems practical in real-world applications (and why the DeepQA architecture had to
evolve).
At a high level, DeepQA generates and scores many hypotheses by using an extensible
collection of natural language processing, machine learning, and reasoning algorithms, which
gather and weigh evidence over both unstructured and structured content to determine the
answer with the best confidence.
The primary computational principle supported by DeepQA was to assume and pursue
multiple interpretations of the question, to generate many plausible answers or hypotheses,
and to collect and evaluate many competing evidence paths that might support or refute
those hypotheses (see Figure 5).
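The DeepQA principle above can be illustrated with a toy sketch: generate several candidate answers (hypotheses), weigh evidence for and against each one from multiple scorers, and return the answer with the best combined confidence. The candidates, scorers, and weights below are invented for illustration; they are not DeepQA's actual algorithms.

```python
# Toy sketch of the DeepQA principle: score many hypotheses against
# multiple evidence sources and pick the one with the best confidence.
# The scorers and weights here are hypothetical examples.

def best_answer(hypotheses, evidence_scorers, weights):
    """Weigh evidence for and against each hypothesis; return the top one."""
    scored = []
    for h in hypotheses:
        # Each scorer returns a value in [-1, 1]: positive supports, negative refutes.
        confidence = sum(w * scorer(h) for scorer, w in zip(evidence_scorers, weights))
        scored.append((confidence, h))
    return max(scored)  # (confidence, hypothesis) with the highest confidence

# Hypothetical evidence scorers: lexical overlap with a passage, and a
# check that the candidate has the expected answer type (a proper noun).
overlap = lambda h: 1.0 if "Toronto" in h else 0.2
type_ok = lambda h: 1.0 if h.istitle() else -0.5

conf, answer = best_answer(["Toronto", "chicago"], [overlap, type_ok], [0.7, 0.3])
print(answer, round(conf, 2))  # Toronto 1.0
```

The key idea the sketch preserves is that no single interpretation decides the answer; competing evidence paths are combined into a confidence score.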
To understand the future of AI, placing it in the historical context is important. To date, two
distinct eras of computing have occurred: the tabulating era and the programming era. We are
entering the third and most transformational era in computing’s evolution, the AI computing era
(also known as the cognitive era).
6. Explain how mapping human thinking to artificial intelligence components can improve AI?
A.) Mapping human thinking to artificial intelligence components:
Because AI is the science of simulating human thinking, it is possible to map the human
thinking stages to the layers or components of AI systems.
In the first stage, humans acquire information from their surrounding environment through
the senses, such as sight, hearing, smell, taste, and touch, by using organs such as the
eyes, the ears, and other sensing organs, for example, the hands.
In AI models, this stage is represented by the sensing layer, which perceives information from
the surrounding environment. This information is specific to the AI application. For example,
there are sensing agents such as voice recognition for sensing voice and visual imaging
recognition for sensing images. Thus, these agents or sensors take the role of the hearing
and sight senses in humans.
The second stage is related to interpreting and evaluating the input data. In AI, this stage is
represented by the interpretation layer, that is, reasoning and thinking about the input that
is acquired by the sensing layer.
The third stage is related to taking action or making decisions. After evaluating the input
data, the interacting layer performs the necessary tasks. Robotic movement control and
speech generation are examples of functions that are implemented in the interacting layer.
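The three layers mapped above can be sketched as a simple pipeline: sensing (acquire input), interpretation (reason about it), and interacting (act on the conclusion). The voice-command domain and the rules below are hypothetical examples chosen only to make the layering concrete.

```python
# Minimal sketch of the three AI layers described above.
# The "voice command" domain and its rules are hypothetical examples.

def sensing_layer(raw_audio_text):
    """Stands in for a sensor/recognizer: perceives input from the environment."""
    return raw_audio_text.lower().strip()

def interpretation_layer(perceived):
    """Reasons about the perceived input and decides what it means."""
    if "lights on" in perceived:
        return "turn_lights_on"
    if "lights off" in perceived:
        return "turn_lights_off"
    return "unknown"

def interacting_layer(decision):
    """Performs the task that the interpretation layer selected."""
    actions = {
        "turn_lights_on": "Lights are now ON",
        "turn_lights_off": "Lights are now OFF",
        "unknown": "Sorry, I did not understand",
    }
    return actions[decision]

# The three stages run in sequence, like the human sense -> think -> act loop.
response = interacting_layer(interpretation_layer(sensing_layer("  Lights ON please ")))
print(response)  # Lights are now ON
```

In a real system each layer would be far richer (speech recognition, reasoning models, robotic control), but the division of responsibilities is the same.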
These systems, and any AI system in general, are not built on a single component. They are
composed of multiple components and approaches for thinking and reasoning, and
simulating human thinking. These combinations represent the identity of the AI system and
distinguish one AI system from others.
Humans are constantly assessing what is happening around them, what it means, and what
will happen next. For any intelligent system, including human beings, the ability to
understand what is happening right now, make inferences about it, and predict what is
going to happen next is essential for assessing current situations and selecting the
appropriate action or decision.
For AI systems to be able to assess data, there must be enough data with which to work.
Using a large volume of data can support the learning process of even a weak AI algorithm to
identify the relevant data and discard the “noise” or irrelevant data. Availability, variety, and
volume of data are factors that contribute to enhancing the learning process and produce
more accurate results.
Many AI systems focus on prediction. To predict an outcome, AI systems use data mining or
machine learning algorithms. There are different techniques that are used for prediction,
such as regression analysis, which is a set of statistical processes for estimating the
relationship among variables. AI systems build rules that can be used to predict the events of
interest before they take place.
According to the type of predicted outcome, AI systems can be categorized into the following
two categories:
Deterministic systems, where the predicted outcome is a definite value or decision.
Probabilistic systems, where the predicted outcome is expressed with a probability or
confidence level.
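Regression analysis, mentioned above as a common prediction technique, can be shown with a minimal example: fit a least-squares line y = a + b·x to past observations and use it to predict a future value. The data points are invented for illustration.

```python
# Sketch of prediction by regression analysis: fit a least-squares line
# to past observations and predict the next value. Data is hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares for one variable: returns (intercept, slope)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical history: units sold in months 1..4.
months = [1, 2, 3, 4]
sales = [10, 12, 14, 16]  # perfectly linear, so the fit is exact

a, b = fit_line(months, sales)
prediction = a + b * 5  # predict month 5
print(prediction)  # 18.0
```

A deterministic system would report the predicted value as-is; a probabilistic system would attach a confidence interval or probability to it.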
In general, application programming interfaces (APIs) expose capabilities and services. APIs
enable software components to communicate with each other easily. Using APIs as a
method for integration injects a level of flexibility into the application lifecycle by making it
easier to connect and interface with other applications or services. APIs abstract the underlying
workings of a service, application, or tool, and expose only what a developer needs, so
programming becomes easier and faster.
AI APIs are usually delivered on an open cloud-based platform on which developers can infuse AI
capabilities into digital applications, products, and operations by using one or more of the
available APIs.
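The pattern described above — an authenticated JSON request over HTTPS to a cloud-hosted AI API — can be sketched as follows. The endpoint URL, header names, and payload shape here are hypothetical placeholders; each real service (Watson, AWS, Azure, Google Cloud) defines its own API, but the overall shape of the call is similar.

```python
# Sketch of calling a cloud AI service through its API. The endpoint and
# payload shape are hypothetical; real services define their own APIs.

import json
import urllib.request

def build_sentiment_request(api_key, text):
    """Build an authenticated JSON request for a hypothetical sentiment API."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        "https://api.example.com/v1/sentiment",  # placeholder endpoint
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # typical token-based auth
        },
        method="POST",
    )

req = build_sentiment_request("MY_API_KEY", "I love this product")
print(req.full_url, req.get_method())
# Actually sending it would be urllib.request.urlopen(req); that step is
# omitted here because the endpoint is only a placeholder.
```

Because the API hides the model and infrastructure behind the service, the developer only deals with this request/response surface, which is exactly the abstraction benefit described above.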
All the significant companies in the AI services market deliver their services and tools on the
internet through APIs over cloud platforms, for example: