
Artificial Intelligence



Chapter 01
WHAT IS ARTIFICIAL INTELLIGENCE?
AI is intelligence demonstrated by machines, as
opposed to the natural intelligence displayed by
humans or animals.
Artificial intelligence is a wide-ranging branch of
computer science concerned with building smart
machines capable of performing tasks that typically
require human intelligence.
AI is the creation of software that imitates human
behaviours and capabilities.
Key workloads include:
◦ Machine learning - This is often the foundation for an AI
system, and is the way we "teach" a computer model to
make predictions and draw conclusions from data.
◦ Anomaly detection - The capability to automatically detect
errors or unusual activity in a system.
◦ Computer vision - The capability of software to interpret
the world visually through cameras, video, and images.
◦ Natural language processing - The capability for a computer to
interpret written or spoken language, and respond in kind.
◦ Knowledge mining - The capability to extract information from
large volumes of often unstructured data to create a searchable
knowledge store.
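The machine learning workload above - "teaching" a model to make predictions from data - can be sketched in a few lines of plain Python. This is a toy one-variable linear model with made-up numbers, not any particular Azure service:

```python
# Toy illustration of "teaching" a model: fit y = w*x + b by least squares.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return w, b

# "Training data": hours studied vs. exam score (invented numbers).
hours = [1, 2, 3, 4, 5]
scores = [52, 55, 61, 64, 70]
w, b = fit_line(hours, scores)

def predict(x):
    # Use the learned parameters to make a prediction for new input.
    return w * x + b
```

Real ML systems use the same pattern - learn parameters from example data, then predict on new inputs - just with far more data, features, and model complexity.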
AZURE:

◦ Microsoft Azure, often referred to simply as Azure, is a
cloud computing platform operated by Microsoft for
application management via Microsoft-managed data centers.
◦ Microsoft Azure offers multiple service models, such as
software as a service (SaaS), platform as a service
(PaaS), and infrastructure as a service (IaaS), and supports
many different programming languages, tools, and
frameworks, including both Microsoft-specific and
third-party software and systems.
Why Azure for AI?
Build on your terms
◦ Access frameworks, tools, and capabilities for
developers and data scientists of any skill level.
Deploy mission-critical AI solutions
◦ Use the same proven AI services that power AI
capabilities in Xbox, HoloLens, and Microsoft
Teams.
Apply AI responsibly
◦ Get tools, services, and guidelines to help you use
AI responsibly, while also preserving data privacy,
transparency, and trust.
Tools for visualizing machine learning experiments
Neptune:
◦ Neptune is a metadata store for MLOps, built for
teams that run a lot of experiments.
◦ It gives you a single place to log, store, display,
organize, compare, and query all your model-building
metadata.
◦ Neptune is used for:
 Experiment tracking: Log, display, organize, and compare ML
experiments in a single place.
 Model registry: Version, store, manage, and query trained models and
model building metadata.
 Monitoring ML runs live: Record and monitor model training,
evaluation, or production runs live.
Weights & Biases:
◦ Weights & Biases is a machine learning platform for
developers to build better models faster.
◦ It lets you quickly track experiments, version and iterate on
datasets, evaluate model performance, reproduce models,
visualize results and spot regressions, and share findings with
colleagues.
Comet:
◦ Comet is an ML platform that helps data scientists track,
compare, explain and optimize experiments and models across
the model’s entire lifecycle, i.e. from training to production.
◦ Comet is available for teams, individuals, academics,
organizations, and anyone who wants to easily visualize
experiments, facilitate work, and run experiments.
◦ It can be used as a hosted platform or deployed on-premise.
Sacred + Omniboard:
◦ Sacred is open-source software that allows machine learning
researchers to configure, organize, log, and reproduce
experiments.
◦ Sacred does not come with its own UI, but there are a few
dashboarding tools that you can connect to it, such as
Omniboard.
MLflow:
 MLflow is an open-source platform that helps manage
the whole machine learning lifecycle.
 This includes experimentation, but also model storage,
reproducibility, and deployment.
 Each of these four elements is represented by one
MLflow component:
 Tracking, Model Registry, Projects, and Models.
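The kind of metadata an experiment-tracking component records can be illustrated with a minimal in-memory tracker. This is a toy sketch of the concept, not the actual MLflow (or Neptune/W&B/Comet) API:

```python
# Minimal in-memory experiment tracker, mimicking what a tracking
# component records: run parameters and per-step metric values.
class Run:
    def __init__(self, name):
        self.name = name
        self.params = {}
        self.metrics = {}  # metric name -> list of (step, value) pairs

    def log_param(self, key, value):
        # Parameters are fixed inputs to the run (e.g. hyperparameters).
        self.params[key] = value

    def log_metric(self, key, value, step=0):
        # Metrics are time-series values recorded as training progresses.
        self.metrics.setdefault(key, []).append((step, value))

run = Run("baseline")
run.log_param("learning_rate", 0.01)
for step, loss in enumerate([0.9, 0.5, 0.3]):
    run.log_metric("loss", loss, step=step)
```

A real tracking server adds persistence, a UI for comparing runs, and artifact storage, but the core record - params, metrics, and run identity - is the same.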
Computer vision
Computer vision is a field of artificial intelligence
(AI) that enables computers and systems to derive
meaningful information from digital images,
videos and other visual inputs — and take actions
or make recommendations based on that
information.
Computer vision is one of the core areas of
artificial intelligence (AI), and focuses on creating
solutions that enable AI applications to "see" the
world and make sense of it.
Potential uses for computer vision include:
Content Organization:
◦ Identify people or objects in photos and organize them
based on that identification. Photo recognition
applications like this are commonly used in photo
storage and social media applications.
Text Extraction:
◦ Analyze images and PDF documents that contain text
and extract the text into a structured format.
Spatial Analysis:
◦ Identify people or objects, such as cars, in a space and
map their movement within that space.
Natural Language Processing
Natural language processing supports applications that
can see, hear, speak with, and understand users.
Using text analytics, translation, and language
understanding services, Microsoft Azure makes it easy
to build applications that support natural language.
Text Analytics Techniques
 A person will typically rely on their own experiences and
knowledge to draw insights from text.
 Text analytics is a process where an artificial intelligence
(AI) algorithm, running on a computer, evaluates the same
attributes in text to determine specific insights.
 The computer must be provided with similar knowledge to be
able to perform the task.
Statistical analysis of terms used in the text.
◦ For example, removing common "stop words" (words like
"the" or "a", which reveal little semantic information
about the text), and performing frequency analysis of the
remaining words (counting how often each word appears)
can provide clues about the main subject of the text.
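The stop-word removal and frequency analysis described above can be sketched with Python's standard library (the stop-word set here is a small sample, not a complete list):

```python
from collections import Counter

# A small sample of English stop words; real systems use longer lists.
STOP_WORDS = {"the", "a", "an", "is", "of", "and", "to", "in", "on"}

def word_frequencies(text):
    # Lowercase, strip punctuation, drop stop words, then count.
    words = [w.strip(".,!?\"'").lower() for w in text.split()]
    return Counter(w for w in words if w and w not in STOP_WORDS)

freqs = word_frequencies(
    "The cat sat on the mat. A cat and a dog played in the garden."
)
# freqs.most_common(1) -> [("cat", 2)]  "cat" emerges as the main subject
```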
Extending frequency analysis
◦ To multi-term phrases, commonly known as N-grams (a
two-word phrase is a bi-gram; a three-word phrase is a
tri-gram, and so on).
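Generating N-grams from a token list is a simple sliding-window operation, as this short sketch shows:

```python
def ngrams(words, n):
    # Slide a window of length n over the token list.
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

tokens = ["natural", "language", "processing", "rocks"]
bigrams = ngrams(tokens, 2)
# -> [("natural", "language"), ("language", "processing"),
#     ("processing", "rocks")]
```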
Applying stemming or lemmatization 
◦ Algorithms to normalize words before counting them - for
example, so that words like "power", "powered", and
"powerful" are interpreted as being the same word.
 Applying linguistic structure rules to analyze sentences
◦ for example, breaking down sentences into tree-like structures
such as a noun phrase, which contains nouns, adjectives, and
so on.
 Encoding words or terms
◦ as numeric features that can be used to train a machine learning
model. For example, to classify a text document based on the
terms it contains. This technique is often used to perform sentiment
analysis, in which a document is classified as positive or negative.
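Encoding a document as numeric features over a fixed vocabulary - the "bag of words" idea - can be sketched in a few lines:

```python
def bag_of_words(doc, vocabulary):
    # Encode a document as term counts over a fixed vocabulary.
    # The resulting vector can feed a machine learning classifier.
    words = doc.lower().split()
    return [words.count(term) for term in vocabulary]

vocab = ["good", "bad", "service", "food"]
vec = bag_of_words("good food good service", vocab)
# -> [2, 0, 1, 1]
```

A sentiment classifier would be trained on many such vectors labeled positive or negative.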
 Creating vectorized models
◦ that capture semantic relationships between words by assigning
them to locations in n-dimensional space.
◦ This modelling technique might, for example, assign values to the
words "flower" and "plant" that locate them close to one another,
while "skateboard" might be given a value that positions it much
further away.
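The "closeness" of words in such a vector space is typically measured with cosine similarity. The 3-dimensional vectors below are hand-made for illustration; real embedding models learn hundreds of dimensions from large corpora:

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors: 1.0 = same direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hand-made toy "embeddings" (not from a trained model).
vectors = {
    "flower":     [0.9, 0.8, 0.1],
    "plant":      [0.8, 0.9, 0.2],
    "skateboard": [0.1, 0.2, 0.9],
}
sim_related = cosine_similarity(vectors["flower"], vectors["plant"])
sim_unrelated = cosine_similarity(vectors["flower"], vectors["skateboard"])
# sim_related > sim_unrelated: "flower" sits closer to "plant"
```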
In Microsoft Azure, the Language cognitive service can
help simplify application development by using pre-
trained models that can:
◦ Determine the language of a document or text (for
example, French or English).
◦ Perform sentiment analysis on text to determine a
positive or negative sentiment.
◦ Extract key phrases from text that might indicate its
main talking points.
◦ Identify and categorize entities in the text. Entities can
be people, places, organizations, or even everyday
items such as dates, times, quantities, and so on.
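The sentiment-analysis capability listed above can be mimicked, very crudely, with keyword scoring. This toy stand-in is not the Azure Language service; the word lists are invented samples:

```python
# Tiny sample lexicons; real services use trained models, not word lists.
POSITIVE = {"great", "good", "excellent", "love"}
NEGATIVE = {"bad", "poor", "terrible", "hate"}

def naive_sentiment(text):
    # Count positive vs. negative keywords and compare.
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

naive_sentiment("The food was excellent and the service was great!")
# -> "positive"
```

A pre-trained service handles negation, sarcasm, and context that this keyword approach entirely misses, which is the point of using one.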
Explore conversational AI
How Does Conversational AI Work?
 Driven by underlying machine learning and deep neural
networks (DNNs), a typical conversational AI flow includes:
◦ An interface that allows the user to input text into the system or
Automatic Speech Recognition (ASR), a user interface that
converts speech into text. 
◦ Natural language processing (NLP) to extract the user's intent from
the text or audio input, and translate the text into structured data.
◦ Natural Language Understanding (NLU) to process the data based
on grammar, meaning, and context; to comprehend intent and
entity; and to act as a dialogue management unit for building
appropriate responses.
◦ An AI model that predicts the best response for the user based on
the user's intent and the AI model's training data. Natural Language
Generation (NLG) infers from the above processes, and forms an
appropriate response to interact with humans.
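The flow above - extract intent, then generate a response - can be sketched end-to-end with keyword matching and template replies. Real systems use trained models for both the "NLU" and "NLG" steps; the intents and responses here are invented for illustration:

```python
# Toy flow: crude "NLU" (keyword intent matching) followed by
# "NLG" (canned template responses).
INTENTS = {
    "greeting": {"hello", "hi", "hey"},
    "hours":    {"hours", "open", "close"},
}
RESPONSES = {
    "greeting": "Hello! How can I help you?",
    "hours":    "We are open from 9am to 5pm.",
    "fallback": "Sorry, I didn't understand that.",
}

def detect_intent(text):
    # "NLU": map user words onto a known intent, if any overlap.
    words = set(text.lower().strip("?!.").split())
    for intent, keywords in INTENTS.items():
        if words & keywords:
            return intent
    return "fallback"

def respond(text):
    # "NLG": pick the response template for the detected intent.
    return RESPONSES[detect_intent(text)]
```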
How to create Conversational AI?
 Start by understanding your use cases and requirements.
 Choose the right platform and toolkit.
 Build a prototype.
 Deploy and test your Chatbots.
 Optimize and improve your Chatbots.
Differences between Traditional and AI-Powered Chatbots
Conversational AI Challenges
 Developing natural language processing (NLP) capabilities that can
understand and interpret human interactions.
◦ This is a complex task that requires significant effort and investment in research
and development. 
 Understanding the context of a conversation in order to provide
accurate responses.
◦ This can be particularly challenging in conversations that involve multiple people
or multiple topics.
 The need for sophisticated design and development efforts to
create a customer experience that engages users and keeps
them in the conversation.
 Deploying and integrating a Conversational AI solution into an
existing business or application can be a significant challenge.
◦ Proper planning and execution are essential to ensure a successful deployment. 
 As conversational AI permeates global CX platforms, local language
support becomes a high priority.
◦ Leading brands operating worldwide can’t rely on availability in just one
language to meet local needs at scale. Building a robust conversational AI
platform to operate in regional languages, dialects, slang, noisy environments,
with crosstalk, etc., is a huge challenge.
 Dialogue management and conversation design are non-trivial parts
of conversational AI.
◦ Annotating the intelligence gathered from real agent conversations and building
the right model-training data requires ongoing human-in-the-loop expertise.
 Building a conversational AI-based application that takes into
consideration intent, entity extraction, sentiment analysis, and
empathy is challenging and very few vendors offer solutions with
these features.
 Keeping automated conversations relevant can also be a real
challenge, with customer needs and preferences changing faster than
ever before. 
