ARTIFICIAL INTELLIGENCE - ALL YOU NEED TO KNOW:
FACTS
Vaibhav Patil- vaibhav0222@gmail.com
INTRODUCTION
The human brain is estimated to have a memory capacity of about 10 lakh (one million) GB.
John McCarthy is regarded as the father of artificial intelligence.
Machines can learn in a way similar to humans.
Artificial intelligence definition
Artificial intelligence means making computers or machines do or perform things that humans currently do better.
Once a machine has been trained for a task, it can do that work with a level of precision and consistency that humans cannot match. That is why artificial intelligence is important: it gets more work done in less time and with fewer resources.
Machines can learn, demonstrate, explain and advise the user, and they can also think, behave and adjust automatically.
Artificial intelligence can solve generic problems.
New modifications can be added or adopted.
Quick and easy modification is allowed and accepted.
Most of what you see in artificial intelligence is intangible.
How it works-
In artificial intelligence, the computer or machine uses its own computing power to learn.
When recognising an image, the computer examines it pixel by pixel and keeps that information in mind for later image processing.
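To make the pixel-by-pixel idea concrete, here is a tiny sketch (with placeholder file names) that compares two images by the average difference of their pixel values. Real image recognition learns higher-level features rather than comparing raw pixels, so treat this only as the most literal version of the idea.

```python
# A minimal sketch of comparing two images pixel by pixel.
# The file names below are placeholders for illustration only.
from PIL import Image
import numpy as np

def pixel_difference(path_a: str, path_b: str) -> float:
    """Return the mean absolute per-pixel difference between two images."""
    a = np.asarray(Image.open(path_a).convert("L").resize((64, 64)), dtype=float)
    b = np.asarray(Image.open(path_b).convert("L").resize((64, 64)), dtype=float)
    return float(np.abs(a - b).mean())

if __name__ == "__main__":
    score = pixel_difference("known_face.jpg", "new_photo.jpg")
    # A small score means the new photo looks similar to the stored one.
    print("mean pixel difference:", score)
```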
Pattern recognition, image recognition and language recognition all play an important role in artificial intelligence.
Artificial intelligence also plays an important role in startups.
The objectives of artificial intelligence are to create expert systems and to put human intelligence into machines.
What makes human beings different from machines is that humans can ask questions, use logic, search for answers, explore possibilities and find shortcuts; that is what makes them more intelligent than machines. In artificial intelligence we want to implement all these human qualities in machines.
The following fields play an important role in artificial intelligence:
Robotics
Machine learning
Computer vision
Natural language processing
Computer science
Linguistics
Neuroscience
Psychology
Engineering
Mathematics
Products of artificial intelligence
Ambi Climate
This is a product in which the machine automatically senses temperature, humidity and sunlight and adjusts the room temperature accordingly, based on your requirements and your daily patterns.
Face detection
With artificial intelligence, a machine can recognise faces and facial expressions and use them to drive further decisions.
China is using face-detection AI for surveillance and for catching traffic-rule violations, making effective governance easier.
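As a small, hedged illustration of basic face detection, the sketch below uses OpenCV's bundled Haar-cascade detector; the image file name is a placeholder, and real surveillance systems rely on much more advanced deep-learning models.

```python
# A minimal face-detection sketch using OpenCV's pre-trained Haar cascade.
# "street_photo.jpg" is a placeholder file name.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("street_photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Returns a list of (x, y, width, height) boxes, one per detected face.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("street_photo_faces.jpg", image)
```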
Self-driving cars, aided by GPS, are one more example of artificial intelligence.
Bonjour
This is a product that takes care of your morning routine: waking you up with an alarm, preparing tea, reading out the news, reminding you of appointments, and so on.
There are many products that help with day-to-day work and conversation, such as Amazon Echo and Alexa.
Examples
Google Assistant
Siri
Voice assistants
FaceApp, a face-recognition app
Socratic, which finds answers in a matter of seconds
Replika, an AI companion chatbot
Google Duplex, which helps with sales and marketing work such as booking appointments over the phone
Netflix also uses artificial intelligence to suggest videos to its customers.
The BBC is using artificial intelligence to write articles: journalists don't write everything; they just give the software the key points, and the software automatically prepares the article.
On Windows we have Cortana.
On Apple devices we have Siri.
On smartphones we have speech recognition.
We have intelligent toys and robots.
Tesla's self-driving cars use artificial intelligence.
Amazon Echo also uses artificial intelligence.
Google Assistant can book appointments with a doctor or a salon.
Facebook suggests friends to you using artificial intelligence.
If you search for a product on Amazon, you keep getting advertisements related to that product on other websites.
Sophia, the humanoid robot manufactured by Hanson Robotics in Hong Kong, is another example of artificial intelligence.
In artificial intelligence, machines keep learning, understanding and improving themselves, as the Sophia robot does. She can answer according to your question and the intention behind it, and she refines her answers day by day.
Because of artificial intelligence, a machine can distinguish between a police vehicle and other vehicles.
Different types of fish can be recognised from their images.
In one milestone, IBM's Deep Blue computer defeated the world chess champion.
As the chess example shows, calculation alone is not intelligence.
Artificial intelligence could help build a satellite station on its own in space.
Artificial intelligence can be used to study DNA and cure diseases.
Google Translate can translate Chinese signboards into English when you point the camera at them.
Image processing and image recognition are further examples. Speech recognition is one more example.
Artificial intelligence is helpful in the following fields:
Banking and finance
Retail and e-commerce
Security and cyber security
Data analysis and customer segmentation
HR management
Healthcare management
Supply chain and logistics
Machine learning vs Artificial
intelligence
How is machine learning different from artificial intelligence?
If you teach a machine the pattern of dog faces, it can identify dogs, but when you show it a cat it gets confused and does not work; that is machine learning.
If you teach a machine to play chess, it can play only chess; if you show it football or hockey, it will not work. Again, that is machine learning.
With artificial intelligence, in the two examples above, the machine will try to identify the cat, keep updating its experience after similar encounters, and finally learn to identify cats. In the same way, the machine will also learn to play chess, football and hockey on its own after some time.
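Here is a minimal sketch of that difference, using made-up feature vectors: a model first trained only on dogs, which then keeps updating its experience as labelled cat examples arrive. The features and numbers are invented purely for illustration.

```python
# A minimal sketch, with made-up feature vectors, of a model that keeps
# updating its experience as new labelled examples arrive (partial_fit),
# in contrast to a model trained once on dogs only.
import numpy as np
from sklearn.linear_model import SGDClassifier

# Pretend features: [ear_length, snout_length] for dogs (label 0).
dog_features = np.array([[0.7, 0.9], [0.8, 0.8], [0.6, 1.0]])
dog_labels = np.array([0, 0, 0])

model = SGDClassifier(random_state=0)
# Declare both classes up front so the model can learn cats later.
model.partial_fit(dog_features, dog_labels, classes=np.array([0, 1]))

# A cat arrives (short snout, pointy ears); at first the model may get it wrong.
cat = np.array([[0.9, 0.3]])
print("before updating:", model.predict(cat))

# After a few labelled encounters with cats, the model updates its experience.
cat_features = np.array([[0.9, 0.3], [1.0, 0.2], [0.85, 0.25]])
cat_labels = np.array([1, 1, 1])
for _ in range(10):
    model.partial_fit(cat_features, cat_labels)
print("after updating:", model.predict(cat))
```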
We give the machine access to data and tell it to learn and process on its own.
In machine learning, the machine performs well in just one field; it cannot generalise and perform in other fields.
For example, a machine built for washing clothes cannot perform the task of cooking.
In artificial intelligence, future results depend on past results.
The machine learns from patterns of use and performance.
Search engine optimisation is also an example of artificial intelligence, in which demographic characteristics are studied and relevant results are suggested to the user or customer.
With artificial intelligence we are building machines that will work like human beings.
Artificial intelligence can create jobs in the following fields:
Computer analytics, network analytics, cloud engineering and database administration.
Danger
There is a possibility that robots may rebel against human beings, fight them and destroy the human race.
Market size
A study by Accenture in December said AI could add $957 billion to the Indian economy or increase the country’s income by 15% by 2035 by changing the nature of work to create better outcomes.
Example 1
Kroger
Kroger plans to leverage its data, shopper
insights and scale to help it remain a leader in the marketplace of the future.
The grocer already delivers 3 billion personalised recommendations each year, but it will enhance its personalisation efforts to “create different experiences for customers”.
Kroger is testing the delivery of the future: grocery delivery by an autonomous vehicle.
A partnership between Kroger and British online-only grocer Ocado is expected to help Kroger automate its warehouses.
Kroger’s in-house analytics firm 84.51
deployed Kroger Precision Marketing that uses customer purchase data from
Kroger’s 60 million shopper households to launch marketing campaigns across a
digital spectrum.
Smart shelves
When a Kroger customer walks down the
aisle with the Kroger app open, sensors identify the shopper and provide
personal pricing and highlight products the customer might be interested in via
smart shelves technology.
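A toy sketch of how such personal pricing could work is shown below; the shopper profiles, products and prices are all invented and do not reflect Kroger's actual systems.

```python
# A toy sketch of smart-shelf personalisation; all data is hypothetical
# and does not reflect Kroger's actual systems.
from dataclasses import dataclass

@dataclass
class Shopper:
    shopper_id: str
    favourite_categories: set
    personal_discounts: dict  # product -> discount fraction

SHELF_PRODUCTS = {"granola": ("cereal", 4.99), "cola": ("drinks", 1.99),
                  "oat milk": ("dairy", 3.49)}

def shelf_display(shopper: Shopper):
    """Build the personalised display shown when this shopper walks by."""
    lines = []
    for product, (category, price) in SHELF_PRODUCTS.items():
        discount = shopper.personal_discounts.get(product, 0.0)
        personal_price = round(price * (1 - discount), 2)
        highlight = category in shopper.favourite_categories or discount > 0
        lines.append((product, personal_price, highlight))
    return lines

if __name__ == "__main__":
    alice = Shopper("alice", {"cereal"}, {"oat milk": 0.10})
    for product, price, highlight in shelf_display(alice):
        marker = "*" if highlight else " "
        print(f"{marker} {product}: ${price}")
```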
TESLA
The
Amazing Ways Tesla Is Using Artificial Intelligence And Big Data
Tesla CEO Elon Musk has publicly announced that the company is working on its own AI hardware.
This hardware will process the “thinking” algorithms for the company’s Autopilot software, which currently gives Tesla vehicles limited (“level 2”) autonomous driving capability. Musk has said that he believes his cars will be fully autonomous (level 5) by 2019.
But
as a business decision, it is hoping its pushy tactics will pay off, with
experts concluding that the company has trumped its rivals in the
data-gathering department. All the vehicles Tesla has ever sold were built
with the potential to one day become self-driving, although this fact was not
made public until 2014 when a free upgrade was rolled out. This means the
company has had a lot more sensors out on the roads gathering data than most of
its Detroit or Silicon Valley rivals, many of which are still at the concept
stage. Having just launched its first mass-market car, the Model 3 with a price
tag of $35,000, the company is expecting the number of its vehicles on the road
to increase by almost two thirds to around 650,000 in 2018 – and that’s a lot
of extra sensors.
In
fact, all Tesla vehicles – whether or not they are Autopilot enabled – send
data directly to the cloud. Thanks to this, a problem with engine operation that caused components to overheat occasionally was diagnosed in 2014 by monitoring the data, and every affected vehicle was automatically “repaired” with a software patch.
Tesla
effectively crowdsources its data from all of its vehicles as well as their
drivers, with internal as well as external sensors which can pick up
information about a driver’s hand placement on the instruments and how they are
operating them. As well as helping Tesla to refine its systems, this data holds
tremendous value in its own right. Researchers at McKinsey and Co estimate that
the market for vehicle-gathered data will be worth $750 billion a year by 2030.
The
data is used to generate highly data-dense maps showing everything from the
average increase in traffic speed over a stretch of road, to the location of
hazards which cause drivers to take action. Machine learning in the cloud takes
care of educating the entire fleet, while at an individual car level, edge computing
decides what action the car needs to take right now. A third level of
decision-making also exists, with cars able to form networks with other Tesla
vehicles nearby in order to share local information and insights. In a near
future scenario where autonomous cars are widespread, these networks will most
likely also interface with cars from other manufacturers as well as other
systems such as traffic cameras, road-based sensors or mobile phones.
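As a rough illustration of how events reported by many cars could be turned into such a data-dense map, here is a hedged sketch that aggregates invented hazard reports into counts per road-grid cell; Tesla's real pipeline is of course far more elaborate.

```python
# A hedged sketch of building a simple hazard "heat map" from events
# reported by many vehicles; coordinates and events are invented.
from collections import Counter

# Each event: (latitude, longitude) where a driver braked hard or took evasive action.
hazard_events = [
    (37.7751, -122.4194), (37.7752, -122.4193), (37.7899, -122.4001),
    (37.7750, -122.4195), (37.7901, -122.4000),
]

CELL = 0.001  # grid resolution in degrees (roughly 100 m)

def cell_of(lat: float, lon: float) -> tuple:
    """Snap a coordinate to its grid cell."""
    return (round(lat / CELL), round(lon / CELL))

hazard_map = Counter(cell_of(lat, lon) for lat, lon in hazard_events)

# Cells with many reports across the fleet are likely hazard locations.
for cell, count in hazard_map.most_common():
    print(cell, "->", count, "reports")
```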
Nvidia
state that “In contrast to the usual approach to operating self-driving cars,
we did not programme any explicit object
detection, mapping, path planning or control components into this car. Instead,
the car learns on its own to create all necessary internal representations
necessary to steer, simply by observing human drivers.”
Whatever
new tech it develops may veer away from this by stepping back into the more
tested waters of supervised learning, where algorithms are trained beforehand
about right or wrong decisions. However, it is possible that the theoretically
greater gains achievable by truly unsupervised learning may keep them on this
track.
Tesla
has clearly always been a company which has put data collection and analysis at
the heart of everything it does. It isn’t just design and manufacturing either,
with the company processing customer data with AI and even parsing its online
forum for text insights into common problems.
John Deere
The Incredible Ways John Deere Is Using Artificial Intelligence To Transform Farming
Pesticides are currently an essential ingredient of big agriculture, needed to ensure we can continue to feed the ever-growing global population of our planet.
Computer
vision specialist Blue River Technology has developed a solution for exactly
that, using advanced machine learning algorithms to enable robots to make
decisions, based on visual data (just as we would do ourselves) about whether
or not a plant is a pest, and then deliver an accurate, measured blast of
chemical pesticides to tackle the unwanted pests. Given that traditionally such
decisions are made on a field-by-field basis, rather than plant-by-plant basis,
the opportunities for efficiency are clear.
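A simplified sketch of that plant-by-plant logic follows; the classify_plant function is a hypothetical stand-in for Blue River's computer-vision model, which is not public, so only the spray-or-skip decision is shown.

```python
# A simplified, hypothetical sketch of plant-by-plant spraying logic.
# classify_plant() stands in for a real computer-vision model.
import random

def classify_plant(image) -> float:
    """Return the (pretend) probability that the plant in the image is a weed."""
    return random.random()  # placeholder for a trained vision model

def spray_decision(image, threshold: float = 0.8) -> bool:
    """Spray only when the model is confident the plant is a weed."""
    return classify_plant(image) >= threshold

if __name__ == "__main__":
    # Pretend camera frames coming from the sprayer as it moves down a row.
    frames = [f"frame_{i}" for i in range(10)]
    sprayed = sum(spray_decision(f) for f in frames)
    print(f"Sprayed {sprayed} of {len(frames)} plants instead of the whole field")
```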
Farm
equipment and services giant John Deere saw the potential of this development, acquired the start-up late last year, and added it to the catalogue of high-tech, data-powered services it already offers its customers.
It
is just the latest move in John Deere’s push to put data-driven analytical
tools and automation in the hands of farmers. With the rate of global population
growth, the company – established in 1837 as a tool manufacturer – understands
that it serves an industry where small efficiencies quickly add up to big
competitive advantages.
Already
the firm enables automated farm vehicles to plough and sow, under the control
of pinpoint-accurate GPS systems. On top of that its Farmsight system is
designed to enable data-driven insights to inform agricultural decision making,
based on shared user data from subscribers all around the world.
Luckily
infrastructure for gathering data which can be used to predict the effects of
these influences is increasingly available. Satellite imagery – previously
often prohibitively expensive – is more affordable than ever with one person I
spoke to recently comparing the cost of launching a satellite to launching an
app. Visual data is also available from unmanned aerial vehicles such as
quadcopter drones, which can be used to monitor the growth and spread of pests through crops in real time.
One
company specialising in analysis of satellite imagery last year came within 1%
of accurately predicting corn and soya yields by applying machine learning
algorithms to their data. It has already released its predictions for this
year’s season, which it claims will be more accurate.
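To show the shape of such a yield-prediction model, here is a rough sketch with invented numbers: a regression fitted on satellite-derived features (a greenness index and rainfall) against historical yields, then used to predict the coming season. The company's real features and models are not public.

```python
# A rough sketch of predicting crop yield from satellite-derived features.
# All numbers are invented; real systems use far richer data and models.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Features per region-season: [mean greenness index, growing-season rainfall (mm)]
X_history = np.array([
    [0.62, 480], [0.55, 410], [0.70, 520], [0.58, 450],
    [0.66, 500], [0.51, 380], [0.73, 540], [0.60, 470],
])
# Corresponding historical yields (tonnes per hectare).
y_history = np.array([9.8, 8.6, 11.0, 9.1, 10.4, 8.0, 11.5, 9.5])

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_history, y_history)

# Predict this season's yield from the latest satellite observations.
this_season = np.array([[0.64, 495]])
print("predicted yield (t/ha):", round(model.predict(this_season)[0], 2))
```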
The
large-scale mechanisation of agriculture means that accurate data is available
from the machines which spread seeds and harvest crops. Robots, such as those developed by Google-funded Abundant Robotics, which suck ripe fruit from branches with vacuums, naturally record everything they do and every parameter of their
operation. This structured machine data meshes well with unstructured data from
meteorological or satellite imagery, and when filtered through AI algorithms
will provide insights that more accurately predict yields and losses.
Google
Google
services such as its image search and translation tools use sophisticated
machine learning, which allows computers to see, listen and speak in much the same way as humans do.
Machine
learning is the term for the current cutting-edge applications in artificial
intelligence. Basically, the idea is that by teaching machines to “learn” by
processing huge amounts of data they will become increasingly better at
carrying out tasks that traditionally can only be completed by human brains.
The
Amazing Ways Google Uses Artificial Intelligence And Satellite Data To Prevent
Illegal Fishing
These
techniques include “computer vision” – training computers to recognise images
in a similar way we do. For example, an object with four legs and a tail has a
high probability of being an animal. And if it has prominent whiskers too, it’s
more likely to be a cat than a horse. When fed thousands or millions of images, it will become increasingly good at deciding what an image represents.
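The "four legs, tail, whiskers" reasoning can be made concrete with a toy decision tree trained on hand-made feature vectors, as in the sketch below; real computer vision learns such features from raw pixels with deep networks, and all the data here is invented.

```python
# A toy sketch of the "four legs + tail + whiskers" reasoning using a
# decision tree on hand-made features. Real systems learn features from pixels.
from sklearn.tree import DecisionTreeClassifier

# Features: [has_four_legs, has_tail, has_prominent_whiskers, height_in_metres]
X = [
    [1, 1, 1, 0.25],  # cat
    [1, 1, 1, 0.30],  # cat
    [1, 1, 0, 1.60],  # horse
    [1, 1, 0, 1.70],  # horse
    [0, 0, 0, 1.75],  # human (neither)
]
y = ["cat", "cat", "horse", "horse", "other"]

model = DecisionTreeClassifier(random_state=0).fit(X, y)

# A new observation: four legs, a tail, whiskers, small -> most likely a cat.
print(model.predict([[1, 1, 1, 0.28]])[0])
```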
Another
is “natural language processing”. This is used in Google’s online real-time
language translation service to understand nuances of human speech in any
language, allowing more accurate translation between human languages.
Google
also uses machine learning in its Nest “smart” thermostat products – by
analysing how the devices are used in households they become better at
predicting when and how their owners want their homes to be heated, helping to
cut down on wasted energy.
However,
besides these everyday uses Google has developed many more specialised
applications of the technology, which today are in use helping to solve a
variety of environmental problems around the world.
Google’s
sustainability lead, Kate E Brandt spoke to me about some of these ambitious
use cases where artificial intelligence is being deployed today.
She
said “We’re seeing some really interesting things happen when we bring together
the potential of cloud computing, geo-mapping and machine learning.”
One
great example is an initiative which is already helping to protect vulnerable
marine life in some of the world’s most delicate eco-systems. Using the
publicly broadcast Automatic Identification System for shipping, machine
learning algorithms have been shown to be able to accurately identify illegal
fishing activity in protected areas.
This
works in much the same way as the “cat or horse?” example for image recognition
I gave above. By plotting a ship’s course and comparing it to patterns of
movement where the ship’s purpose is known, computers are able to “recognise”
what a ship is doing.
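A hedged sketch of that idea, with invented AIS-style features: summarise each vessel's track by its average speed and how much it changes course, train a classifier on tracks whose purpose is known, and use it to label an unknown track. Global Fishing Watch's real features and models are far more sophisticated.

```python
# A hedged sketch of classifying vessel behaviour from AIS-style track
# summaries. Features and labels are invented.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Features per track: [average speed (knots), average course change per hour (degrees)]
X_known = np.array([
    [18.0, 2.0], [20.0, 1.5],    # container ships: fast, straight lines
    [4.0, 45.0], [3.5, 60.0],    # fishing vessels: slow, lots of turning
    [25.0, 5.0], [22.0, 4.0],    # ferries: fast, mostly fixed routes
])
y_known = ["cargo", "cargo", "fishing", "fishing", "ferry", "ferry"]

model = KNeighborsClassifier(n_neighbors=3).fit(X_known, y_known)

# An unknown vessel that is moving slowly and turning constantly inside a
# protected area would be flagged as likely fishing activity.
unknown_track = np.array([[3.8, 55.0]])
print("predicted activity:", model.predict(unknown_track)[0])
```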
Brandt
told me “All 200,000 or so vessels which are on the sea at any one time are
pinging out this public notice saying ‘this is where I am, and this is what I
am doing’.”
This
results in the broadcasting of around 22 million data points every day, and
Google engineers found that by applying machine learning to this data they were
able to identify the reason any vessel is at sea – whether it is a transport
ferry, container ship, leisure vessel or fishing boat.
“With
that dataset, and working with a couple of wonderful NGOs – Oceana and Sky
Truth – we were able to create Global Fishing Watch – a real-time heat map that
shows where fishing is happening,” says Brandt.
The
initiative has already led to positive outcomes in the fight against illegal
fishing in protected marine environments. For example, the system identified
suspicious activity in waters under the jurisdiction of the Pacific island
nation of Kiribati – which include the world’s largest UNESCO heritage marine
site. When intercepted by Kiribati government vessels, the captain of the
fishing vessel denied any wrongdoing. But after being presented with evidence
gathered by Google’s machine learning algorithms, he realised he had been
caught red-handed and admitted the violation of international law.
“What’s
really exciting is that this creates tremendous opportunities for governments
and citizens to protect our marine resources. Fishing in those marine reserves
is illegal and Global Fishing Watch has been used to protect those reserves.”
Machine
learning-driven image recognition is also used for a very different purpose, on
land this time, and across the United States as well as Germany.
Project
Sunroof, launched in 2015, involves training Google’s systems to examine
satellite data and identify how many homes in a given area have solar panels
mounted on their roofs. As well as that, it can also identify areas where the
opportunity to collect solar energy is being missed, as no panels are
installed.
“This
started with one of our engineers living in Cambridge, Massachusetts, who
wanted to put solar panels on his roof but was finding it hard to figure out if
he was living in a good location – did he have enough sunlight to work with?”
Brandt tells me.
This
resulted in the development of a machine learning system which took Google
Earth satellite images, and combined it with meteorological data, to give an
instant assessment of whether a particular location would be a good candidate
for solar panels, and how much energy – as well as money – a householder might
save.
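A back-of-the-envelope sketch of that kind of assessment is shown below; every number is an invented assumption, while Project Sunroof's real models combine detailed 3D roof geometry, shading and weather data.

```python
# A back-of-the-envelope sketch of a rooftop solar assessment.
# Every number below is an invented assumption, not Project Sunroof's model.
def estimate_solar_savings(roof_area_m2: float,
                           sun_hours_per_day: float,
                           panel_efficiency: float = 0.20,
                           electricity_price_per_kwh: float = 0.15) -> dict:
    """Estimate yearly solar generation and bill savings for a roof."""
    # Solar irradiance at the panel is roughly 1 kW per square metre in full sun.
    peak_kw = roof_area_m2 * 1.0 * panel_efficiency
    yearly_kwh = peak_kw * sun_hours_per_day * 365
    return {
        "yearly_kwh": round(yearly_kwh),
        "yearly_savings": round(yearly_kwh * electricity_price_per_kwh, 2),
    }

if __name__ == "__main__":
    # A 30 m^2 usable roof with 4.5 equivalent full-sun hours per day.
    print(estimate_solar_savings(roof_area_m2=30, sun_hours_per_day=4.5))
```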
“Then
we realised this was not only really useful for individual home owners, but it
could be very useful for communities – at county, city or state level – to
assess their potential.”
Google’s
image recognition algorithms were trained to recognise how to spot solar arrays
in satellite images. This system was quickly put to use by the city of San Jose
in California as part of an initiative to identify locations where 1 gigawatt
of solar energy could be generated from new panels.
Both
of these initiatives are great examples of how machine learning – powered by
publicly available datasets – is enabling new solutions to problems of the
modern age. As more data becomes available, and computers become increasingly
powerful, who knows what other challenges artificial intelligence will help us
to overcome?
McDonald's
Personalised and improved customer experience
Not
only can customers order and pay through the McDonald’s mobile app and get
access to exclusive deals, but when customers use the app, McDonald’s gets
vital customer intelligence about where and when they go to the restaurant, how
often, if they use the drive thru or go into the restaurant, and what they
purchase. The company can recommend complementary products and promote deals to
help increase sales when customers use the app. In Japan, customers who use the
app spend an average of 35% more thanks in part to the recommendations they are
provided at the time they place an order. Favourite orders are then saved by
the app and offer a way to encourage repeat visits. App users can avoid the
lines at the drive thru and at the counters, reason enough for many to share
their buying data in exchange for convenience and perceived perks.
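As a toy illustration of the complementary-product idea, the sketch below counts which items are bought together in invented order histories and suggests the most frequent co-purchases for whatever is in the basket; it is not McDonald's actual recommendation engine.

```python
# A toy co-purchase recommender; order histories are invented and this is
# not McDonald's actual recommendation system.
from collections import Counter
from itertools import combinations

past_orders = [
    {"burger", "fries", "cola"},
    {"burger", "fries"},
    {"nuggets", "cola"},
    {"burger", "cola"},
    {"fries", "milkshake"},
]

# Count how often each pair of items appears in the same order.
pair_counts = Counter()
for order in past_orders:
    for a, b in combinations(sorted(order), 2):
        pair_counts[(a, b)] += 1

def recommend(basket: set, top_n: int = 2):
    """Suggest items most often bought together with the current basket."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a in basket and b not in basket:
            scores[b] += count
        elif b in basket and a not in basket:
            scores[a] += count
    return [item for item, _ in scores.most_common(top_n)]

print(recommend({"burger"}))  # e.g. suggests fries and cola
```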
Digital menus that use data
McDonald’s
continues to roll out new digital menus. These aren’t just fancier versions of
the old menus, these menus can change based on the real-time analysis of data.
The digital menus will change out the options based on time of day and even the
current weather. For example, on a cold, blustery day, the menu might promote
comfort foods while refreshing beverages might be highlighted on a record heat
day. They’ve been used in Canada and resulted in a 3% to 3.5% increase in
sales.
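A minimal rule-based sketch of that kind of menu switching is given below; the thresholds and items are invented, and the real system presumably relies on learned models rather than a couple of hard-coded rules.

```python
# A minimal rule-based sketch of a weather- and time-aware digital menu.
# Thresholds and menu items are invented.
from datetime import datetime

def pick_promotions(temperature_c: float, now: datetime) -> list:
    """Choose which items to highlight on the digital menu board."""
    promos = []
    if now.hour < 11:
        promos.append("breakfast McMuffin")
    if temperature_c >= 28:
        promos.append("iced drinks and McFlurry")   # record-heat day
    elif temperature_c <= 5:
        promos.append("hot coffee and soup")        # cold, blustery day
    else:
        promos.append("combo meal of the day")
    return promos

print(pick_promotions(temperature_c=31, now=datetime(2018, 7, 15, 14, 0)))
print(pick_promotions(temperature_c=2, now=datetime(2018, 1, 10, 9, 30)))
```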
Trends analytics
Embracing
a data-driven culture is also important to help McDonald’s better understand
performance at each individual restaurant as well as uncover best practices
that can be shared with other restaurants in the chain. Since McDonald’s uses a
franchise business model, consistency of food and experience is important
across the franchise. It’s important from the customer’s perspective to
experience the same food and offerings from one restaurant to another no matter
where they are located or who it’s owned by. The company looks at multiple data
points in the customer experience. For example, when they look at the
drive-thru experience they not only assess the design of the drive-thru, but
they review the information provided to the customer and what’s happening for
customers waiting in line to order. They analyse the patterns in an effort to
make predictions and alter design, information and people practices if
necessary.
Kiosks and interactive terminals
As
one solution to the increasing costs of labour, McDonald’s is replacing
cashiers in some locations with kiosks where customers can place their order on
a digital screen. Not only are labour costs reduced, but the error rates go down.
By the end of 2018, you can expect an ordering kiosk to be available at a
McDonald’s near you. McDonald’s France is also testing out interactive
terminals. Once a customer places an order they take a connected RFID card
associated with the order to their table. When the order is ready, a McDonald’s
staff person locates the customers through the RFID card and then delivers
their meal to them.