The ‘AI and Machine Learning Conference’, organized by UNICOM and IIM-C, was held recently at the IIM-Bangalore campus on the 13th and 14th of March. A team from Capgemini, including Sunil Kumar and myself, was invited to deliver a session on “Humanized Machines”, followed by a plenary session on “Humanized Machines: Use Cases”.
It was two power-packed days of learning, discussion, and discovery that looked at trends in AI, Machine Learning, and Sentiment Analysis, especially as they apply to financial and consumer markets. The event brought together subject matter experts from industry and academia from all over the world. Speakers included the CEO of MarketPsych, a Senior Director at SapientRazorfish, a Director from Amadeus, founders of startups such as Healthcare India, iShippo.com, and Sentifi, data scientists from various organizations, and HODs, professors, and doctoral students from IIM-C, IIM-B, and the Great Lakes Institute of Management, among many others.
Our session on “Humanized Machines” got off to a great start, as we grabbed everyone’s attention the moment we asked Alexa to open the presentation and then started navigating it using hand gestures. This immediately set the context for our talk! A very important part of our vision is the ability to communicate with machines using the most natural forms of communication, such as voice, vision, text, and gesture. This is what we call ‘Natural Digital Experience’, and we conveyed it upfront through the very way we presented.
The presentation itself covered the following areas:
- Our view on Natural Digital Experience (NDE), Digital Happiness and Swarm Intelligence to Assist Humans
- Overview of Machine Learning, Deep Learning and Reinforcement Learning
- NDE POC (Cognitive Remote Device Management) architecture and business challenges
- Few NDE frameworks and tools
- Swarm Intelligence industry use cases
Following this, we had the plenary session, which was particularly interesting since it covered actual use cases with live demos. We presented:
- A brief 2-minute overview of NDE (for audience members who had not attended the earlier session)
- A demo of our Cognitive Remote Device Management Solution, which manages Edge devices remotely using voice and gestures. To demonstrate, we asked Alexa to take a picture, which was captured via the webcam connected to our laptop. Following our directions, the picture was also automatically uploaded to Twitter with the hashtag #ThinkEdgeML – https://twitter.com/punjabimihir/status/973489915622076418?s=19
- A demo of our Coffee Machine Solution, in which we asked Alexa for the status of the coffee machine
- A conversational commerce demo for banking – through the demo, we presented a vision of how users will interact with banks in the future and how banks can provide insights to users
- A Swarm Intelligence demo – we presented a POC to share our vision of how Swarm Intelligence and Machine-to-Machine (M2M) collaboration can assist humans and provide Digital Happiness
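For readers curious about the shape of the voice-to-device flow in the Remote Device Management demo, here is a minimal, hypothetical sketch of such an intent-routing layer. All names here (`CameraDevice`, `TwitterStub`, the `"TakePicture"` intent) are illustrative assumptions, not the actual Capgemini solution; a real deployment would use the Alexa Skills Kit for intent recognition, a camera library for capture, and the Twitter API for posting.

```python
class CameraDevice:
    """Stand-in for an edge device (e.g. a laptop webcam).
    A real implementation would grab a frame, e.g. via OpenCV."""
    def capture(self):
        return b"<jpeg-bytes>"  # placeholder image payload

class TwitterStub:
    """Stand-in for a Twitter API client; records posts locally."""
    def __init__(self):
        self.posts = []
    def post_image(self, image, hashtag):
        self.posts.append((image, hashtag))
        return f"posted with {hashtag}"

def handle_intent(intent, camera, twitter):
    """Route a recognized voice intent to a device action."""
    if intent == "TakePicture":
        image = camera.capture()
        return twitter.post_image(image, "#ThinkEdgeML")
    if intent == "CoffeeStatus":
        # A real solution would query the machine's telemetry here.
        return "Coffee machine: ready"
    return "Sorry, I don't know that command."

# Example: the voice assistant resolves "take a picture" to an intent name.
twitter = TwitterStub()
print(handle_intent("TakePicture", CameraDevice(), twitter))
```

The point of the sketch is the separation of concerns: the assistant only resolves speech to an intent name, while the edge device and downstream services stay behind simple interfaces that can be swapped for real hardware and APIs.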
Both sessions received an overwhelming response and were very interactive, with queries and discussions continuing through the lunch and tea breaks. We also received some encouraging feedback from the organizers, with comments such as “the sessions were the most interactive so far” and “it was good to not see slides but actual use cases and demos”.
In the end, our core vision – that “AI should be used constructively to assist humans and provide them Digital Happiness” – was well received by the audience and got many attendees thinking about practical use cases based on the latest, readily available technologies.
So that’s our quick report on our participation in the conference. I plan to go into greater detail on our vision for AI and Machine Learning, as well as on some of the use cases shown, in a more detailed blog post that I will be putting together soon. Stay tuned!