Last month, at an event in Bangalore, a select audience was given a glimpse of what the future of artificial intelligence (AI) might look like. In the room were companies from across the AI ecosystem, presenting what they were building and learning what others in the space were working on. There were product demonstrations, ecosystem presentations and workshops, all rolled up into a grand vision of what it would take to make India the AI-use-case capital of the world. I came away from the event with a number of new insights.
For instance, I realized that compute does not have to be centralized in a massive data centre but could instead be distributed across a number of micro data centres accessible over an interoperable protocol. Or that we do not need to wait for a new AI law to come into force if every Indian AI company pledges to abide by a code of conduct that guides its innovation in this space. But of all the ideas presented that evening, the one that really gave me pause was how AI could radically expand the reach of our digital public infrastructure (DPI).
An essential feature of the DPI playbook is the unbundling of traditional processes so that they can be rebuilt as DPI solutions. This is very often the only way to transform existing services so that they can reach population scale. Standardization, therefore, lies at the heart of effective DPI design.
It ensures that the widest cross-section of society can access these services, and that, regardless of which entity provides the solution, the underlying service remains consistent and reliable. DPI solutions built on standardized protocols can support cross-sectoral applications. The trouble