Indian Institute of Science to Set Up New AI/ML Centre – OpenGov Asia

The Indian Institute of Science (IISc) in Bangalore, in collaboration with a private player, announced it would establish a state-of-the-art artificial intelligence and machine learning (AI/ML) centre at the IISc campus. Spread across approximately 140,000 square feet, the centre will offer Bachelor’s, Master’s, and short-term courses in areas such as AI/ML, deep learning, fintech, reinforcement learning, image processing, and computer vision.
The centre will also promote research and innovation in AI/ML and develop the talent pool from across the country to provide cutting-edge solutions to meet the industry’s emerging and future requirements. According to a statement, as IISc continues to deliver on its mandate to provide advanced scientific and technological research and education, its partnerships with forward-thinking institutions will help it scale up substantially and position India as a deep tech innovation hub.
As per a recent report, the global AI market size is expected to gain momentum and reach US$360.36 billion by 2028, exhibiting a CAGR of 33.6% between 2021 and 2028. AI has become immensely popular, and industries across the globe are rapidly incorporating it into their processes to improve business operations and customer experience. The Indian government is developing and implementing several AI-driven initiatives in education, healthcare, agriculture, and finance. Educational institutes and government agencies are launching centres and offering courses in emerging technologies to help build a skilled workforce.
For instance, earlier this year, India’s Ministry of Finance entered a strategic partnership with a tech giant to build a Centre of Excellence in AI and emerging technologies at the Arun Jaitley National Institute of Financial Management (AJNIFM). The centre will serve as a central body for research, AI scenario envisioning, and technology-led innovation. The two sides would jointly explore use cases of emerging technologies in finance and related areas, across central and state ministries and public sector enterprises. Also, public sector officials would be trained on the application of emerging technologies in finance management to address potential risks like money laundering, the use of machine learning models for decision making, and the role of responsible tech in finance, among others. As OpenGov Asia had reported, the collaboration would explore the role of cloud, AI, and emerging technologies in transforming and shaping the future of public finance management in India.
More recently, the Indian Institute of Technology in Madras (IIT-Madras), in association with a semiconductor manufacturing company, is offering a free workshop on AI and high-performance computing technology in semiconductor manufacturing. The workshop is being offered under the “National Supercomputing Mission Industry Talks” series. The workshop is free, but interested participants must register before attending it online.
According to a news report, the workshop will be conducted from 27 September to 1 October, every day for one hour, and the e-meeting details will be sent only to registered participants. The workshop is open to all interested participants, and industry experts will cover a range of topics over the five days.
Federal agencies looking for cloud solutions may soon be able to turn to the General Services Administration’s (GSA) one-stop-shop cloud marketplace. The marketplace would feature both post-award contract management tools and professional IT services, along with a “foundational set of requirements” to ensure cloud solutions comply with a baseline set of security requirements and the Federal Risk and Authorization Management Program guidance.
“We are looking at how we put together a cloud marketplace that then becomes a buying platform for agencies. We want to put together not just a framework, but a market contractual vehicle that will allow our agencies to buy these core cloud services that we are seeing them need more and more.”
Assistant Commissioner for the Office of Information Technology Category in GSA’s Federal Acquisition Service (FAS)
GSA has released a steady stream of guidelines around buying cloud services and solutions in recent years, and it set up a cloud information centre to equip agencies with crowdsourced, strategic acquisition resources. The agency eventually realised a vehicle was needed to effectively serve government agencies and industry stakeholders; the official’s office oversees more than 7,000 contracts and nearly US$30 billion in annual government spending.
Some agencies have to go to multiple places to buy cloud services, hence GSA decided it was time to take the next step. A request for information is scheduled for release in the coming weeks, the GSA official noted, adding that input from industry stakeholders helps the agency understand how it needs to make decisions.
Changes will not happen overnight but will often be incremental. The FAS Commissioner has already warned that efforts to transform the agency’s buying and selling experience are a long-term project. Agencies are working with a set of systems, processes, and experiences built over the last 30 years; in fact, some of the systems are 40 years old today. It is not just about adding a little automation to connect the dots; they have to fundamentally rethink some of these things.
U.S. Administration has developed a new strategy to accelerate agency adoption of cloud-based solutions: Cloud Smart. Cloud Smart equips agencies with actionable information and recommendations gleaned from some of the country’s most impactful public and private sector use cases. Beyond Cloud First, which granted agencies broad authority to adopt cloud-based solutions, Cloud Smart offers practical implementation guidance for Government missions to fully actualise the promise and potential of cloud-based technologies while ensuring thoughtful execution that incorporates practical realities.
The new strategy is founded on three key pillars of successful cloud adoption: security, procurement, and workforce. Collectively, these elements embody the interdisciplinary approach to IT modernisation that the Federal enterprise needs to provide an improved return on its investments, enhanced security, and higher quality services to the American people.
Cloud Smart operates on the principle that agencies should be equipped to evaluate their options based on their service and mission needs, technical requirements, and existing policy limitations. Computing and technology decisions should also consider customer impact balanced against cost and cybersecurity risk management criteria. Additionally, agencies need to weigh the long-term inefficiencies of migrating applications as-is into cloud environments against the immediate financial costs of modernising in advance or replacing them altogether.
As reported by OpenGov Asia, the COVID-19 pandemic revealed how big data, analytics, and cloud technology are being used in the public health sector. Cloud computing can help public health agencies scale up to accommodate the new data load, with architectures that auto-scale and adapt to changing flows. But the systems themselves must also be architected to support the horizontal scaling enabled by cloud computing.
Cyberattacks are a growing threat to critical infrastructure as they become increasingly sophisticated. Hence, Singapore’s Ministry of Defence (MINDEF) is seeking to improve its cyber defences by further training its experts and studying the methods employed by hackers.
MINDEF and the Singapore University of Technology and Design (SUTD) have signed a Memorandum of Understanding (MOU) on Operational Technology (OT) security for critical infrastructure. The MOU formalises the partnership between MINDEF and SUTD on OT security and will strengthen collaboration in areas including research and technology, threat modelling, training and expertise development.
The MOU underscores MINDEF’s and the Singapore Armed Forces’ (SAF) commitment to building up cybersecurity expertise and capabilities against potential OT cyber threats. Recent cyberattacks on critical infrastructure such as fuel pipelines and power distribution systems are stark reminders of the increasingly sophisticated cyber threats that countries face. To this end, MINDEF/SAF is working with partners from academia and industry to collaborate in cybersecurity research and technology.
MINDEF/SAF recognises the importance of working with key partners like SUTD to keep pace with the latest developments in cybersecurity research and technology. This MOU will formalise their collaboration with SUTD to harness their strengths in cybersecurity research to enhance their own capability development projects. It will also support the training and education of their cyber personnel who are tasked to defend the digital borders.
The MOU will cover collaboration between MINDEF/SAF and the SUTD iTrust Centre for Research in Cyber Security (iTrust) in key areas including:
OT systems include computer systems designed to be deployed in critical infrastructure such as power, water, manufacturing, and similar industries. Such infrastructure overseas has recently been hit by hackers. OT infrastructure and enhancements have also been used in projects such as energy-efficient buildings.
As reported by OpenGov Asia, as the digital world becomes a larger part of people’s lives, cyber threats increase in both impact and frequency. The global trend in the cyber landscape is for attacks to be less profit-motivated. To address this, a Singaporean telecommunications provider is launching a new suite of cybersecurity solutions developed in collaboration with an India-based global IT solutions provider. Cybersecurity consulting, incident response, data protection, vulnerability testing, managed firewall, and managed endpoint services are all part of the new suite of solutions.
The offering is primarily aimed at Singapore’s small and medium-sized enterprises (SMEs), with the telco claiming that the new suite comes at a time when nearly half of all reported crimes in Singapore are cybercrime-related. Indeed, the Cyber Security Agency of Singapore (CSA) last month flagged an increase in cyber threats, such as ransomware and online scams, during 2020.
The trial Mobile Money service approved by the Prime Minister will set a precedent for applying a “sandbox” scheme for new services and professions in the digital society. Sandbox is a controlled institutional framework applied to new technologies, products, services, and business models. It is an environment for technology firms to try their new technological apps and business models. After the trial period, management agencies will review the trial implementation and then accept or reject it.
Using laws to set rules to deal with new issues arising from the application of new technologies is a challenge. As per a press release, the apps may have a rapid impact on society that management systems may not be able to keep up with. Many traditional business fields have changed, and businesses have to utilise technology to work more effectively. It is impossible to manage new services and business models within the existing framework because policies tend to lag behind practices. Therefore, a sandbox model is more advantageous.
According to an industry expert, state management agencies cannot be expected to create policies for the future. Many countries apply sandbox policies to encourage enterprises to develop new business models, with certain limitations on deployment. The Prime Minister has put into effect the pilot implementation of Mobile Money services, which allow payments for small-value goods and services with telecom accounts. The pilot programme will last two years.
This is the first service to which the government has applied the sandbox mechanism, managed by several ministries and branches. The government hopes the service will contribute to the development of non-cash payments and promote access to and use of financial services, especially in rural areas. Businesses can only provide Mobile Money to remit money and make payments for legal goods and services in Vietnam in accordance with current laws. Mobile Money applies only to domestic transactions, with a monthly transaction value limit of VND10 million (US$4,397).
Vietnam is not the first country to adopt such a technology platform, but experts said it has the advantage of learning from its predecessors. In Vietnam, the proportion of credit card users is still low, but mobile subscriber density is very high: 99% of transactions with a small value of below VND100,000 (US$4) are carried out in cash. Mobile Money will be a strong solution for promoting non-cash payments in society.
The Minister of Information and Communications stated that Mobile Money is a convincing example that shows that telecom carriers can become platforms for many things, not only telecom infrastructure. They can become platforms for data, computing, digital content, authentication, IT services, and the Internet of Things (IoT).
Mobile Money is expected to help Vietnam become a digital society. The project is the first sandbox involving many ministries and sectors to be piloted to meet the needs of society. It will pave the way for more sandboxes to be applied to other new services and business models in the future. He added that Mobile Money is a great opportunity for mobile network operators to build an ecosystem to accelerate digital transformation.
The National Economic and Development Authority (NEDA) said it is accelerating the implementation of the Philippine Identification System (PhilSys), or the national ID programme, to enhance the government’s ability to deliver various social services. The NEDA chair said that more than 42 million Filipinos had registered as of September for Step 1, the collection of demographic data. Notwithstanding the quarantines, he said nearly 30 million took the second step of supplying their biometrics at the registration centres.
OpenGov Asia reported that the National Economic and Development Authority (NEDA) expects the Philippine Statistics Authority (PSA) to register 50 to 70 million people for the national digital ID by the end of the year. It is noted that as of July 2, 37.2 million people had completed Step 1, which involves the collection of demographic information, and 16.2 million had completed Step 2, which involves the capture of biometrics at designated registration centres.
The impact of the Covid-19 pandemic is challenging, but the Philippines has a solid foundation to recover at the right time. Reforms such as Rice Tariffication Law and the National ID are helping us restore our development trajectory and enabling the people, especially the poor, to access affordable food and better social services. 
– NEDA Chairperson
Speaking on the progress being made by the digital ID project, Socioeconomic Planning Secretary at NEDA said: “The COVID-19 crisis underscores the need to provide unhampered access to banking and social services for all Filipinos, especially the poor. Therefore, the President gave the directive to accelerate the implementation of the Philippine Identification System or PhilSys to provide all Filipinos with a unique and digitalised ID.”
He underlined that the Filipinos, particularly the poor, would be able to open bank accounts where cash transfers can be received directly. “We aim to register at least 50 million Filipinos by the end of this year,” he said.
The pandemic lent new urgency to, and highlighted the primacy of, integrating financial inclusion into government crisis-containment and rehabilitation efforts. It showed the vital role of financial inclusion in social welfare and protection, as transaction accounts became a necessary means for the poorest and most vulnerable in the country to receive government cash support.
As per NEDA’s Chairperson, PhilSys would also facilitate financial inclusion by providing every Filipino with a valid proof of identity, which is required for low-income earners to open bank accounts, receive cash transfers, and access other financial services.
Meanwhile, the Philippines’ state-run bank said it has signed up 5.3 million unbanked PhilSys registrants for their own transaction accounts via account opening booths at select PhilSys co-location areas nationwide. The registrants have already used their prepaid cards for a total of P31.8 million in transactions.
The PhilSys registrants can use the Landbank prepaid cards to manage funds, withdraw cash, perform cashless transactions, shop and pay bills online, and receive government subsidies digitally. As per the president and CEO of the Philippine bank, bringing unbanked Filipinos into the financial mainstream lays the groundwork for inclusive growth, particularly as the country accelerates initiatives toward economic recovery and sustained development. Access to formal banking services motivates people to save money, repay loans, invest in financial products, and achieve financial independence.
Unbanked PhilSys registrants may access bank transaction accounts after completing the PhilSys Step 2 registration process, which includes validating supporting documents and capturing biometrics data. PhilSys registrants can also activate their bank prepaid cards and conduct transactions through the bank’s mobile branches, which are located in communities across the country where banking services are disrupted or limited.
The bank’s mobile branch is intended to serve unbanked and underserved communities as well as areas affected by disasters, calamities, and other disruptive events, as part of the bank’s increased efforts to promote greater financial inclusion in the community.
A research team led by biomedical engineers at the City University of Hong Kong (CityU) has developed a new generation of microneedle patches made of ice that melt after the pain-free delivery of drugs.
Experiments using this ground-breaking invention on mice with cancers have shown that the animals’ immune responses were much better than those seen in conventional vaccination methods. The technology paves the way for developing an easy-to-use cell therapy and other therapeutics against cancers and other diseases.
Made from a cryogenic solution, these icy microneedles are less than 1mm long and can deliver living mammalian cells into the skin. The device is like a skin patch and the microneedles can detach from the patch base, melt and then penetrate the skin.
The research is led by Dr Xu Chenjie, Associate Professor in the Department of Biomedical Engineering (BME), and the findings were published in Nature Biomedical Engineering under the title “Cryomicroneedles for Transdermal Cell Delivery”.
Dr Xu explained that traditional cell therapy for skin disorders is invasive, painful, complicated, and inefficient; it risks infection and requires experienced professionals. The ready-to-use device can circumvent complex and redundant procedures during each drug administration. In addition, it can be stored for months in a refrigerator and is easily transported and deployed.
The applications of this device are not limited to the delivery of cells. It can package, store, and deliver any type of bioactive therapeutic agent, such as proteins, peptides, mRNA, DNA, bacteria, and vaccines, and it can improve both therapeutic efficacy and patient compliance during cell therapies.
As a proof-of-concept, the researchers explored cell-based cancer immunotherapy through the intradermal delivery of ovalbumin-pulsed dendritic cells. Experiments showed that vaccination using therapeutic cells through this technology elicited robust antigen-specific immune responses and provided strong protection against tumours in mice.
These results were superior to the therapeutic outcomes of conventional vaccination methods. One of the start-up teams supported by the Seed Fund of HK Tech 300, CityU’s flagship innovation and entrepreneurship programme, is working on transferring the technology into a product and promoting its application.
Dr Chang Hao, a former postdoc in CityU’s BME, is the first author of this study, and Dr Xu is the corresponding author. Other researchers include Professor Wang Dongan and Professor Shi Peng from BME. The research team collaborated with scientists from Nanyang Technological University and the National University of Singapore.
The cell therapy technologies market is projected to reach US$5.6 billion by 2025, up from US$2.8 billion in 2020, at a CAGR of 14.4%. Emerging economies such as Australia and China are expected to provide a wide range of growth opportunities for players in the market, driven by their large and growing populations as well as an increase in the number of clinical trials and investments in personalised medicine in these countries.
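As a rough sanity check of the quoted projection, compounding US$2.8 billion at a 14.4% CAGR for the five years from 2020 to 2025 lands near the stated US$5.6 billion:

```python
# Consistency check of the quoted market figures (values from the report above).
start_usd_bn = 2.8   # 2020 market size, US$ billion
cagr = 0.144         # 14.4% compound annual growth rate
years = 5            # 2020 -> 2025

projected = start_usd_bn * (1 + cagr) ** years
print(round(projected, 2))  # ≈ 5.49, near the quoted US$5.6 billion
```

The small gap to US$5.6 billion suggests the report rounded its base-year figure or CAGR.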
The outbreak of COVID-19 is expected to have a minimal or negligible negative impact on the cell therapy technologies market. The rise in COVID-19 cases has increased the need for effective drugs and vaccines that could reduce the severity of cases.
Cell-based research is an essential step in vaccine manufacturing, which can support the growth of the market.
In the initial months of the outbreak, supply-chain disruption was witnessed, which delayed clinical trials. This can negatively impact the market to a certain extent; for instance, biopharmaceutical companies and major players have announced clinical trial delays.
A film is not complete without relevant, well-chosen background music. Music establishes atmosphere and mood and influences the audience’s emotional reactions as well as their interpretation of the story. A research team at the USC Viterbi School of Engineering sought to objectively examine the effect of music on cinematic genres. Their study aimed to determine whether AI-based technology could predict the genre of a film from the soundtrack alone.
While past work qualitatively indicates that different film genres have their own sets of musical conventions (the conventions that make a romance film sound different from a horror movie), Narayanan and team set out to find quantitative evidence that elements of a film’s soundtrack could be used to characterise its genre.
The study was the first to apply deep learning models to the music used in a film to see if a computer could predict the genre of a film from the soundtrack alone. They found that these models were able to accurately classify a film’s genre, supporting the notion that musical features can be powerful indicators of how people perceive different films.
This work could have valuable applications for media companies and creators in understanding how music can enhance other forms of media. It could give production companies and music supervisors a better understanding of how to create and place music in television, movies, advertisements, and documentaries in order to elicit certain emotions in viewers.
In their study, the team examined a dataset of 110 popular films released between 2014 and 2019. They used genre classification listed on the online database of information related to films to label each film as action, comedy, drama, horror, romance, or science-fiction, with many of the films spanning more than one of these genres.
They then applied a deep learning network that extracted the auditory information, like timbre, harmony, melody, rhythm, and tone from the music and score of each film. This network used machine learning to analyse these musical features and proved capable of accurately classifying the genre of each film based on these features alone.
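The pipeline described above, extracting auditory features and then classifying the genre from them, can be sketched in miniature. This is a toy illustration only, not the study’s actual deep network: it uses two hand-picked features (spectral centroid, a timbral “brightness” measure, and RMS energy) with a simple nearest-centroid rule, and the tones, labels, and parameters are all invented for the demo.

```python
import numpy as np

def timbral_features(signal, sr=22050, frame=2048):
    """Crude stand-ins for the timbral/tonal cues the study found predictive:
    spectral centroid (brightness) and RMS energy of one audio frame."""
    windowed = signal[:frame] * np.hanning(frame)  # window to limit spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(frame, d=1.0 / sr)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9)
    rms = np.sqrt(np.mean(signal[:frame] ** 2))
    return np.array([centroid, rms])

def fit_centroids(X, y):
    """Nearest-centroid 'training': mean feature vector per genre label."""
    return {g: X[y == g].mean(axis=0) for g in np.unique(y)}

def predict(centroids, x):
    """Assign the genre whose feature centroid is closest."""
    return min(centroids, key=lambda g: np.linalg.norm(x - centroids[g]))

# Toy data: a noisy low tone stands in for a "horror" cue,
# a clean bright tone for a "comedy" cue.
rng = np.random.default_rng(0)
sr = 22050
t = np.arange(2048) / sr
tone = lambda f: np.sin(2 * np.pi * f * t)

X = np.array([timbral_features(tone(110) + 0.3 * rng.standard_normal(2048)),
              timbral_features(tone(1760))])
y = np.array(["horror", "comedy"])
centroids = fit_centroids(X, y)
print(predict(centroids, timbral_features(tone(1500))))  # a bright query tone
```

A real system would extract many more features (harmony, melody, rhythm) over every frame of the film’s audio and learn the mapping with a deep network, but the shape of the computation is the same.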
The team also interpreted these models to determine which musical features were most indicative of differences between genres. The models didn’t give specifics as to which types of notes or instruments were associated with each genre, but they were able to establish that tonal and timbral features were most important in predicting the film’s genre.
The researchers examined the auditory information from each film using a technology known as audio fingerprinting. This technology allowed them to look at where the musical cues happen in a film and for how long. Using audio fingerprinting to listen to all of the audio from the film allowed them to overcome a limitation of previous film music studies, which usually just looked at the film’s entire soundtrack album without knowing if or when songs from the album appear in the film.
In the future, the team is interested in taking advantage of this capability to study how music is used in specific moments in a film and how musical cues dictate how the narrative of the film evolves over its course.
AI has been adopted in various areas, including healthcare. As reported by OpenGov Asia, U.S. scientists have developed a new, automated, AI-based algorithm that can learn to read patient data from Electronic Health Records (EHR). In a side-by-side comparison, the scientists showed that their method identified patients with certain diseases as accurately as the traditional, “gold-standard” method, which requires much more manual labour to develop and perform.
Researchers from the Indian Institute of Technology in Bombay (IIT-Bombay) have developed a new data-processing technique to measure low amounts of soot accurately. This will help designers build better combustion-based devices such as internal combustion engines in cars.
Soot consists of the tiny black particles that rise from a flame; it forms when fuel does not burn completely. When fuel burns properly, the flame is blue, whereas a yellow flame indicates that soot is forming and glowing hot during burning. Soot can cause cancer and respiratory and cardiac disorders, and it can also reduce the life of machine parts, a news report explained.
Accurately measuring small amounts of soot is a challenge and has spawned several research projects. The team from IIT-Bombay demonstrated a new technique that effectively reduces measurement errors when soot is present in low amounts. They analysed digital camera images of burning fuel to estimate the temperature of the fuel and used that information to estimate the soot volume. The amount of soot can be measured by methods such as collecting and weighing the soot, or studying a light beam shone on the soot particles; the current study uses the latter. The researchers passed a beam of red laser light of a specific frequency through a droplet of burning fuel and took images as it burnt. Because the light falling on the camera also contains light emitted by the burning fuel, the researchers used a narrow-band filter to let only the laser light pass and filter out the flame’s own emission.
The report noted that when light, called background light, is shone through a flame containing soot particles, the particles absorb and scatter some of it, so the light reaching the camera is less bright. The researchers used the relation between the initial brightness of the laser light, the brightness of the light falling on the camera, and the soot volume to calculate the amount of soot. They then used a data-processing technique to compute the brightness values from their images. Their challenge was to estimate the initial brightness of the background light falling on the soot particles, since this is not directly captured in the images.
The team predicted the brightness of the background light at every moment instead of using an average. They observed the flicker of the background light in regions outside the flame, where there is no soot, and used it to estimate the background light falling on the soot particles. With the new data-processing technique, the team obtained lower errors, especially when the amount of soot produced is low. The technique requires no additional equipment or extra expenditure, an added advantage.
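The brightness relation described above is, in essence, a Beer-Lambert-style attenuation inversion: the dimmer the laser appears behind the flame, the more soot lies along the light path. The sketch below is an illustrative reconstruction under stated assumptions, not the study’s actual code; the wavelength, extinction coefficient, path length, and pixel values are all placeholder numbers chosen for the demo.

```python
import numpy as np

# Illustrative constants (assumptions, not values from the IIT-Bombay study):
WAVELENGTH = 635e-9   # red laser wavelength, metres
K_EXT = 4.9           # dimensionless soot extinction coefficient (literature-typical)

def soot_volume_fraction(I, I0, path_length, wavelength=WAVELENGTH, k_ext=K_EXT):
    """Invert transmitted brightness I against incident brightness I0 over a
    given optical path length (m) to get soot volume fraction (dimensionless)."""
    transmittance = np.clip(np.asarray(I, dtype=float) / I0, 1e-12, 1.0)
    return -wavelength * np.log(transmittance) / (k_ext * path_length)

# Per-frame incident brightness I0 is estimated from flickering pixels outside
# the flame (no soot), as the study does, rather than from one averaged value.
frames_outside = np.array([200.0, 204.0, 198.0])  # background pixels, per frame
frames_through = np.array([196.0, 199.0, 195.0])  # pixels seen through the flame
fv = soot_volume_fraction(frames_through, frames_outside, path_length=1e-3)
print(fv * 1e6)  # soot volume fraction in parts per million, one value per frame
```

Note how small the brightness drop is for low soot loads (a few counts out of ~200), which is exactly why per-frame estimation of the background brightness matters: an averaging error of a couple of counts would swamp the signal.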
The report added that to further reduce errors in the experiment, the researchers passed the laser light beam through a fixed and a rotating diffuser — a glass sheet that scatters light — before the light was incident on the burning fuel. A diffuser gives an evenly bright light and avoids the many speckles in the camera image. Speckles need to be removed while processing the data, leading to a loss of information. The researchers also validated their data processing technique. They used it to calculate the amount of soot for some previous measurements reported in the literature and verified the results. They also qualitatively checked their experimental observations.
They burnt a droplet of toluene (a carbon-based fuel) and compared their experimental observations with those in the literature. The team observed a similar peak value for the amount of soot. As expected, they saw high amounts of soot slightly inside the outer edges of the flame, where temperatures and fuel concentration are high, a researcher explained. The quantification of soot is crucial from an environmental perspective, and this is an effective method to quantify soot and help identify strategies to mitigate emissions from combustion-based practices in India.
