
Governments and enterprises, in an unprecedented period in history, have been compelled to accelerate and bring forward their digital transformation strategies. The pandemic has vaulted governments and businesses into the next stage of digital transformation and online services.
Personalisation, efficiency and effective services are only possible with a comprehensive, 360-degree view of citizens and customers. This understanding is built on and powered by data. More than ever, data has become integral for organisations seeking to get ahead of the curve, gain a competitive advantage and engage their customers more effectively.
To become a truly data-driven organisation that operates in real-time, agencies must deploy multiple modernisation initiatives, including application modernisation, artificial intelligence, machine learning, cloud, edge computing and analytics.
In that regard, Singapore has taken the lead in championing the use of data. Singapore has unveiled two new programmes to drive the adoption of artificial intelligence (AI) in the government and financial services sectors. It also plans to invest another SG$180 million ($133.31 million) in the national research and innovation strategy to tap the technology in key areas, such as healthcare and education.
The fund is on top of the SG$500 million ($370.3 million) the government has already set aside in its Research, Innovation and Enterprise (RIE) 2025 Plan for AI-related activities, the Smart Nation and Digital Government Office (SNDGO) said in a statement in November 2021.
These investments have been earmarked to support research in areas that address challenges of AI adoption, such as privacy-preserving AI, and areas of societal and economic importance, including healthcare, finance and education. The funds will also facilitate research collaborations with industry to drive the adoption of AI.
The future lies in harnessing data to deliver more effective and personalised services, and the government has signposted that future with its policies. Agencies need a platform that draws together disparate applications, systems and teams, with data as the backbone, making it easier to gain actionable insights. This platform should be able to unlock and repurpose existing data for countless modern applications and use cases securely and efficiently.
The first day of the OpenGov Leadership Forum focused on unpacking the importance of data in empowering the public and private sectors to power mission outcomes, better serve citizens, ensure security and compliance, enhance IT efficiency and maximise productivity.
Morning Session
Powering a new world reality through data

Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, kicked off the session with his opening address.
“We are in the age of the Metaverse,” Mohit opens. “Cryptocurrency was once viewed with suspicion – banks denounced it and people called it a hoax. Yet in 2022, it has become the currency of the future.”
The fact is that the world is rapidly changing and there is a need to stay ahead of the curve and stay relevant – and people are capable of accelerating things. Organisations were able to rapidly change governance and personalise information for customers and citizens.
Today, responsive citizen engagement is more important than ever. Organisations can deliver faster, more personalised and interactive experiences for citizens and other agency stakeholders with event streaming. “Data, and universal access to it, is the key to transforming organisations,” Mohit asserts.
Citing the example of Regeneron, Mohit points out that after developing a COVID-19 treatment in mere months, Regeneron adopted a data catalogue and is developing a data governance framework to speed up its drug development pipeline.
“Information and insights are all there only if you have data at the drop of a hat,” says Mohit. Emphatically, he points out that while organisations are talking about the importance of Artificial Intelligence (AI), the key lies in utilising the new technology and truly embracing it.
“Habits are not shifting enough,” says Mohit. “The real challenge is getting people to truly understand how to make decisions through intelligence and not emotions.”
Mohit acknowledges that there are costs involved in having data in real-time but asserts that it is also the future. When the tools, people and technology are aligned, the big question is: what is the next step and how can organisations be more relevant?
In closing, Mohit urges delegates to partner with organisations that can help them strategise ways to leverage data. Data is the most essential ingredient and catalyst of our time and partnerships can allow organisations to transform their operations. Experts can assist organisations in delivering responsive citizen engagements and making their digital transformation journey smoother, cost-effective and impactful.
The value of connected data in digital transformation

Robin Fong, Regional Director – ASEAN, Neo4j spoke next on the use of graph data to contextualise and reveal connections, especially indirect connections among dispersed data.
“Having data and being able to run business intelligence and analytics is not enough,” Robin comments. “The next step is to be able to identify relationships.”
Data relationships create context and interrelationships create structure. Graph data adds context by capturing and storing relationships natively and processing them efficiently. That is how knowledge is created – when data is contextualised.
Governments and enterprises have been amassing lots of data and allocating large budgets to store them. It is now time to make sense of all the collected data to uncover hidden gems of insights, knowledge and wisdom by connecting them in a graph data platform.
Connectivity and networks require organisations to move from collected data to connected data, he explains. While organisations can take data and add some basic organising principles to create a knowledge base, the context is shallow and quickly ages because the underlying infrastructure is not built for relationships. However, if organisations can combine data, semantics and a graph structure, they will end up with a knowledge graph that has dynamic and very deep context because it is built around connected data.
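To ground the idea of natively stored relationships, below is a minimal sketch using the official Neo4j Python driver. It assumes a locally running Neo4j instance with hypothetical credentials; the labels, properties and relationship types are illustrative and not drawn from any deployment Robin described.

```python
# Minimal sketch: storing and querying connected data with the Neo4j Python driver.
# Assumes a Neo4j instance at bolt://localhost:7687 with hypothetical credentials;
# the labels, properties and relationship types below are illustrative only.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def build_and_query(tx):
    # Capture the relationships natively: a citizen submits a request handled by an agency.
    tx.run(
        "MERGE (c:Citizen {id: $cid}) "
        "MERGE (a:Agency {name: $agency}) "
        "MERGE (c)-[:SUBMITTED]->(:Request {topic: $topic})-[:HANDLED_BY]->(a)",
        cid="C-001", agency="Housing Board", topic="grant application",
    )
    # Contextual query: which agencies are connected to this citizen through a request?
    result = tx.run(
        "MATCH (c:Citizen {id: $cid})-[:SUBMITTED]->(:Request)-[:HANDLED_BY]->(a:Agency) "
        "RETURN a.name AS agency",
        cid="C-001",
    )
    return [record["agency"] for record in result]

with driver.session() as session:
    print(session.execute_write(build_and_query))  # e.g. ['Housing Board']
driver.close()
```

Because the SUBMITTED and HANDLED_BY relationships are stored as first-class data, the traversal in the second query follows them directly rather than reconstructing the link through join tables.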
Robin shares that Neo4j is in the business of helping the world make sense of data. In fact, it founded the graph data category and is the world’s leading Graph Data Platform, adopted by thousands of organisations globally. Moreover, Neo4j is the only Graph Data Platform vendor in the GovTech bulk tender for Data Science & AI.
Graphs are not new, Robin acknowledges. They are already deployed in many situations today among leading companies – in banking and e-commerce. Graphs are used extensively across a wide range of sectors and use cases – fraud detection, supply chain management, customer experience, compliance and privacy management, personalisation and recommendations, employee, customer, patient or product 360, medical research and cybersecurity.
Doctor.ai is a great use case from the healthcare industry: Neo4j powers its voice chatbot, developed with NUHS-NUS for the Singapore Healthcare AI Datathon and EXPO 2021.
For patients, graph data enables fast access to their private health records, monitors health and provides advice, while also creating alerts and booking doctor appointments. For doctors, it enables quick access to patients’ health histories, assists in the decision-making process, makes machine-learning predictions and pushes the newest research.
Robin strongly suggests delegates consider potential business problems and use cases where connected data (graph technology) may be useful and relevant. He outlines steps on how organisations can get started and encourages delegates to contact Neo4j for a Discovery Workshop.
Before bringing the presentation to an end, he invited delegates to connect with him and the team if they would like to explore ways Neo4j can help and support agencies in transforming their organisation.
Harnessing data in motion to establish a Smart Government

Damien Wong, Vice President, Asia Pacific & Japan, Confluent elaborated on the use of data in motion and event streaming in powering smart governments.
Event streaming is a real-time infrastructure revolution that is fundamentally changing how governments think about data and build applications. Rather than viewing data as stored records or transient messages, data can be considered a continually updating stream of events. Event-driven architecture is the future of data infrastructure.
“The world is changing,” Damien opines. “The world has changed for the current generation because technology is shaping how businesses need to respond to these changing expectations. The younger generation has never walked into a bank branch, and likely will never understand why anyone would ever need to do so since everything can be done online today.”
Most organisations today are “becoming software.” Ride-hailing, he said, was an excellent example. Not too long ago, when people needed a taxi, they would call a dispatch service, wait for the ride to be confirmed and look out for the vehicle to arrive – with no information on how long the taxi would take or the ETA to the destination. Today, all that information is given almost instantaneously on apps.
Software is now the interface, Damien is convinced. It is not that software was not there before; rather than being an adjunct to the business, it has become the business. However, to make this transition, organisations have had to move on from relying solely on traditional data architectures. The new architecture needs to be fast and responsive, and batch processing has moved to real-time processing.
“Data systems need to be connected not treated in silos,” Damien emphasises. “In the new reality, services would be fast, in real-time and connected.”
This transformation is happening everywhere, and it is drastically causing people to rethink their approaches and systems:
- Cloud: Rethinking Data Centres
The cloud has changed how organisations think about data centres and running technical infrastructure. Today, every company is moving to the cloud.
- Machine Learning: Rethinking Decision Making
Machine learning has changed how decisions are being made, and this happens increasingly in an automated manner, driven by software that communicates to other software.
- Mobile: Rethinking User Experience
Mobile devices and internet connectivity have dramatically changed the user experience of how customers interact with organisations and have raised the bar for expectations.
- Data in Motion: Rethinking Data
Event streaming has changed how people think about and how people work with the data that underlies all the other trends.
“Data in Motion is the central nervous system for today’s enterprises,” he asserts. “And Apache Kafka is the event streaming technology powering Data in Motion.”
For Damien, the traditional use of data at rest is to consolidate data into a warehouse and apply analytics. Data in motion is, on the other hand, understanding the predefined actions that will be taken when encountering a specific event or data stream.
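As a concrete illustration of data in motion, here is a minimal sketch using the confluent-kafka Python client. It assumes a broker at localhost:9092; the topic name and event payload are invented for illustration and are not drawn from any agency’s deployment.

```python
# Minimal data-in-motion sketch with the confluent-kafka Python client.
# Assumes a Kafka broker at localhost:9092; topic name and payload are illustrative.
import json
from confluent_kafka import Producer, Consumer

# Publish an event as it happens, rather than batching it into a nightly load.
producer = Producer({"bootstrap.servers": "localhost:9092"})
event = {"service": "licence-renewal", "citizen_id": "C-001", "status": "submitted"}
producer.produce("citizen-service-events", key=event["citizen_id"], value=json.dumps(event))
producer.flush()

# A downstream service reacts to each event in near real time.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "notification-service",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["citizen-service-events"])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print("Reacting to event:", json.loads(msg.value()))
consumer.close()
```

The point of the pattern is that the consumer acts on each event as it arrives instead of waiting for a batch load into a warehouse.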
The rise of event streaming can be traced back to 2010 when Apache Kafka was created by the future Confluent founders in Silicon Valley. From there, Kafka began spreading throughout Silicon Valley and across the US West Coast. In 2014, Confluent was created to turn Kafka into an enterprise-ready software stack and cloud offering, after which the adoption of Kafka started to accelerate. Today, tens of thousands of companies across all kinds of industries the world over are using Kafka for event streaming.
If Kafka is the engine (the core technology), then Confluent is the ready-to-use product built around it. Confluent is a natural candidate for real-time operations like command and control, cybersecurity and other anomaly-detection solutions. It can enable an event-driven architecture that helps modernise IT applications and hasten the addition of new citizen services or capabilities. Beyond that, as the data infrastructure for data in motion, Confluent helps organisations move towards multi-cloud, hybrid-cloud and DR operations.
In conclusion, Damien encouraged delegates to consider some questions as they navigate through the paradigm shift:
- Are you looking to become a real-time smart agency? If so, how mature are you in leveraging data-in-motion platforms to support this?
- What are some of the use cases you’re implementing around this?
- Are there challenges that are holding you back from successfully making this transformation?
Damien affirmed the need for organisations to embrace the importance of real-time data if they want to stay relevant. Data in Motion is the ultimate key when it comes to delivering better services and empowering business missions.
Polling Results
Throughout the session, delegates were polled on different topics.
In the first poll, delegates were asked to vote on their priority in 2022. Half of the delegates indicated digital acceleration as their priority, followed by workforce transformation (33%) and tech modernisation (17%).
On what their biggest challenge was, a plurality of the delegates (35%) indicated the lack of skilled staff who understand big data analysis. The remaining votes were split between the lack of quality data and proper data storage (30%), not being able to synchronise disparate data sources (15%), not being able to derive meaningful insights through data analytics (15%) and the inability to get voluminous data onto a big data platform (5%).
Concerning the maturity of their organisations in using data and analytics, 38% indicated that their organisations use performance dashboards to slice, dice and drill down. Other delegates indicated that they distribute static reports regularly (24.8%), combine data with predictive modelling, AI and machine learning techniques (24%) or use self-service analytics (14%).
The delegates were asked if they were familiar with the advantages of graph technology and how it could enhance their daily decision-making process. Just over half (53%) were familiar but are currently not using it, while about a quarter (26%) were not familiar but interested to know more. The remaining delegates are familiar with and currently using the technology (21%).
On the common Data Integration/Connection challenge faced by delegates, most (35%) indicated disparate data formats and sources as the main challenge, while others expressed that low-quality or outdated data (29%) was. The remaining delegates face the challenge of data that isn’t available where it needs to be (24%), followed by the issue of having too much data (12%).
With regard to processing real-time data, most (65%) felt that they were emergent (some processes and knowledge, non-standardised), followed by limited (ad-hoc, unstructured, uncontrolled, reactive) at 29%, and structured (standardised, governance, scale, proactive) at 6%.
When asked what would be important for a successful AI adoption in their organisation, an overwhelming majority (94%) indicated that starting small and building the business case by demonstrating initial wins would be important. The remaining delegates indicated aligning all departments on a single vision and garnering support (6%).
Asked about the essential tenet for ethical AI to work, most delegates (40%) believe in the need for an effective and practical ethical framework/governance model for AI. The other delegates were split between AI solutions that should allow for auditability and traceability (26.7%), guaranteeing privacy by design in machine learning systems (26.7%), and the importance of training AI models with carefully assessed and representative data (6.7%).
In the final poll for the morning session, delegates were asked what they would invest in, if they had an unlimited budget. Just over a third (35%) said they would spend on integrating disparate systems, followed by spending on resources to improve delivery timeline (29%), updating legacy technologies (18%), improving security and compliance (12%) and staff training / upskilling (6%).
Afternoon Session
Data Virtualisation in supporting advanced analytics

Elaine Chan, Regional Vice President Sales – ASEAN & Korea, Denodo spoke about how data virtualisation can help with advanced analytics and cloud modernisation.
As data analytics and data-driven intelligence take centre stage in today’s digital economy, logical data integration across the widest variety of data sources, with proper security and governance structure in place has become mission-critical.
Based on a Denodo Global Cloud Survey 2021, cloud adoption is on the rise with a 25% increase year-over-year in advanced cloud workloads. This indicates that more complex workloads are moving to the cloud and that COVID-19 has perhaps driven that increase.
Today, the hybrid cloud model remains in the lead, with more than one-third of users leveraging that architecture. Private cloud also saw some good gains, with nearly 25% of their workloads still being run on-premises.
One of the key benefits that cloud technologies provide is the ability to scale faster, although performance and ease of data management also provide strong benefits, identified by 31% and 20% of participants, respectively.
Data virtualisation allows for flexibility and access from anywhere and lowers the cost of operations. However, there are also concerns that the transition to the cloud might create new data silos, along with security and latency issues.
Elaine believes that there is a need for a logical data architecture. “Data Fabric is the best path to data management automation,” Elaine opines. In layman’s terms, it can be broken down as follows:
- “Integrate data” from disparate data sources, on-prem and in the cloud
- Securely deliver an “integrated view” of the different data objects
- Consume the “integrated data” for analytics and operational purposes
- Automate the entire process using AI/ML
According to Elaine, the Denodo logical data fabric sits between the data sources and the consumers and has a few key characteristics:
- Unified Data Integration and Delivery
- Allows reusing existing analytics systems
- Allows using the best system for each need
- Abstraction: No Lock-In
- Evolve / Optimise infrastructure without affecting data consumers
- Dramatically Increased Productivity
- Minimise data replication: virtual or smart, selective data replication
Breaking down the essential capabilities of data virtualisation, Elaine highlights six aspects (a conceptual sketch follows the list below):
- Data Abstraction: Decoupling applications and data usage from data sources and infrastructure
- Zero Replication, Zero Relocation: Physical data remains where it is
- Real-Time Information: Most reporting and analytical tools can easily connect for real-time data
- Self Service Data Marketplace: A Dynamic Data Catalogue for self-service data discovery and data services available in the virtualisation layer
- Centralised Metadata, Security & Governance: Manage access across all data assets in the Virtualisation layer for enterprise data security and supports dynamic data anonymisation
- Location-agnostic Architecture: For hybrid and multi-cloud acceleration
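The conceptual sketch below illustrates data abstraction and zero replication in plain Python. It is not Denodo’s API or product behaviour, only a toy “virtual view” that joins two stand-in sources at query time so the physical data stays where it is.

```python
# Conceptual sketch of a logical "virtual view" (not Denodo's actual API):
# two sources are joined at query time, so nothing is replicated or relocated.
import sqlite3

# Source 1: an operational database (an in-memory SQLite table stands in for it).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE citizens (id TEXT, name TEXT)")
db.execute("INSERT INTO citizens VALUES ('C-001', 'Tan Mei Ling')")

# Source 2: an external service, stood in for by a plain Python structure.
benefit_claims = [{"citizen_id": "C-001", "scheme": "childcare subsidy", "amount": 300}]

def virtual_view(citizen_id):
    """Combine both sources on demand; consumers never see where the data lives."""
    row = db.execute("SELECT name FROM citizens WHERE id = ?", (citizen_id,)).fetchone()
    claims = [c for c in benefit_claims if c["citizen_id"] == citizen_id]
    return {"citizen": row[0] if row else None, "claims": claims}

print(virtual_view("C-001"))
```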
Delving into the use case of Statistics Netherlands, Elaine elaborated on the requirements that the data management team was looking for:
- Create tailored reports for government agencies that want to change public policies for people who need extra support
- Add new data sources without affecting the continuity of other public service agencies and at the same time making them more agile in the process
- Expand the data services supporting more teams without increasing infrastructure costs for storage and servers.
With Denodo, the logical data warehouse created using data virtualisation enabled Statistics Netherlands to create one access point to explore and access all data, bringing data to its fingertips. It also created a self-service culture for data consumers that is easy to use, while enabling Statistics Netherlands to implement security and governance by centralising authentication and authorisation.
Summing up the presentation, Elaine pointed out that good infrastructure must be in place to support more advanced analytics. Data virtualisation helps to complete enterprise information, combining web, cloud, streaming and structured data. It promises ROI realisation within six months, the flexibility to adjust to unforeseen changes, and an 80% reduction in integration costs, in terms of resources and technology. Most importantly, there is real-time integration and data access, enabling faster business decisions.
She encouraged delegates to reach out to her directly if they have any queries about the journey towards data virtualisation.
Generating incisive insights through Graph technology

Tony Tan, Co-Founder & Deputy Chief Executive Officer, Imperium Solutions spoke about why graph analysis is possibly the single most effective competitive differentiator for organisations pursuing data-driven operations and decisions after the design of data capture.
“Optimising the supply chain is tricky,” Tony opens. “Even after more than 50 years, billions invested in R&D, the building of complex ERP systems and advancements in operations management, we are still facing supply chain problems.”
For example, in Singapore many people like to own cars, but recent BMW models do not come with touch screens, satellite radios, digital keys or the stop-start engine. Manufacturers are good at supplying first-tier suppliers, but many of the problems lie further downstream.
This begs the question – is there a technology that will bring everyone closer to solving issues? And if so, where are the opportunities in the bottlenecks? Tony believes that creating a breakthrough in solving recurring issues requires methods outside of what has been tried.
Drawing a parallel to the issues in the supply chain, Tony says that fraud has many facets. PwC published a report last year based on a survey of over 5,000 respondents conducted between 2019 and 2020, which reported that US$42 billion was lost to fraud.
The majority of this stems from four types of fraud – customer fraud, cybercrime, asset misappropriation and bribery/corruption. However, there are others, such as accounting fraud, procurement fraud, deceptive business practices, AML/sanctions breaches, tax fraud, IP theft and anti-trust violations. The problems are aplenty, Tony claims, and take significant time and energy to resolve.
Problems also abound in the metaverse. Blockchain is here to stay, Tony feels, and decentralised finance enables the open and transparent exchange of digital currency. However, such a new, unregulated system can also be a breeding ground for criminals and hackers, ripe for exploitation. With scammers on the rise, there is a need to establish relationships between users to identify scammers more efficiently.
Tony acknowledges that technology is key to solving many of the issues that companies face – and data is at the centre of it. The operations of major companies such as LinkedIn, Google and Netflix, as well as the largest bank in the US, are powered by graph technology. Gartner lists it among its top eight technologies for the near future.
With graph technology, relationships between data points are established, which enables people to swiftly locate information and redefines the way we look at data today. It allows going deep into relationships and can be used for a variety of problems and domains (a simple sketch of the idea follows the list below), such as:
- Companies, markets
- Countries, history, politics
- Sciences, art, teaching
- Technology, networks, machines, applications, users
- Software, code, dependencies, architecture
- Criminals, fraudsters, terrorists
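The sketch below uses the open-source networkx library with made-up accounts and identifiers to show, in miniature, how connected data surfaces a possible fraud ring. It illustrates the general idea rather than TigerGraph’s implementation.

```python
# Sketch of connected-data fraud analysis with networkx (illustrative data only):
# accounts that share a phone number or device form a suspicious connected group.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("acct_1", "phone_555"), ("acct_2", "phone_555"),  # two accounts, one phone number
    ("acct_2", "device_9"), ("acct_3", "device_9"),    # one of them also shares a device
    ("acct_4", "phone_777"),                           # an unrelated account
])

# Each connected component is a cluster of accounts linked by shared identifiers.
for component in nx.connected_components(G):
    accounts = sorted(n for n in component if n.startswith("acct_"))
    if len(accounts) > 1:
        print("Possible fraud ring:", accounts)  # ['acct_1', 'acct_2', 'acct_3']
```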
TigerGraph is currently deployed by 8 of the largest banks in the world, including Goldman Sachs, JPMorgan, Bank of America, and ICBC (China).
“The time to use graph is today,” Tony says. To face mounting challenges, there is a real need to harness the insights through graph technology which can amplify the connected data.
Polling Results for Afternoon Session
Throughout the afternoon session, delegates were polled on different topics.
In the first poll, delegates were asked to vote on their priority in 2022. Most of the delegates (48%) indicated digital acceleration as their priority, followed by tech modernisation (30%) and workforce transformation (22%).
When asked about what their biggest challenge is, a third (33%) indicated the lack of skilled staff who understand big data analysis as the biggest challenge. The remaining votes were distributed between not being able to synchronise disparate data sources (29%), the lack of quality data and proper data storage (17%), not able to derive meaningful insights through data analytics (8%) and the inability to get voluminous data onto big data platform (13%).
On organisational maturity in using data and analytics, a majority (41%) indicated that their organisations use performance dashboards to slice, dice and drill down. Other delegates have embedded visualisation into their processes and transactional systems (23%), distribute static reports regularly (18%), combine data with predictive modelling, AI and machine learning techniques (12%) or use self-service analytics (6%).
The delegates were also asked if they were familiar with the advantages of graph technology and how it could enhance their daily decision-making process. Most (40%) were familiar but are currently not using it, while others are familiar and currently using the technology (33%) and the rest were not familiar but interested to know more (27%).
On the common Data Integration/Connection challenge faced by delegates, just over half (52%) indicated disparate data formats and sources as the main challenge, while others (18%) expressed that low-quality or outdated data was. The remaining delegates face the challenge of data that isn’t available where it needs to be (12%), followed by the issue of having too much data (12%) and the use of wrong integration software (6%).
On the maturity of their organisations in processing real-time data, the majority (44%) felt that they were emergent (some processes and knowledge, non-standardised). The rest were split between limited: ad-hoc, unstructured, uncontrolled, reactive (28%), and structured: standardised, governance, scale, proactive (28%).
When asked what would be important for a successful AI adoption in their organisation, a large majority (65%) indicated that starting small and building the business case by demonstrating initial wins would be important. The remaining delegates were split between aligning all departments on a single vision and garnering support (17.5%) and establishing clear lines of authority and ownership across the entire organisation (17.5%).
Asked about the essential tenet for ethical AI to work, about half (52%) believe in the need for an effective and practical ethical framework/ Governance model for AI. The others were split between the belief that AI solutions should allow for auditability and traceability (22%), the importance of training AI models with carefully-assessed and representative data (17%) and guaranteeing privacy by design in machine learning systems (9%).
In the final poll for the afternoon session, delegates were asked what they would invest in if they had an unlimited budget. Votes were split mainly between spending on resources to improve the delivery timeline (36%) and updating legacy technologies (35%), followed by integrating disparate systems (21%) and staff training/upskilling (8%).
Closing
To conclude the day, Mohit emphasised the importance of understanding and harnessing data to derive insights that will help organisations stand out among competitors. Data is the key to the future and can help improve services in an increasingly data-driven world.
The National Heart Centre Singapore (NHCS) has been on a remarkable journey of advancements in cardiovascular research, particularly in the prevention, diagnosis, and management of heart diseases. With the global rise in heart disease cases, NHCS’s dedication to scientific knowledge and innovation has become increasingly vital.
Since its establishment in 2014, the National Heart Research Institute of Singapore (NHRIS) at NHCS has positioned itself as a leading institution for cardiovascular research in the region. Over the years, NHRIS has achieved significant breakthroughs that hold the potential to transform patient outcomes.
NHRIS’s research encompasses a wide spectrum of disciplines within cardiovascular medicine, spanning basic, translational, and clinical research. Notable achievements include Heart Stem Cell Therapy and Preventing Fibrosis.
By studying patients’ heart stem cells, researchers have uncovered new treatments for heart diseases. For example, a breakthrough treatment using myeloperoxidase has been discovered for hypertrophic cardiomyopathy, an inherited condition characterised by thickening of the heart muscle.
Also, through the study of heart tissue from patients undergoing surgery, NHRIS researchers have identified a potential treatment involving interleukin-11 antibodies to prevent inflammation and fibrosis in the heart and other organs. This innovative therapy has the potential to improve outcomes for patients with various inflammatory and fibrotic conditions.
The next phase of NHCS’s research efforts over the coming years will focus on three key areas:
- Discovery of New Treatments: Ongoing research aims to develop new treatments for heart diseases, enhancing patient outcomes.
- Utilising Artificial Intelligence: NHCS is at the forefront of integrating artificial intelligence (AI) into cardiovascular care. AI holds promise in predicting, diagnosing, and monitoring heart diseases with greater precision and efficiency. The APOLLO study, initiated in 2021, is building an AI-driven national platform for coronary angiography analysis, offering detailed reports on patients’ conditions and future cardiovascular disease risk.
- Clinical Trials and Population Health Studies: NHCS’s research agenda includes conducting clinical trials and population health studies to prevent the onset of heart disease.
NHRIS is pioneering innovative approaches, including Visualising Energy Pathways and AI Applications.
Disturbances in energy-producing pathways in heart muscle contribute to heart conditions. Hyperpolarised magnetic resonance spectroscopy, a novel imaging technology available in only a few centres worldwide, allows the measurement of these metabolic pathways, potentially leading to new treatments for heart disease.
On the other hand, AI accelerates research in the field of cardiovascular science. By processing vast datasets and identifying patterns, AI systems assist researchers in identifying novel treatment methods, risk factors, and disease mechanisms. These insights lead to breakthroughs in treatment and prevention methods, advancing the overall understanding of cardiovascular diseases.
With this, NHCS is leveraging AI to detect, predict, and diagnose heart diseases by analysing complex imaging data. AI provides clinicians with invaluable insights, enabling personalised care and early intervention.
In addition, NHCS collaborates with other heart research institutes and hospitals through CADENCE (Cardiovascular Disease National Collaborative Enterprise), a national platform that combines heart research capabilities in data science, clinical trials, and AI. This collaboration ensures a collective effort to advance cardiovascular research and improve patient care.
NHCS’s groundbreaking research initiatives in AI applications, clinical trials, and collaborative efforts underscore its commitment to enhancing patient care. As NHCS continues its pursuit of research excellence, its impact extends beyond Singapore, benefiting individuals across the region and around the world. The institution is poised to make substantial progress in preventing, diagnosing, and managing cardiovascular diseases, ultimately reshaping the future of cardiovascular medicine.
An innovative microscope developed by a research team at the Hong Kong University of Science and Technology (HKUST) is poised to revolutionise the field of cancer surgery. This cutting-edge microscope, powered by artificial intelligence, has the potential to transform the way surgeons detect and remove cancerous tissue during operations, thereby sparing patients from the distressing prospect of secondary surgeries.
Lung cancer, a leading cause of cancer-related deaths worldwide, has been a focal point for this ground-breaking research. Professor Terence Wong Tsz-Wai, the principal investigator of the project and an assistant professor in the Department of Chemical and Biological Engineering at HKUST, highlights the urgency of their work.

He notes that between 10% to 20% of lung cancer surgery cases require patients to return for a second operation due to incomplete removal of cancer cells. This uncertainty has long plagued surgeons, who often struggle to determine if they’ve successfully excised all cancerous tissue during the initial surgery.
The HKUST research team, led by Prof. Wong, is eager to see their innovation make a significant impact. Collaborating with five hospitals, including Queen Mary Hospital, Prince of Wales Hospital in Hong Kong, and three mainland Chinese hospitals, they have embarked on a large-scale clinical trial involving around 1,000 patient tissue samples. The goal is to have the microscope officially in service locally by 2024 and on the mainland by 2025.
The current methods for imaging cancer tissue offer either accuracy with lengthy delays or speed at the cost of accuracy. Traditional microscopy, considered the gold standard, is highly accurate but can take up to a week to generate results. This means patients must endure a week of anxious waiting to know the outcome of their surgery. In cases where the operation is deemed unsuccessful, patients face the daunting prospect of a second surgery to remove the remaining cancer cells.
The alternative, known as the frozen section, provides quicker results within 30 minutes but sacrifices accuracy, with an estimated accuracy rate of only around 70%.
The HKUST research team’s breakthrough technology, termed “Computational High-throughput Autofluorescence Microscopy by Pattern Illumination” (CHAMP), has changed this landscape. It can detect cancer cells in just three minutes with an accuracy rate exceeding 90%, rivalling the gold standard but with significantly faster results.
CHAMP employs ultraviolet (UV) light excitation to image tissue surfaces at a specific wavelength. Subsequently, a deep learning algorithm transforms the obtained greyscale image into a histological image, facilitating instant interpretation by doctors. This real-time feedback empowers surgeons to ensure they have completely removed all cancer cells during the operation.
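For readers unfamiliar with this kind of pipeline, the sketch below shows the general shape of an image-to-image deep-learning step in PyTorch: a single-channel greyscale input mapped to a three-channel histology-like output. It is a structural illustration only, with random weights and a synthetic input; it is not the CHAMP model or its architecture.

```python
# Structural illustration only: a toy image-to-image network standing in for the
# deep-learning step described above. It is NOT the CHAMP model; the weights are
# random and the "greyscale UV image" is synthetic.
import torch
import torch.nn as nn

class ToyTranslator(nn.Module):
    """Maps a 1-channel greyscale image to a 3-channel pseudo-histology image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, kernel_size=3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

model = ToyTranslator().eval()
grayscale_uv_image = torch.rand(1, 1, 256, 256)   # stand-in for a UV-excited tissue scan
with torch.no_grad():
    pseudo_histology = model(grayscale_uv_image)  # shape: (1, 3, 256, 256)
print(pseudo_histology.shape)
```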
CHAMP’s potential has garnered local, regional, and international acclaim, leading to the establishment of a start-up supported by HKUST and funded by the Technology Start-up Support Scheme for Universities (TSSSU). Beyond developing the technology, the company plans to manufacture CHAMP microscopes for medical institutions in Hong Kong, mainland China, and overseas markets.
This endeavour represents the culmination of years of meticulous research, starting with Prof. Wong’s PhD training at Washington University in St. Louis and the California Institute of Technology. During this period, Prof. Wong, under the guidance of biomedical imaging expert Prof. Lihong Wang, developed a microscope capable of analysing breast cancer tumours with an accuracy rate comparable to the gold standard but with results in just one to two hours.
The shift in focus to lung cancer occurred when a pulmonologist approached Prof. Wong, recognising the potential of the technology to enhance precision during lung cancer surgery. This decision led to the development of CHAMP microscopy, which is approximately 100 times faster than Prof. Wong’s earlier work during his PhD training. This breakthrough makes CHAMP clinically useful and impactful.
The applications of CHAMP extend beyond lung and breast cancers. The research team is conducting tests on smaller scales for conditions such as liver, colorectal, kidney, and skin cancers, as well as prostate gland conditions. Prof. Wong is confident that CHAMP will elevate medical imaging and diagnosis to new heights, benefiting not only Hong Kong hospitals but also healthcare institutions nationwide and abroad. This pioneering technology represents a beacon of hope for cancer patients, offering the promise of quicker, more accurate surgeries and improved outcomes.
OpenGov Asia reported that the Hong Kong Science and Technology Parks Corporation (HKSTP) spearheaded an initiative aimed at promoting innovation and technology in the biotech sector, showcasing Hong Kong’s pioneering advancements and entrepreneurial spirit.
This initiative was part of the “Think Business, Think Hong Kong” event organised by the Hong Kong Trade Development Council (HKTDC) in Paris recently. The event was a platform to underscore the potential for cross-border collaboration between Hong Kong and France in the field of biotechnology and innovation.
The government has unveiled the Intelligent Grievance Monitoring System (IGMS) 2.0 Public Grievance Portal and Automated Analysis in the Tree Dashboard portal under the Department of Administrative Reforms and Public Grievances (DARPG). It was unveiled by Jitendra Singh, the Union Minister of State (Independent Charge) for Science and Technology.
The IGMS 2.0 Dashboard was developed by the Indian Institute of Technology, Kanpur (IIT-Kanpur) as part of an agreement with the DARPG through a memorandum of understanding (MoU) signed in 2021. It enhances DARPG’s Centralised Public Grievance Redress and Monitoring System Information Systems (CPGRAMS) by integrating artificial intelligence (AI) capabilities. CPGRAMS is an online platform available to citizens round-the-clock to lodge their grievances to the public authorities on any subject related to service delivery.

The dashboard offers instant tabular analyses of both grievances filed and disposed of. It provides data categorised by state and district for grievances filed, and it also offers Ministry-wise data. Additionally, the dashboard can help officials identify the root causes of grievances.
The CPGRAMS portal receives an increasingly high caseload of issues raised by the general public. Given the public’s expectations for the timely resolution of their grievances, the portal receives approximately 2 million grievances annually.
Due to the substantial volume of grievances received, the manual classification and monitoring of cases is not feasible. The IGMS portal will assist the DARPG in generating draft letters for specific schemes or ministries. This automation expedites the grievance redressal process carried out by the respective ministries and departments involved.
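As an illustration of the kind of automation involved, the sketch below trains a toy text classifier with scikit-learn to route free-text grievances to a category. The sample texts, labels and categories are invented; this is not the actual IGMS or CPGRAMS pipeline.

```python
# Illustrative sketch only (not the actual IGMS pipeline): routing free-text
# grievances to a ministry/department with a simple text classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny made-up training set; a real system would train on historical CPGRAMS cases.
texts = [
    "pension payment delayed for three months",
    "passport application stuck at police verification",
    "broken street light not repaired in my ward",
    "income tax refund not received after filing",
]
labels = ["Pensions", "External Affairs", "Urban Affairs", "Revenue"]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(texts, labels)

new_grievance = "my pension has not been credited since June"
print(classifier.predict([new_grievance])[0])  # most likely 'Pensions'
```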
According to Minister Singh, the Prime Minister has repeatedly emphasised the significance of grievance redressal as a crucial element to keep the government accountable and promote citizen-centric governance. In alignment with this vision, a more robust human interface mechanism has been introduced, which includes counselling services provided after the resolution of grievances.
The Minister praised DARPG for ensuring that the CPGRAMS portal is accessible in 22 Scheduled languages, in addition to English, ensuring that the benefits of the portal are accessible to the common man. He also emphasised the importance of integrating state public grievance (PG) portals and other government portals with CPGRAMS for more effective and streamlined grievance redressal processes.
He claimed that thanks to the reforms implemented by DARPG in the CPGRAMS, the average time it takes for central ministries and departments to resolve public grievances has decreased. There has been a decline of almost 50% in the average disposal time for central ministries and departments from 32 days in 2021 to 18 days in 2023.
Minister Singh also launched the Swachhata Special Campaign 3.0 and unveiled the Precedent Book (e-book) developed by the department. He praised the DARPG for achieving the transition to a fully paperless office, where all communication is conducted through the eOffice portal.
During the past two Swachhata campaigns, an impressive 9 million square feet of prime office space has been successfully cleared and repurposed for productive use. Additionally, 456,000 public grievances have been effectively redressed, and 8,998 references from Members of Parliament (MPs) have been addressed. The Swachhata campaign has also played a pivotal role in promoting an eOffice work culture within the government, resulting in over 90% of file work being transitioned to an online format.
Public transportation is a crucial service for enhancing public satisfaction with government services. In light of this, the Indonesian government has established high-speed rail infrastructure for Jakarta-Bandung mobility.
The Ministry of Communication and Information Technology (Kominfo) fully supports the operation of the Jakarta-Bandung High-Speed Train (KCJB), WHOOSH. Kominfo Minister Budi Arie Setiadi said the ministry continuously monitors the availability and reliability of digital connectivity, particularly telecommunications networks, along the first high-speed rail route in Indonesia.
“We, along with the telecommunications ecosystem, conducted tests. Kominfo is tasked with supporting signal-related issues. We assessed the signal quality along our journey and found that we could use devices and frequencies for communication,” he explained.
Minister Budi Arie emphasised that KCJB, as a technological leap for Indonesia’s progress, needs full support from the latest telecommunications technology. With advancements in transportation paralleled by digital technology, it will undoubtedly facilitate more efficient access for the public.
“This is a technological leap for Indonesia’s progress. Because this train is solid, the tracks are seamless, and the signal is robust. Our duty and responsibility are to support it,” he added.
Kominfo assured that the quality of telecommunications services would sustain the overall KCJB service. According to them, the journey from KCJB Halim Station to KCJB Padalarang Station and vice versa proceeded smoothly.
“Overall, the management and governance of the high-speed train are excellent,” he noted.
At this trial event, Minister Budi Arie Setiadi was joined by Deputy Minister of Kominfo Nezar Patria and senior officials from the Ministry of Communication and Information Technology. Minister Budi Arie encouraged the telecommunications service provider network to oversee and guarantee the quality of the network.
Ismail, the Director-General of Resources and Equipment of Posts and Information Technology at Kominfo, explained that the test conducted by Kominfo officials and telecommunications service providers is part of the initial process to support digital connectivity for KCJB. Kominfo has prepared radio frequency spectra for quality telecommunications signal transmission.
“And, fortunately, the signal used, or the frequency used, is now in collaboration with one of the biggest telecommunication companies in Indonesia. This cooperation began about two or three years ago. And, thank God, we witnessed today that the train’s communication system worked well. No signal interruptions,” he stated.
Director-General Ismail states that 5G telecommunication networks are available at Halim KCJB Station and Padalarang KCJB Station. This network supports connectivity and signifies that Indonesia is ready for full-scale and comprehensive digital transformation, even in minor details.
“For these two station locations here (Halim) and in Padalarang, the 5G signal has already been covered. Passengers at these stations can now enjoy 5G services. The remaining task is to improve the signal for passengers during the journey. So, from Jakarta to Padalarang and Bandung, we hope there will be no frequency or cellular signal interruptions,” he explained.
Next, Henry Mulya Syam, the President and Director of the Telecommunication company, stated that they would address several remaining telecommunications service challenges at various points along the KCJB route.
“There are several sites to be added, both outdoor and on the KCJB panel. We have conducted evaluations, so hopefully, within 6 to 9 months, because new towers need to be built,” he clarified.
Previously, together with President Joko Widodo and several members of the Indonesia Maju Cabinet, Minister of Communication and Information Technology Budi Arie Setiadi conducted a test journey on the KCJB from Halim Station, East Jakarta, to Padalarang Station, West Bandung Regency. The KCJB, WHOOSH, travels 350 kilometres per hour, making it the first high-speed train in Indonesia and Southeast Asia.
Oak Ridge National Laboratory (ORNL) has introduced the Centre for AI Security Research (CAISER) to confront the threats stemming from the widespread adoption of artificial intelligence by governments and industries worldwide. The move acknowledges the potential benefits of AI in data processing, operational streamlining and decision-making while recognising the associated security challenges.
ORNL and CAISER will collaborate with federal agencies such as the Air Force Research Laboratory’s Information Directorate and the Department of Homeland Security Science and Technology Directorate. Together, they will conduct a comprehensive scientific analysis to assess the vulnerabilities, threats, and risks associated with emerging and advanced artificial intelligence, addressing concerns ranging from individual privacy to international security.
Susan Hubbard, Deputy for Science and Technology at ORNL, emphasised this endeavour, “Understanding AI vulnerabilities and risks represents one of the most significant scientific challenges of our time. ORNL is at the forefront of advancing AI to tackle critical scientific issues for the Department of Energy, and we are confident that our laboratory can assist DOE and other federal partners in addressing crucial AI security questions, all while providing valuable insights to policymakers and the general public.”
CAISER represents an expansion of ORNL’s ongoing Artificial Intelligence for Science and National Security initiative, which leverages the laboratory’s unique capabilities, infrastructure, and data to accelerate scientific advancements.
Prasanna Balaprakash, Director of AI Programmes at ORNL, emphasised that AI technologies substantially benefit the public and government. CAISER aims to apply the lab’s expertise to comprehensively understand threats and ensure AI’s safe and secure utilisation.
Previous research has highlighted vulnerabilities in AI systems, including the potential for adversarial attacks that can corrupt AI models, manipulate output, or deceive detection algorithms. Additionally, generative AI technologies can generate convincing deepfake content.
Edmon Begoli, Head of ORNL’s Advanced Intelligent Systems section and CAISER’s founding director, emphasised the importance of addressing AI vulnerabilities. CAISER aims to pioneer AI security research, developing strategies and solutions to mitigate emerging risks.
CAISER’s research endeavours will provide federal partners with a science-based understanding of AI risks and effective mitigation strategies, ensuring the reliability and resilience of AI tools against adversarial threats.
The centre will also provide educational outreach and disseminate information to inform the public, policymakers and the national security community.
CAISER’s initial focus revolves around four national security domains aligned with ORNL’s strengths: AI for cybersecurity, biometrics, geospatial intelligence, and nuclear nonproliferation. Collaboration with national security and industry partners is critical to these efforts.
Col Fred Garcia, Director of the Air Force Research Laboratory (AFRL) Information Directorate, expressed confidence in CAISER’s role in studying AI vulnerabilities and safeguarding against potential threats in an AI-driven world.
Moreover, as ORNL celebrates its 80th anniversary, CAISER embodies the laboratory’s commitment to solving complex challenges, advancing emerging scientific fields, and making a global impact. With its established cybersecurity and AI research programmes, ORNL is well-suited to pioneer AI security research through CAISER.
Moe Khaleel, Associate Laboratory Director for National Security Sciences at ORNL, highlighted the laboratory’s legacy of scientific discovery in various fields and emphasised CAISER’s role in scientifically observing, analysing and evaluating AI models to meet national security needs.
The Digital Government Development Agency (DGA) recently updated Thailand’s digital government progress to enhance nationwide digital services. The agency plans to expand its government application to all age groups, with digital services already used more than 400 million times, excluding infrastructure services.
The estimated economic value exceeds 8 billion baht. Their strategy focuses on more accessible, faster, and transparent access to government services, fostering a Smart Connector role. This enhances digital government levels, promoting a Smart Nation and Smart Life for Thai citizens, aligning with their quality of life improvement goals. Dr Supot Tiarawut, Director of DGA, presented these 2023 mission results, emphasising their commitment to effectively serving citizens, businesses, and government entities.
At the Government-to-Citizens (G2C) level, the DGA has linked over 112 government services via the government application, functioning as a comprehensive government SUPER APP. This app integrates services from various government agencies to address citizens’ needs effectively. It boasts more than 112 services, with over 7.5 million cumulative users and 607,041 downloads. This offers citizens a convenient single-channel solution for accessing government services, streamlining the process for all age groups and reducing the complexities associated with traditional government service usage. The plan for 2024 involves introducing critical services such as personal land tax checks, insurance information (Life/Non-Life), and interest payment services (pawning).
The Government Open Data Centre elevation aims to provide high-quality open datasets that cater to the populace’s needs and serve software developers, enabling their appropriate and optimal utilisation. This strategic move aims to enhance future competitiveness. Currently, there are 10,226 open datasets with 3,871,796 users.
The plan for 2024 includes boosting information exchange and utilisation among the public, private, and international sectors. Additionally, the Digital Transcript project, which offers digital transcripts, enhances convenience for students, reduces financial burdens, eases document verification processes for staff, and trims university expenditure on document issuance. This initiative has already produced over 1 million cards across 82 universities nationwide.
The DGA promotes transparency and public engagement through the central legal system, where the government seeks general feedback on law drafts and assesses their effectiveness. Over 1,000 regulations have been open for public comment, with 191,683 submissions. Additionally, the Tax Pai Pai system, providing government expenditure data, enhances public participation in monitoring corruption, with 16,187,604 projects disclosed.
In the G2B sector, the Biz Portal streamlines government-business interactions, benefiting SMEs. Over 124 government licenses have been obtained by 15,881 active operators, simplifying business startup processes. The Digital Entrepreneur Centre for Government Agencies (Me-D e-Marketplace) lists 595 digital technology entrepreneurs from various agencies for government procurement.
In G2G collaboration, the DGA enhances data sharing through the Government Data Exchange Centre (GDX), linking 13 agencies through 74 service data APIs with 133.44 million data exchanges. The Digital Government Personnel Development Institute (TDGA) has already benefited over 1,942,443 individuals, with plans to expand to local-level staff in 2024, offering region-specific digital courses and on-site training through the system with over 300,000 learners.
The Digital Local System is a crucial initiative, a cornerstone of local-level digital government adoption. It streamlines the administration and services of 659 Local Administrative Organisations, incorporating systems from 117 agencies. This enhances service provision, making it accessible and convenient nationwide, ultimately improving people’s quality of life in various regions.
During a visit to Bang Saray Subdistrict Municipality in Chonburi Province, the DGA observed the successful Digital Local System pilot project, which enables convenient access to services, reducing the need for physical visits to government offices and improving efficiency and cost-effectiveness. The initiative also established B-Buddy Bang Saray, a network of volunteers aiding those unfamiliar with digital systems to promote inclusivity.
In his closing remarks, Dr Supot highlighted these projects as examples of the DGA’s role in advancing Thailand towards becoming a Smart Nation, enhancing citizens’ quality of life. These efforts have consistently improved Thailand’s digital government development rankings assessed by the United Nations.
Government agencies in New Zealand are entering the digital age by launching their new Government Electronic Tender Service (GETS) and All-of-Government (AoG) collaborative contracts dashboards. These innovative digital tools are set to revolutionise procurement practices, offering unprecedented insights into spending patterns and benchmarking features.
The GETS and AoG dashboards have been developed with a digital-first approach to provide agencies with comprehensive insights into their procurement practices. One of the key goals of these dashboards is to enhance transparency in government spending, allowing agencies to make more informed decisions and facilitating strategic, intelligence-led procurement processes.
The GETS and AoG dashboards leverage cutting-edge data visualisation technologies to present complex procurement data in a clear and accessible manner. Interactive charts, graphs, and visual representations make it easier for users to gain insights from the data, promoting better decision-making.
Early agency feedback has been positive, with many highlighting the value of the benchmarking features. These features enable agencies to compare their procurement practices with others, fostering healthy competition and sharing best practices. This benchmarking capability not only improves transparency but also helps agencies identify areas for improvement.
One of the core objectives of this initiative is to make the dashboards even more user-friendly and comprehensive in future versions. The development team aims to streamline the user experience, making it easier for agencies to access and interpret the available data. Additionally, the dashboards will be expanded to include data from all participating agencies, further enhancing procurement data transparency.
In the pursuit of transparency and efficiency, government agencies actively seek input from users and stakeholders. They have invited agencies and individuals to share their suggestions and ideas on improving the dashboards. This collaborative approach ensures that the tools meet the needs of agencies and the broader public, fostering a culture of continuous improvement.
Moreover, the commitment to making the dashboards more user-friendly reflects a user-centric design approach. Agencies will likely collaborate with UX designers to ensure the dashboards are intuitive and tailored to users’ needs, ultimately improving the overall user experience.
Implementing a user-friendly UX not only makes a profound statement about the New Zealand government’s commitment to improving public services but also acknowledges that the success of these dashboards hinges on their adoption and utilisation by a diverse user base. In government procurement, where various stakeholders, including procurement officers, administrators and policymakers, interact with these tools, catering to their varied needs is paramount.
The service will also employ artificial intelligence (AI) to provide intelligent insights. AI algorithms can analyse historical spending data and surface trends, helping agencies identify cost-saving opportunities and optimise procurement strategies.
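A hypothetical sketch of such trend analysis is shown below, using pandas on made-up procurement figures; it does not reflect the actual GETS or AoG dashboard implementation.

```python
# Hypothetical illustration (not the GETS/AoG implementation): flagging categories
# whose latest spending sits well above their average, using made-up data.
import pandas as pd

spend = pd.DataFrame({
    "quarter":  ["Q1", "Q2", "Q3", "Q4"] * 2,
    "category": ["IT services"] * 4 + ["Office supplies"] * 4,
    "amount":   [120, 135, 160, 190, 40, 38, 41, 39],  # NZ$000s, illustrative
})

# Simple trend signal: compare the latest quarter with the category's average spend.
summary = spend.groupby("category")["amount"].agg(["mean", "last"])
summary["vs_average_%"] = (summary["last"] / summary["mean"] - 1) * 100
print(summary.round(1))
```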
The GETS and AoG dashboards represent a significant milestone as government agencies continue their digital transformation journey. These tools provide a glimpse into the future of procurement practices, where data-driven decisions and transparency take centre stage. With ongoing efforts to improve user-friendliness and expand data coverage, these dashboards will play a pivotal role in shaping the procurement landscape for years to come.
In the era of digital government, the commitment to harnessing technology for improved governance and public service is evident. As agencies embrace innovative digital tools, the government sets a precedent for other sectors, fostering a culture of digital innovation and data-driven decision-making for the New Zealand government.