Tuning Agile Delivery for Customer and Employee Success

Ashish Joseph

What is Agile?

Agile has become very popular in the software development industry for making delivery more efficient and effective. A common misconception is that Agile is a framework or a process that follows a methodology for software development. In reality, Agile is a set of values and principles, a collection of beliefs that teams can use for decision making and for optimizing project deliveries. It is customer-centric and flexible, helping teams adapt accordingly. It doesn't make decisions for the team. Instead, it gives teams a foundation for making decisions that can result in a stellar execution of the project.

According to the Agile Manifesto, teams can deliver better by prioritizing the first item in each of the following pairs over the second.

  • Individuals and Interactions over Processes and Tools
  • Working Software over Comprehensive Documentation
  • Customer Collaboration over Contract Negotiation
  • Responding to Change over Following a Plan

With respect to software development, Agile is an iterative approach to project management that helps teams deliver results with measurable customer value. The approach is designed to be faster and to ensure the quality of delivery, aided by periodic customer feedback. Agile aims to break down the requirement into smaller portions whose results can be continuously evaluated, with a natural mechanism to respond to changes quickly.


Why Agile?

The world is changing, and businesses must be ready to adapt to how market demands change over time. Of the Fortune 500 companies from 1955, 88% have since perished. Nearly half of the S&P 500 companies are forecasted to be replaced every ten years. The only way for organizations to survive is to innovate continuously and understand the pulse of the market every step of the way. An innovative mindset helps organizations react to changes and discover new opportunities the market can offer them from time to time.

Agile helps organizations execute projects in an ever-changing environment. The approach helps break down modules for continuous customer evaluation and implement changes swiftly.

The traditional approach to software project management uses the waterfall model, where we Plan, Build, Test, Review and Deploy. This approach forces teams back into the plan phase whenever requirements deviate from the market. When teams choose Agile, they can respond to changes in the marketplace and implement customer feedback without going off plan, because Agile plans are designed to accommodate continuous feedback and the changes it brings. Organizations must develop the ability to adapt and respond quickly to new and changing market demands. This foundation is imperative for modern software development and delivery.

Is Agile the Right Fit for my Customer?

People who advocate Agile development claim that Agile projects succeed more often than waterfall delivery models, but this claim has not been validated by statistics. A paper titled “How Agile your Project should be?” by Dr. Kevin Thompson of Kevin Thompson Consulting provides a mathematical perspective on both Agile and Waterfall project management. In it, both approaches were applied to the same requirements and subjected to the same unanticipated variables. The paper focused on the statistical evidence supporting the validity of each option in order to evaluate fit.

While assessing the right approach, the following questions need to be asked:

  • Are the customer requirements for the project complete, clear and stable?
  • Can the project effort estimation be easily predicted?
  • Has a project with similar requirements been executed before?

If the answer to all the above questions is Yes, then Agile is not the approach to follow.

The Agile approach provides a better return on investment and risk reduction when there is high uncertainty of different variables in the project. When the uncertainty is low, waterfall projects tend to be more cost effective than agile projects.

Optimizing Agile Customer Centricity

Customer centricity should be the foundation of all project deliveries. It helps businesses align themselves to the customer's mission and vision with respect to the project at hand. When considering an Agile approach to a project in a dynamic and changing environment, the following principles can help organizations align themselves better with their customer's goals.

  • Prioritizing Customer Satisfaction through timely and continuous delivery of requirements.
  • Openness to changing requirements, regardless of the development phase, to enable customers to harness the change for their competitive advantage in the market.
  • Frequent delivery of modules with a preference towards shorter timelines.
  • Continuous collaboration between management and developers to understand the functional and non-functional requirements better.
  • Measuring progress through the number of working modules delivered.
  • Improving velocity and agility in delivery by concentrating on technical excellence and good design.
  • Periodic retrospection at the end of each sprint to improve delivery effectiveness and efficiency.
  • Trusting and supporting motivated individuals to lead projects on their own and allowing them to experiment.

Since Agile is a collection of principles and values, its real utility lies in giving teams a common foundation to make good decisions with actionable intelligence to deliver measurable value to their customers.

Agile Empowered Employee Success

A truly Agile team makes their decisions based on Agile values and principles. The values and principles have enough flexibility to allow teams to develop software in the ways that work best for their market situation while providing enough direction to help them to continually move towards their full potential. The team and employee empowerment through these values and principles aid in the overall performance.

Agile improves not only the team but also the environment in which it operates, by helping employees stay compliant with audit and governance requirements. It reduces the overall project cost for dynamic requirements and focuses on technical excellence along with an optimized delivery process. The 14th Annual State of Agile Report 2020, published by StateofAgile.com, surveyed 40,000 Agile executives to get insights into the application of Agile across different areas of enterprises. The report examined the Agile techniques that contributed most towards the employee success of the organization. The following are some of the most preferred Agile techniques that helped enhance employee and team performance.


All the above Agile techniques help teams and individuals to introspect their actions and understand areas of improvement in real time with periodic qualitative and quantitative feedback. Each deliverable from multiple cross functional teams can be monitored, tracked and assessed under a single roof. All these techniques collectively bring together an enhanced form of delivery and empower each team to realize their full potential.
Above all, Agile techniques help teams feel the pulse of the customer every step of the way. The openness to change, regardless of the phase, helps them map all the requirements, leading to overall customer satisfaction coupled with employee success.

Top 5 Agile Approaches


A Truly Agile Organization

The majority of Agile adoption has been concentrated in development, IT, and Operations. However, organizations should strive for effective alignment and coordination across all departments. Organizations today are aiming to expand agility into areas beyond building, deploying, and maintaining software. At the end of the day, Agile is not about the framework. It is all about the Agile values and principles the organization believes in for achieving its mission and vision in the long run.

About the Author –

Ashish Joseph is a Lead Consultant at GAVS working for a healthcare client in the Product Management space. His areas of expertise lie in branding and outbound product management. He runs a series called #BizPective on LinkedIn and Instagram focusing on contemporary business trends from a different perspective. Outside work, he is very passionate about basketball, music, and food.

Patient 360 & Journey Mapping using Graph Technology

Srinivasan Sundararajan

360 Degree View of Patient

With rising demands for quality and cost-effective patient care, healthcare providers are focusing on data-driven diagnostics while continuing to utilize their hard-earned human intelligence. In other words, data-driven healthcare is augmenting human intelligence.

360 Degree View of Patient, as it is called, plays a major role in delivering the required information to the providers. It is a unified view of all the available information about a patient. It could include but is not limited to the following information:

  • Appointments made by the patients
  • Interaction with different doctors
  • Medications prescribed by the doctors
  • Patient’s relationship to other patients within the ecosystem, especially to identify family history related risks
  • Patient’s admission to hospitals or other healthcare facilities
  • Discharge and ongoing care
  • Patient personal wellness activities
  • Patient billing and insurance information
  • Linkages to the same patient in multiple disparate databases within the same hospital
  • Information about a patient’s involvement in various seminars, medical-related conferences, and other events

Limitations of Current Methods

As is evident in most hospitals, this information is usually scattered across multiple data sources/databases. Hospitals typically create a data warehouse by consolidating information from multiple sources to build a unified database. However, this approach relies on relational databases, which depend on joining tables across entities to arrive at a complete picture. An RDBMS is not meant to handle relationships that extend to multiple hops and require drilling down many levels.

Role of Graph Technology & Graph Databases

A graph database is a collection of nodes (or entities typically) and edges (or relationships). A node represents an entity (for example, a person or an organization) and an edge represents a relationship between the two nodes that it connects (for example, friends). Both nodes and edges may have properties associated with them.

While there are multiple graph databases in the market today, like Neo4j, JanusGraph, and TigerGraph, the following technical discussion pertains to the graph database that is part of SQL Server 2019. The main advantage of this approach is that it utilizes the best RDBMS features wherever applicable, while keeping the graph database options for complex relationships like the 360 degree view of patients, making it a true polyglot persistence architecture.

As mentioned above, in SQL Server 2019 a graph database is a collection of node tables and edge tables. A node table represents an entity in a graph schema. An edge table represents a relationship in a graph. Edges are always directed and connect two nodes. An edge table enables users to model many-to-many relationships in the graph. Normal SQL Insert statements are used to create records into both node and edge tables.
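As a minimal sketch of how this looks in practice, the following T-SQL creates hypothetical Patient and Doctor node tables and a ConsultedBy edge table, and then inserts one sample relationship (the entity and column names are illustrative, not part of any actual Rhodium schema):

    CREATE TABLE Patient (PatientID INT PRIMARY KEY, PatientName NVARCHAR(100)) AS NODE;
    CREATE TABLE Doctor  (DoctorID INT PRIMARY KEY, DoctorName NVARCHAR(100)) AS NODE;
    CREATE TABLE ConsultedBy (ConsultationDate DATE) AS EDGE;

    -- Nodes are inserted like normal rows
    INSERT INTO Patient (PatientID, PatientName) VALUES (1, 'John Doe');
    INSERT INTO Doctor  (DoctorID, DoctorName)   VALUES (10, 'Dr. Alice Smith');

    -- An edge connects the implicit $node_id values of the two nodes it links
    INSERT INTO ConsultedBy ($from_id, $to_id, ConsultationDate)
    VALUES ((SELECT $node_id FROM Patient WHERE PatientID = 1),
            (SELECT $node_id FROM Doctor  WHERE DoctorID = 10),
            '2020-09-15');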

While the node tables and edge tables represent the storage of graph data, there are specialized commands, which act as extensions of T-SQL, that help traverse between the nodes to get the full details, such as patient 360 degree data.

MATCH statement

The MATCH statement links two node tables through an edge table, so that complex relationships can be retrieved. An example is shown below.

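As an illustrative sketch using the hypothetical Patient, Doctor, and ConsultedBy tables defined earlier, a MATCH query that lists each patient along with the doctors they have consulted could look like this:

    SELECT p.PatientName, d.DoctorName, c.ConsultationDate
    FROM Patient AS p, ConsultedBy AS c, Doctor AS d
    WHERE MATCH(p-(c)->d)     -- traverse Patient -(ConsultedBy)-> Doctor
    ORDER BY p.PatientName;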

SHORTEST_PATH statement

It finds the relationship path between two node tables by performing multiple hops recursively. It is one of the most useful statements for assembling the 360 degree view of a patient.
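As an illustrative sketch only, assuming a hypothetical RelatedTo edge table between patients, SHORTEST_PATH can walk an arbitrary number of hops and aggregate the nodes encountered along the path:

    SELECT p1.PatientName,
           STRING_AGG(p2.PatientName, ' -> ') WITHIN GROUP (GRAPH PATH) AS RelationshipPath
    FROM Patient AS p1,
         RelatedTo FOR PATH AS r,
         Patient  FOR PATH AS p2
    WHERE MATCH(SHORTEST_PATH(p1(-(r)->p2)+))   -- one or more hops starting from p1
      AND p1.PatientID = 1;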

There are more options and statements as part of graph processing. Together, they help identify and retrieve complex relationships across business entities.

Graph Processing in Rhodium

As mentioned in my earlier articles (Healthcare Data Sharing & Zero Knowledge Proofs in Healthcare Data Sharing), the GAVS Rhodium framework enables Patient Data Management and Patient Data Sharing, and graph databases play a major part in providing the patient 360 view as well as provider (doctor) credentialing data. The screenshots below show samples from the reference implementation.


Patient Journey Mapping

Typically, a patient’s interaction with the healthcare service provider goes through a cycle of events. The goal of the provider organization is to make this journey smooth and provide the best care to the patients. It should be noted that not all patients go through this journey in a sequential manner; some may start the journey at a particular point and may skip some intermediate journey points. Proper data collection of the events behind patient journey mapping will also help with the future prediction of events, which will ultimately help with patient care.

Patient 360 data collection plays a major role in building the patient journey mapping. While there could be multiple definitions, the following is one of the examples of mapping between patient 360-degree events and patient journey mapping.


The diagram below shows an example of patient journey mapping information.


Understanding patients better is essential for improving patient outcomes. The 360 degree view of patients and patient journey mapping are key components for providing such insights. While traditional technologies struggle to provide those links, graph databases and graph processing will play a major role in patient data management.

About the Author –

Srini is the Technology Advisor for GAVS. He is currently focused on Data Management Solutions for new-age enterprises using the combination of Multi Modal databases, Blockchain and Data Mining. The solutions aim at data sharing within enterprises as well as with external stakeholders.

IAST: A New Approach to Finding Security Vulnerabilities

Roberto Velasco
CEO, Hdiv Security

One of the most prevalent misconceptions about cybersecurity, especially in the mainstream media and also among our clients, is that to conduct a successful attack against an IT system it is necessary to ‘investigate’ and find a new defect in the target’s system.

However, for most security incidents involving internet applications, it is enough to simply exploit existing and known programming errors.

For instance, the dramatic Equifax breach could have been prevented by following basic software security best-practices, such as patching the system to prevent known vulnerabilities. That was, in fact, one of the main takeaways from the forensic investigation led by the US federal government.

One of the most important ways to reduce security risks is to ensure that all known programming errors are corrected before the system is exposed to internet traffic. Research bodies such as the US NIST found that correcting security bugs early on is orders of magnitude cheaper than doing so when the development has been completed.

When composing a text in a text editor, the spelling and grammar checker highlights the mistakes in the text. Similarly, there are security tools known as ASTs (Application Security Testing) that find programming errors that introduce security weaknesses. ASTs report the file and line where the vulnerability is located, in the same way that a text editor reports the page and the line that contain a typo.

In other words, these tools allow developers to build software that is largely free of security-related programming errors, resulting in more secure applications.

Just like it is almost impossible to catch all errors in a long piece of text, most software contains many serious security vulnerabilities. The fact that some teams do not use any automated help at all makes these security weaknesses all the more prevalent and easy to exploit.

Let’s take a look at the different types of security issue detection tools, also known as ASTs or vulnerability assessment tools, available in the market.

The Traditional Approach

Two mature technologies capture most of the market: static code analysis (SAST) and web scanners (dynamic analysis or DAST). Each of these two families of tools is focused on a different execution environment.

The SAST static analysis, also known as white-box analysis because the tool has access to the source code of the application, scans the source code looking for known patterns that indicate insecure programming that could lead to a vulnerability.

The DAST dynamic analysis replicates the view of an attacker. At this point, the tool executes hundreds or thousands of queries against the application designed to replicate the activity of an attacker to find security vulnerabilities. This is a black-box analysis because the point of view is purely external, with no knowledge of the application’s internal architecture.

The level of detail provided by the two types of tools is different. SAST tools provide the file and line where the vulnerability is located, but no URL, while DAST tools provide the external URL but no details on the location of the problem within the code base of the application. Some teams use both tools to improve visibility, but this requires long and complex triaging to manage the vulnerabilities.

The Interactive AST Approach

The Interactive Application Security Testing (IAST) tools combine the static approach and the dynamic approach. They have access to the internal structure of the application, and to the way it behaves with actual traffic. This privileged point of view is ideal to conduct security analysis.

From an architecture point of view, the IAST tools become part of the infrastructure that hosts the web applications, because an IAST runs together with the application server. This approach is called instrumentation, and it is implemented by a component known as an agent. Other platforms such as Application Performance Monitoring tools (APMs) share this proven approach.

Once the agent has been installed, it incorporates automatic security sensors at the critical execution points of the application. These sensors monitor the dataflow between requests and responses, the external components that the application includes, and data operations such as database access. This broad-spectrum coverage gives much better visibility than what SAST and DAST tools rely on.

In terms of specific results, we can look at two important metrics – how many types of vulnerabilities the tool finds, and how many of the identified vulnerabilities are false positives. Well, the best DAST is able to find only 18% of the existing vulnerabilities on a test application. And even worse, around 50% of the vulnerabilities reported by the best SAST static analysis tool are not true problems!


Source: Hdiv Security via OWASP Benchmark public result data

The IAST approach provides these tangible benefits:

  1. Complete coverage, because the entire application is reviewed, both the custom code and the external code, such as open-source components and legacy dependencies.
  2. Flexibility, because it can be used in all environments; development, quality assurance (QA), and production.
  3. High accuracy, because the combination of static and dynamic points of view allows us to find more vulnerabilities with no false positives.
  4. Complete vulnerability information, including the static aspects (source code details) and dynamic aspects (execution details).
  5. Reduction of the duration of the security verification phase, so that the time-to-market of the secure applications is shorter.
  6. Compatibility with agile development methodologies, such as DevSecOps, because it can be easily automated, and it reduces the manual verification activities.

An IAST tool can add a lot of value to the security tooling of any organization concerned with the security of its software.

In the same way that everyone uses an automated spell checker to find typos in a document, we believe that any team would benefit from an automated validation of the security of an application.

However, ASTs do not represent a security utopia, since they can only detect security problems that follow a common pattern.

About the Author –

Roberto Velasco is the CEO of Hdiv Security. He has been involved with the IT and security industry for the past 16 years and is experienced in software development, software architecture and application security across different sectors such as banking, government and energy. Prior to founding Hdiv Security, Roberto worked for 8 years as a software architect and co-founded ARIMA, a company specialized in software architecture. He regularly speaks at Software Architecture and cybersecurity conferences such as Spring I/O and APWG.eu.

Post-Pandemic Recruiting Practices

Prabhakar Kumar Mandal

The COVID pandemic has transformed business as we know it. This includes recruitment. From pre-hire activities to post-hire ones, no hiring practice will be exempt from the change we’re witnessing. To maintain a feasible talent acquisition program now and in the coming years, organizations face a persistent need to reimagine the way they do things at every step of the hiring funnel.


In my view, the following are the key aspects to look at:

1. Transforming Physical Workspaces

Having employees be physically present at the workplace is fraught with challenges now. We envision many companies transitioning to a fully or partially remote workforce to save on costs and give employees more flexibility.

This means companies that maintain a physical headquarters will be paying much closer attention to the purpose those spaces really serve, and so will the candidates. The emphasis now will be on spaces of necessity: meeting areas, spaces for collaborative work, and comfortable, individual spaces for essential workers who need to be onsite.

2. Traveling for interviews will become obsolete

It’s going to be a while before non-essential travel assumes its pre-corona importance. In a study of traveler attitudes spanning the U.S., Canada, the U.K., and Australia, the portion of people who said they intended to restrict their travel over the next year increased from 24% in the first half of March to 40% in the second half of March.

Candidates will be less willing than they once were to jump on a plane for an in-person interview when a video conference is a viable alternative. 

3. Demand for workers with cross-trained skills will increase

Skills-based hiring has been on the rise and will keep increasing as businesses strive to do more with a smaller headcount. We anticipate organizations will increasingly seek out candidates who can wear multiple hats.

Additionally, as machines take on more jobs that were once reserved for people, we will see even greater demand for uniquely human skills like problem solving and creative thinking. Ravi Kumar, president of Infosys Ltd., summed it up perfectly in an interview with Forbes: “machines will handle problem-solving and humans will focus on problem finding.” 

4. Recruiting events will look a lot different 

It’s unclear when large-scale, in-person gatherings like job fairs will be able to resume, but it will likely be a while. We will likely see most events move to a virtual model, which will not only reduce risk but significantly cut costs for those involved. This may open new opportunities to allocate that budget to improve some of the other pertinent recruiting practices on this list. 


5. Time to hire may change dramatically

The current approach is likely to change. Consider that most people who took a new job last year were not searching for one: somebody came and got them. Businesses seek to fill their recruiting funnel with as many candidates as possible, especially ‘passive candidates’ who are not looking to move. Frequently, employers advertise jobs that do not exist, hoping to find people who might be useful later or in a different context. We keep emphasizing the importance of minding our recruiting metrics, which can help us not only hire more competently but also identify interruptions in our recruiting process.

Are there steps in the hiring process, like screening or onboarding, that can be accelerated to balance things out? Are there certain recruitment channels that typically yield faster hires than others that can be prioritized? These are important questions to ask as you analyze the pandemic’s impacts to your hiring funnel. 

6. How AI can be leveraged to screen candidates

AI is helping candidates get matched with the right companies. There are over 100 parameters to assess the candidates. This reduces wastage of time, money, and resources. The candidates are marked on their core strengths. This helps the recruitment manager to place them in the apt role.

The current situation presents the perfect opportunity for companies to adopt new tools. Organizations can reassess their recruitment processes and strategies through HR-aligned technology.

Post-pandemic hiring strategy

This pertains more to the industries most impacted by the pandemic, like businesses in the hospitality sector, outdoor dining, and travel to name a few. Many of the applicants in this domain have chosen to make the shift towards more promising or booming businesses.

However, once the pandemic blows over and restrictions are lifted, you can expect suffering sectors to come back with major recruitment changes and fierce competition over top talent.

Companies that take this time to act by cultivating relationships and connections with promising talent in their sphere will have the advantage of gathering valuable data from probable candidates.

About the Author –

Prabhakar is a recruiter by profession and a cricketer by passion. His focus is on hiring for the infra vertical. He hails from a small town in Bihar and was brought up in Pondicherry. Prabhakar has represented Pondicherry in U-19 cricket (National School Games). In his free time, he enjoys reading, working on his health and fitness, and spending time with his family and friends.

Zero Knowledge Proofs in Healthcare Data Sharing

Srinivasan Sundararajan

Recap of Healthcare Data Sharing

In my previous article (https://www.gavstech.com/healthcare-data-sharing/), I had elaborated on the challenges of Patient Master Data Management, Patient 360, and associated Patient Data Sharing. I had also outlined how our Rhodium framework is positioned to address the challenges of Patient Data Management and data sharing using a combination of multi-modal databases and Blockchain.

In this context, I have highlighted our maturity levels and the journey of Patient Data Sharing as follows:

  • Single Hospital
  • Between Hospitals part of HIE (Health Information Exchange)
  • Between Hospitals and Patients
  • Between Hospitals, Patients, and Other External Stakeholders

In each of the stages of the journey, I have highlighted various use cases. For example, in the third level of health data sharing between Hospitals and Patients, the use cases of consent management involving patients as well as monetization of personal data by patients themselves are mentioned.

In the fourth level of the journey, you must’ve read about the use case “Zero Knowledge Proofs”. In this article, I will elaborate on:

  • What is Zero Knowledge Proof (ZKP)?
  • What is its role and importance in Healthcare Data Sharing?
  • How does the Blockchain-powered GAVS Rhodium Platform help address the needs of ZKP?

Introduction to Zero Knowledge Proof

As the name suggests, Zero Knowledge Proof is about proving something without revealing the data behind that proof. Each transaction has a ‘verifier’ and a ‘prover’. In a transaction using ZKPs, the prover attempts to prove something to the verifier without revealing any other details to the verifier.

Zero Knowledge Proofs in Healthcare 

In today’s healthcare industry, a lot of time-consuming due diligence is done based on a lack of trust.

  • Insurance companies are always wary of fraudulent claims (which is anyhow a major issue), hence a lot of documentation and details are obtained and analyzed.
  • Hospitals, at the time of patient admission, need to know more about the patient, their insurance status, payment options, etc., hence they do detailed checks.
  • Pharmacists may have to verify that the Patient is indeed advised to take the medicines and give the same to the patients.
  • Patients most times also want to make sure that the diagnosis and treatment given to them are indeed proper and no wrong diagnosis is done.
  • Patients also want to ensure that doctors have legitimate licenses with no history of malpractice or any other wrongdoing.

In a healthcare scenario, any of the parties, i.e. patient, hospital, pharmacy, or insurance company, can take on the role of a verifier; typically, patients and sometimes hospitals are the provers.

While ZKP can be applied to any of the transactions involving the above parties, current research in the industry is mostly focused on patient privacy rights, and ZKP initiatives target how much (or how little) information a patient (prover) must share with a verifier before getting the required service based on the assertion of that proof.

Blockchain & Zero Knowledge Proof

While I am not getting into the fundamentals of Blockchain, readers should understand that one of the fundamental backbones of Blockchain is trust within the context of pseudo-anonymity. In other words, some of the earlier uses of Blockchain, like cryptocurrency, aim to promote trust between unknown individuals without revealing any of their personal identities, yet allowing participation in a transaction.

Some of the characteristics of a Blockchain transaction that make it conducive for Zero Knowledge Proofs are as follows:

  • Each transaction is initiated in the form of a smart contract.
  • Smart contract instance (i.e. the particular invocation of that smart contract) has an owner i.e. the public key of the account holder who creates the same, for example, a patient’s medical record can be created and owned by the patient themselves.
  • The other party can trust that transaction as long as the other party knows the public key of the initiator.
  • Some of the important aspects of an approval life cycle like validation, approval, rejection, can be delegated to other stakeholders by delegating that task to the respective public key of that stakeholder.
  • For example, if a doctor needs to approve a medical condition of a patient, the same can be delegated to the doctor and only that particular doctor can approve it.
  • The anonymity of a person can be maintained, as everyone will see only the public key and other details can be hidden.
  • Some of the approval documents can be transferred using off-chain means (outside of the blockchain), such that participants of the blockchain will only see the proof of a claim but not the details behind it.
  • Further extending the data transfer with encryption of the sender’s private/public keys can lead to more advanced use cases.

Role of Blockchain Consortium

While Zero Knowledge Proofs can be implemented in any Blockchain platform, including totally uncontrolled public blockchain platforms, their usage is best realized in private Blockchain consortiums. Here the identity of all participants is known and each participant trusts the others, but the due diligence that would otherwise require the actual submission of proof is avoided.

Organizations that are part of similar domains and business processes form a Blockchain Network to get business benefits of their own processes. Such a Controlled Network among the known and identified organizations is known as a Consortium Blockchain.

Illustrated view of a Consortium Blockchain involving multiple organizations whose access rights differ. Each member controls their own access to the Blockchain Network with cryptographic keys.

Members typically interact with the Blockchain Network by deploying Smart Contracts (i.e. Creating) as well as accessing the existing contracts.

Current Industry Research on Zero Knowledge Proof

Zero Knowledge Proof is a new but powerful concept in building trust-based networks. While a basic Blockchain platform can help bring about the concept in a trust-based manner, a lot of research is being done to come up with truly algorithmic zero knowledge proofs.

A zk-SNARK (“zero-knowledge succinct non-interactive argument of knowledge”) utilizes a concept known as a “zero-knowledge proof”. Developers have already started integrating zk-SNARKs into Ethereum Blockchain platform. Zether, which was built by a group of academics and financial technology researchers including Dan Boneh from Stanford University, uses zero-knowledge proofs.

ZKP In GAVS Rhodium

As mentioned in my previous article about Patient Data Sharing, Rhodium is a futuristic framework that treats Patient Data Sharing as a journey across multiple stages, and at the advanced maturity levels Zero Knowledge Proofs definitely find a place. Healthcare organizations can start experimenting and innovating on this front.

Rhodium Patient Data Sharing Journey


The healthcare industry today is affected by fraud and lack of trust on one side, and growing patient privacy concerns on the other. In this context, the introduction of Zero Knowledge Proofs as part of healthcare transactions will help the industry optimize itself and move towards seamless operations.

About the Author –

Srini is the Technology Advisor for GAVS. He is currently focused on Data Management Solutions for new-age enterprises using the combination of Multi Modal databases, Blockchain, and Data Mining. The solutions aim at data sharing within enterprises as well as with external stakeholders.

Healthcare Data Sharing

Srinivasan Sundararajan

Patient Care Redefined

The fight against the novel coronavirus has witnessed transformational changes in the way patient care is defined and managed. Proliferation of telemedicine has enabled consultations across geographies. In the current scenario, access to patients’ medical records has also assumed more importance.

The journey towards a solution also taught us that research on patient data is equally important. The more sample data available about infected patients, the better the vaccine/remedy. However, the growing concern about the privacy of patient data cannot be ignored. Moreover, patients who provide their data for medical research should also benefit monetarily for their contributions.

The above facts basically point to the need for being able to share vital healthcare data efficiently so that patient care is improved, and more lives are saved.

The healthcare industry needs a data-sharing framework that shares patient data while also providing much-needed controls on data ownership for various stakeholders, including the patients.

Types of Healthcare Data

  • PHR (Personal Health Record): An electronic record of health-related information on an individual that conforms to nationally recognized interoperability standards and that can be drawn from multiple sources while being managed, shared, and controlled by the individual.
  • EMR (Electronic Medical Record): Health-related information on an individual that can be created, gathered, managed, and consulted by authorized clinicians and staff within one healthcare organization. 
  • EHR (Electronic Health Record): Health-related information on an individual that conforms to nationally recognized interoperability standards and that can be created, managed and consulted by authorized clinicians and staff across more than one healthcare organization. 

In the context of large multi-specialty hospitals, EMR could also be specific to one specialist department and EHR could be the combination of information from various specialist departments in a single unified record.

Together these 3 forms of healthcare data provide a comprehensive view of a patient (patient 360), thus resulting in quicker diagnoses and personalized quality care.

Current Challenges in Sharing Healthcare Data

  • Lack of unique identity for patients prevents a single version of truth. Though there are government-issued IDs like SSN, their usage is not consistent across systems.
  • High cost and error-prone integration options with provider controlled EMR/EHR systems. While there is standardization with respect to healthcare interoperability API specifications, the effort needed for integration is high.
  • Conflict of interest in ensuring patient privacy and data integrity, while allowing data sharing. Digital ethics dictate that patient consent management take precedence while sharing their data.
  • Monetary benefits of medical research on patient data are not passed on to patients. As mentioned earlier, in today’s context analyzing existing patient information is critical to finding a cure for diseases, but there are no incentives for these patients.
  • Data stewardship, consent management, compliance needs like HIPAA, GDPR. Let’s assume a hospital specializing in heart-related issues shares a patient record with a hospital that specializes in eye care. How do we decide which portions of the patient information is owned by which hospital and how the governance is managed?
  • Lack of real-time information, contributing to data quality issues and causing incorrect diagnoses.

The above list is not comprehensive but points to some of the issues that are plaguing the current healthcare data-sharing initiatives.

Blockchain for Healthcare Data Sharing

Some of the basic attributes of blockchain are mentioned below:

  • Blockchain is a distributed database, whereby each node of the database can be owned by a different stakeholder (say hospital departments) and yet all updates to the database eventually converge resulting in a distributed single version of truth.
  • Blockchain databases utilize a cryptography-based transaction processing mechanism, such that each object stored inside the database (say a patient record) can be distinctly owned by a public/private key pair and the ownership rights carry throughout the life cycle of the object (say from patient admission to discharge).
  • Blockchain transactions are carried out using smart contracts which basically attach the business rules to the underlying data, ensuring that the data is always compliant with the underlying business rules, making it even more reliable than the data available in traditional database systems.

These underlying properties of Blockchain make it a viable technology platform for healthcare data sharing, as well as to ensure data stewardship and patient privacy rights.

GAVS Rhodium Framework for Healthcare Data Sharing

GAVS has developed a framework – ‘Rhodium’, for healthcare data sharing.

This framework combines the best features of multi-modal databases (relational, NoSQL, graph) with the data-sharing capabilities facilitated by Blockchain, to come up with a unified framework for healthcare data sharing.

The following are the high-level components (in a healthcare context) of the Rhodium framework. As you can see, each of the individual components of Rhodium plays a role in healthcare information exchange at various levels.

GAVS’ Rhodium Framework for Healthcare

GAVS has also defined a maturity model for healthcare organizations for utilizing the framework towards healthcare data sharing. This model defines 4 stages of healthcare data sharing:

  • Within a Hospital 
  • Across Hospitals
  • Between Hospitals & Patients
  • Between Hospitals, Patients & Other Agencies

The below progression diagram illustrates how the framework can be extended for various stages of the life cycle, and typical use cases that are realized in each phase. Detailed explanations of various components of the Rhodium framework, and how it realizes use cases mentioned in the different stages will be covered in subsequent articles in this space.

Rhodium Patient Data Sharing Journey

Benefits of the GAVS Rhodium Framework for Healthcare Data Sharing

The following are the general foreseeable benefits of using the Rhodium framework for healthcare data sharing.


Healthcare Industry Trends with Respect to Data Sharing

The following are some of the trends we are seeing in Healthcare Data Sharing:

  • Interoperability will drive privacy and security improvements
  • New privacy regulations will continue to come up, in addition to HIPAA
  • The ethical and legal use of AI will empower healthcare data security and privacy
  • The rest of 2020 and 2021 will be defined by the duality of data security and data integration, and providers’ ability to execute on these priorities. That, in turn, will, in many ways, determine their effectiveness
  • In addition to industry regulations like HIPAA, national data privacy standards including Europe’s GDPR, California’s Consumer Privacy Act, and New York’s SHIELD Act will further increase the impetus for providers to prioritize privacy as a critical component of quality patient care

The below documentation from the HIMSS site talks about maturity levels with respect to healthcare interoperability, which is addressed by the Rhodium framework.

Source: https://www.himss.org/what-interoperability

This framework is in its early stages of experimentation and is a prototype of how a Blockchain + Multi-Modal Database powered solution could be utilized for sharing healthcare data, that would be hugely beneficial to patients as well as healthcare providers.

About the Author –

Srini is the Technology Advisor for GAVS. He is currently focused on Data Management Solutions for new-age enterprises using the combination of Multi-Modal databases, Blockchain, and Data Mining. The solutions aim at data sharing within enterprises as well as with external stakeholders.

Observability versus Monitoring

Sri Chaganty

“Observability” has become a key trend in Service Reliability Engineering practice.  One of the recommendations from Gartner’s latest Market Guide for IT Infrastructure Monitoring Tools released in January 2020 says, “Contextualize data that ITIM tools collect from highly modular IT architectures by using AIOps to manage other sources, such as observability metrics from cloud-native monitoring tools.”

Like so many other terms in software engineering, ‘observability’ is a term borrowed from an older physical discipline: in this case, control systems engineering. Let me use the definition of observability from control theory in Wikipedia: “observability is a measure of how well internal states of a system can be inferred from knowledge of its external outputs.”

Observability is gaining attention in the software world because of its effectiveness at enabling engineers to deliver excellent customer experiences with software despite the complexity of the modern digital enterprise.

When we blew up the monolith into many services, we lost the ability to step through our code with a debugger: it now hops the network.  Monitoring tools are still coming to grips with this seismic shift.

How is observability different from monitoring?

Monitoring requires you to know what you care about before you know you care about it. Observability allows you to understand your entire system and how it fits together, and then use that information to discover what specifically you should care about when it’s most important.

Monitoring requires you to already know what normal is. Observability allows discovery of different types of ‘normal’ by looking at how the system behaves, over time, in different circumstances.

Monitoring asks the same questions over and over again. Is the CPU usage under 80%? Is memory usage under 75%? Is the latency under 500ms? This is valuable information, but monitoring is useful only for known problems.

Observability, on the other hand, is about asking different questions almost all the time. You discover new things.


Metrics do not equal observability.

What Questions Can Observability Answer?

Below are sample questions that can be addressed by an effective observability solution:

  • Why is x broken?
  • What services does my service depend on — and what services are dependent on my service?
  • Why has performance degraded over the past quarter?
  • What changed? Why?
  • What logs should we look at right now?
  • What is system performance like for our most important customers?
  • What SLO should we set?
  • Are we out of SLO?
  • What did my service look like at time point x?
  • What was the relationship between my service and x at time point y?
  • What was the relationship of attributes across the system before we deployed? What’s it like now?
  • What is most likely contributing to latency right now? What is most likely not?
  • Are these performance optimizations on the critical path?

About the Author –

Sri is a Serial Entrepreneur with over 30 years’ experience delivering creative, client-centric, value-driven solutions for bootstrapped and venture-backed startups.

Autonomous Things

Machine learning service provider

Bindu Vijayan

“Autonomous things (AuT), or the Internet of autonomous things (IoAT), is an emerging term for the technological developments that are expected to bring computers into the physical environment as autonomous entities without human direction, freely moving and interacting with humans and other objects…”

To put it simply, Autonomous Things use AI and work unsupervised to complete specific tasks without humans. The devices are enhanced with AI, sensors, and analytical capabilities to be able to make informed and appropriate decisions. They work collaboratively with humans and the environment and provide superior performance. Today, AuT work across several environments with various levels of intelligence and capabilities. Popular examples of these devices are drones, vehicles, and smart home devices, among others. The components of autonomous things, the software and the AI hardware, are getting increasingly efficient. With improved technologies (and significantly reducing sensor costs), the variety of tasks and processes that can be automated is increasing, with the advantage of bringing in more data and feedback that can efficiently improve and enhance the benefits of autonomous things.

The technology is used in a wide variety of scenarios: as data collectors across a variety of terrains and environments, as delivery systems (by Amazon, for pizza deliveries, etc.), for medical supplies to remote areas, and so on. Robotics used in the supply chain has proven to reduce or remove the danger from warehouse tasks hitherto performed by humans, and it probably has the most economic potential currently, followed by autonomous vehicles. Drones are used to collect data across a wide variety of functions: surveillance, security, stock management, weather forecasting, obtaining air and oceanic data, agricultural planning, etc.

Some fascinating use cases:

Healthcare

Drones are proving to be more and more effective in several ways. They are currently used extensively for surveillance of disaster sites that have biological hazards. There is no better example of their relevance than the current times, when they can be used in epidemiology to track disease spread, and of course for further research and studies. Drones are facilitating on-demand healthcare by providing medicines to terrains that are difficult to access; Swoop Aero is one such company that delivers medicines via drones. Drones have brought healthcare into the most remote areas, with diagnosis and treatment made available. Remote areas of Africa have their regular medical supplies, vaccine supplies, lab sample collection, and emergency medical equipment made available through drones. They are also used in telementoring, for perioperative evaluation, and so on. Drones have been very efficient in accessing areas and providing necessary support where ground transport is not reliable, safe, or even possible. Today, most governments have drones on their national agenda under various sectors. The Delft University of Technology is developing an ambulance drone technology that can be used at disaster sites to increase rescue rates.

Retail

In a world where virtual assistants do the grocery shopping and replenish stocks, and cooking machines make food, shoppers who do need to go out want an easy, fast, and frictionless process. Today, customers do not want to wait in queues and go through conventional checkouts, and retailers know that they might be losing customers because of their checkout process. Autonomous shops like Amazon Go are giving customers that experience, where they can purchase without the inconvenience of checkout lines.

Providers of checkout-free shopping technology like Grabango use sensor vision and ML to hold a virtual shopping basket for every person in the store. The technology is reputed to process a multitude of simultaneous checkout transactions. “Grabango’s system uses high-quality sensor hardware and high-precision computer algorithms to acquire the location of every item in the store. This results in a real-time planogram covering the entire retail environment.” They say it results in increased sales and loyalty, streamlined operations and inventory management, and out-of-stock alerts.

Construction

Companies like Chicago-based Komatsu America Corp. have autonomous haulage systems that have optimized safety in the mining industry like never before. They “help you continue to meet your bottom line while achieving zero-harm”. Their focus has been on developing autonomous mining solutions, and they have been doing it for more than three decades now. Their FrontRunner AHS has moved more than two billion tons of surface material so far in driverless operations. Caterpillar will be deploying its fleet of autonomous trucks and blast drills at the Rio Tinto Koodaideri iron mine in Western Australia. The industry is thriving with autonomous and semi-autonomous equipment, and it is evident that it has brought improvements to productivity and increased profitability. At the Australian mine, “autonomous vehicles operated on average 700 hours longer and with 15 per cent lower unit costs”. Similarly, there are other companies like Intsite, a heavy machinery company, whose autonomous crane ‘AutoSite 100’ performs autonomous operation of heavy machinery.

Transportation

Most of us think Tesla when we think autonomous vehicles. Elon Musk’s dream of providing autonomous ride-sharing has Tesla working on getting one million robotaxis on the road this year; we will have to wait and see how that pans out. Though autonomous vehicles are the most popular, it might take a little more time before the technology finds answers to the regulatory challenges, definitely not an easy task. It gets quite overwhelming when we think of what we are expecting from autonomous vehicles: correct performance no matter the uncertainties on the roads and in the environment, as well as the ability to handle any sort of system failure on its own. AI is a very critical technology when we are talking real-time decision making. Such scenarios call for a strong computing platform in order to do the analysis at the edge for faster decision making. The new V2X, the 5G vehicle-to-everything standard, is expected to make autonomous vehicles mainstream because vital information would get transmitted to the vehicle as structured data. V2X is expected to have vehicles interfacing with anything, be it pedestrians, roadside infrastructure, cyclists, etc.

Today, technology is also looking at ‘vehicle platooning’: “Platoons decrease the distances between cars or trucks using electronic, and possibly mechanical, coupling. This capability would allow many cars or trucks to accelerate or brake simultaneously. This system also allows for a closer headway between vehicles by eliminating reacting distance needed for human reaction.” It has a group of self-driving vehicles moving at high speed but safely, as the trucks are in constant communication with each other and use this intelligence to make informed decisions about braking, speed, etc. Autonomous trucks and cars can automatically join or leave these platoons; this has the advantages of reduced congestion, fewer traffic collisions, better fuel economy, and shorter commutes during peak hours.

Conclusion

Studies show that autonomous things are fast moving towards a ‘swarm’, a bunch of intelligent devices in which multiple devices function together collaboratively, as against the previously isolated intelligent components/things. They are going to be intelligently networked among themselves and with the environment, and the wider that becomes within every industry, the more phenomenal the capabilities they will show. But let’s not forget there is a whole other side to AI; given how unpredictable things are in life, AI would sooner or later have to respond to things that it never saw in training… we still are the smarter ones…

References:

https://en.wikipedia.org/wiki/Autonomous_things

https://www.gartner.com/smarterwithgartner/gartner-top-10-strategic-technology-trends-for-2020/

https://worldline.com/en/home/blog/2020/march/from-automatic-to-autonomous-payments-can-things-pay.html

https://en.wikipedia.org/wiki/Self-driving_car

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6174005/

https://www.komatsuamerica.com/

https://en.wikipedia.org/wiki/Platoon_(automobile)

https://grabango.com/

Smart Spaces Tech Trends for 2020


Priyanka Pandey

These are unprecedented times. The world hadn’t witnessed such a disruption in recent history. It is times like these that test the strength and resilience of our community. While we’ve been advised to maintain social distancing to flatten the curve, we must keep the wheels of the economy rolling.

In my previous article, I covered the ‘People-Centric’ Tech Trends of the year, i.e., Hyperautomation, Multiexperience, Democratization, Human Augmentation, and Transparency and Traceability. All of those hold more importance now in the light of current events. Per Gartner, Smart Spaces enable people to interact with people-centric technologies. Hence, the next Tech Trends in the list are about creating ‘Smart Spaces’ around us.

Smart spaces, in simple words, are interactive physical environments decked out with technology that act as a bridge between humans and the digital world. The most common example of a smart space is a smart home, also called a connected home. Other environments that could be smart spaces are offices and communal workspaces; hotels, malls, and hospitals; public places such as libraries and schools; and transportation portals such as airports and train stations. Listed below are the 5 Smart Spaces Technology Trends which, per Gartner, have great potential for disruption.

Trend 6: Empowered Edge

Edge computing is a distributed computing topology in which information processing and data storage are located closer to the sources, repositories and consumers of this information. Empowered Edge is about moving towards a smarter, faster and more flexible edge by using more adaptive processes, fog/mesh architectures, dynamic network topology and distributed cloud. This trend will be introduced across a spectrum of endpoint devices which includes simple embedded devices (e.g., appliances, industrial devices), input/output devices (e.g., speakers, screens), computing devices (e.g., smartphones, PCs) and complex embedded devices (e.g., automobiles, power generators). Per Gartner predictions, by 2022, more than 50% of enterprise-generated data will be created and processed outside the data center or cloud. This trend also includes the next-generation cellular standard after 4G Long Term Evolution (LTE), i.e., 5G. The concept of edge also percolates to the digital-twin models.

Trend 7: Distributed Cloud

Gartner defines a distributed cloud as the “distribution of public cloud services to different locations outside the cloud providers’ data centers, while the originating public cloud provider assumes responsibility for the operation, governance, maintenance and updates.” Cloud computing has always been viewed as a centralized service, although private and hybrid cloud options complement this model. Implementing a private cloud is not an easy task, and a hybrid cloud breaks many important cloud computing principles, such as shifting responsibility to cloud providers, exploiting the economics of cloud elasticity and using the top-class services of large cloud service providers. A distributed cloud provides services in a location that meets the organization’s requirements without compromising on the features of a public cloud. This trend is still in the early stages of development and is expected to build in three phases:

Phase 1: Services will be provided from a micro-cloud that offers a subset of the services of its centralized cloud.

Phase 2: An extension of Phase 1, where the service provider teams up with a third party to deliver a subset of services from the centralized cloud.

Phase 3: Distributed cloud substations will be set up that can be shared by different organizations. This will improve the associated economics, as the installation cost can be split among the companies.

Trend 8: Autonomous Things

Autonomous means being able to control oneself. Similarly, Autonomous Things are devices that can operate by themselves without human intervention, using AI to automate all their functions. The most common among these devices are robots, drones and aircraft. These devices can operate across different environments and will interact more naturally with their surroundings and with people. While exploring use cases of this technology, it is very important to understand the different entities the device will interact with, such as people, terrain obstacles or other autonomous things. Another aspect to consider is the level of autonomy that can be applied. The different levels are: No automation, Human-assisted automation, Partial automation, Conditional automation, High automation and Full automation. With the proliferation of this trend, a shift is expected from stand-alone intelligent things to collaborative intelligent things, in which multiple devices work together to deliver the final output. The U.S. Defense Advanced Research Projects Agency (DARPA), for example, is studying the use of drone swarms to defend or attack military targets.
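
As a small, purely illustrative sketch, the autonomy levels listed above could be modeled as an ordered enumeration so that software can reason about how much human oversight a device still needs. The enum and helper function here are invented for the example, not part of any standard.

```python
# Illustrative sketch: the six autonomy levels named in the text, as an ordered
# enum. Anything below full automation still involves some human role.

from enum import IntEnum


class AutonomyLevel(IntEnum):
    NO_AUTOMATION = 0
    HUMAN_ASSISTED = 1
    PARTIAL = 2
    CONDITIONAL = 3
    HIGH = 4
    FULL = 5


def requires_human_supervision(level: AutonomyLevel) -> bool:
    # Only fully automated devices operate with no human role at all.
    return level < AutonomyLevel.FULL


print(requires_human_supervision(AutonomyLevel.CONDITIONAL))  # True
print(requires_human_supervision(AutonomyLevel.FULL))         # False
```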

Trend 9: Practical Blockchain

Most of us have heard about blockchain technology. A blockchain is a tamper-proof, decentralized, distributed database that stores blocks of records linked together using cryptography. It holds the power to take industries to another level by enabling trust, providing transparency, reducing transaction settlement times and improving cash flow. Blockchain also makes it easy to trace assets back to their origin, reducing the chances of substitution with counterfeit products. Smart contracts are used as part of the blockchain and can trigger actions when they detect a change in the blockchain, such as releasing payment when goods are received. New developments are being introduced in public blockchains, but over time these will be integrated with permissioned blockchains, which support membership, governance and operating model requirements. Some of the use cases of this trend that Gartner has identified are: Asset Tracking, Identity Management/Know Your Client (KYC), Internal Record Keeping, Shared Record Keeping, Smart Cities/the IoT, Trading, Blockchain-based voting, Cryptocurrency payments and remittance services. Per the 2019 Gartner CIO Survey, 60% of CIOs expect some form of blockchain deployment in the next three years.
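
The core idea of records linked together using cryptography can be shown in a few lines. The sketch below is a deliberately minimal illustration, not a production blockchain: each block stores the hash of the previous one, so tampering with an earlier record is immediately detectable.

```python
# Minimal, illustrative sketch of blocks linked by hashes. Altering any earlier
# record breaks the chain of hashes and fails validation.

import hashlib
import json


def block_hash(block: dict) -> str:
    # Deterministic hash of the block contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def add_block(chain: list, record: str) -> None:
    previous = chain[-1] if chain else None
    chain.append({
        "index": len(chain),
        "record": record,
        "prev_hash": block_hash(previous) if previous else "0" * 64,
    })


def is_valid(chain: list) -> bool:
    # Re-check every link; a tampered record changes a hash and is detected.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain))
    )


ledger: list = []
add_block(ledger, "asset A shipped from origin")
add_block(ledger, "asset A received, payment released")
print(is_valid(ledger))          # True
ledger[0]["record"] = "forged"   # tamper with an earlier block
print(is_valid(ledger))          # False
```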

Trend 10: AI Security

Per Gartner, over the next five years AI-based decision-making will be applied across a wide set of use cases, which will result in a tremendous increase in potential attack surfaces. Gartner provides three key perspectives on how AI impacts security: protecting AI-powered systems, leveraging AI to enhance security defense, and anticipating negative use of AI by attackers. ML pipelines have different phases, and each of these phases carries its own kinds of risk. AI-based security tools can be a very powerful extension to existing toolkits, with use cases such as security monitoring, malware detection, etc. On the other hand, there are many AI-related attack techniques, including training data poisoning, adversarial inputs and model theft; per Gartner predictions, through 2022, 30% of all AI cyberattacks will leverage these techniques. Every innovation in AI can be exploited by attackers to find new vulnerabilities. A few of the AI-enabled attacks that security professionals must explore are phishing, identity theft and DeepExploit.
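
To see why training data poisoning matters, consider the toy sketch below: a trivial nearest-centroid classifier on one invented feature lets a clearly suspicious sample slip through as benign once a few mislabeled points are injected into its training set. The data, feature and function names are made up for this illustration.

```python
# Toy illustration of training data poisoning. A handful of extreme points
# mislabeled as "benign" shift the benign centroid enough that a suspicious
# sample is no longer flagged. All values here are invented.

from statistics import mean


def classify(x: float, benign: list[float], malicious: list[float]) -> str:
    # Assign x to whichever class centroid it is closer to.
    d_benign = abs(x - mean(benign))
    d_malicious = abs(x - mean(malicious))
    return "benign" if d_benign < d_malicious else "malicious"


benign_train = [1.0, 2.0, 1.5, 2.5]       # clean training data
malicious_train = [8.0, 9.0, 8.5, 9.5]

suspicious_sample = 7.5
print(classify(suspicious_sample, benign_train, malicious_train))    # "malicious"

# Attacker injects four extreme points mislabeled as benign.
poisoned_benign = benign_train + [12.0, 12.0, 12.0, 12.0]
print(classify(suspicious_sample, poisoned_benign, malicious_train))  # "benign"
```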

One of the most important things to note here is that the trends listed above cannot exist in isolation. IT leaders must analyse which combination of these trends will drive the most innovation and strategy, fitting it into their business models. Soon we will have smart spaces around us in the form of factories, offices and cities, with increasingly insightful digital services everywhere for an ambient experience.

Sources:

https://www.pcmag.com/news/gartners-top-10-strategic-technology-trends-for-2020

About the Author:

Priyanka is an ardent feminist and a dog-lover. She spends her free time cooking, reading poetry and exploring new ways to conserve the environment.

People-Centric Technology Trends for 2020

Priyanka Pandey

“It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change.”  

– Charles Darwin, 1809

Most of us know about ‘Darwinism’, or the theory of biological evolution by Charles Darwin. It talks about natural selection and the variations organisms inherit to increase their ability to compete and survive. Over the years, there have been many examples showing that this theory applies not only to biological evolution but also to technological evolution. With technologies advancing at an ever-increasing velocity, it becomes a necessity for Technology Innovation Leaders to adapt to these changes with the least friction. What can come in handy is the annual analysis report released by Gartner, which brings different technology trends together.

The Gartner report for this year, ‘Gartner Top 10 Strategic Technology Trends for 2020’, is structured around the idea of “people-centric smart spaces”. It examines how certain technologies can create numerous opportunities and drive disruptions in a way that will change how we humans live in the coming years. The report makes it very clear that Artificial Intelligence (AI) plays a foundational role in providing a good ambient experience. It also talks about how important it is to have governing principles, policies, best practices and technology architectures that increase transparency and trust in AI.

At Gartner 2019 IT Symposium/Xpo™ in Orlando, Florida, Brian Burke, Gartner Research VP, said “These trends have a profound impact on the people and the spaces they inhabit. Rather than building a technology stack and then exploring the potential applications, organizations must consider the business and human context first.”

Workplaces are becoming more people-centric, putting people at the centre of any technology strategy. Organisations are now making significant investments in user experience to meet growing expectations. Listed below are the 5 People-centric Technology Trends which, per Gartner, have great potential for disruption.

Trend 1: Hyperautomation

Automation can broadly be defined as employing technology to perform routine tasks, thus reducing human involvement. Hyperautomation is the next step. This trend is about providing an end-to-end automation solution using a wide array of machine learning algorithms, packaged software and automation tools. Almost every forward-looking company is now looking at the processes it can automate and is aware of the potential of Robotic Process Automation (RPA) and Intelligent Business Process Management Suites (iBPMS) in achieving this. Since hyperautomation involves automating every process of an entire organization, such as its operations model and business process model, it often results in the creation of a dynamic, virtual representation of that organization, also called a Digital Twin of the Organization (DTO). A DTO helps align employees with the organization’s goals and operations and evaluates the impact of changes in a constrained environment.

Trend 2: Multiexperience

Multiexperience is about changing the way people interact with the digital world by providing them with a multi-modal user interface that is multisensory and spans multiple touchpoints. It will use all the human senses along with advanced technological senses (like heat, humidity and radar) to connect across different devices, including traditional computing devices, wearables, automobiles, environmental sensors and consumer appliances. These widened sensory channels will support varied capabilities, such as emotion recognition through facial expression analysis or using accelerometers to identify abnormal movements that may indicate a health condition. Both Google and Amazon are already working on providing multiexperience by adding screens and cameras to their smart speakers.

Per Gartner, “By 2023, more than 25% of the mobile apps, progressive web apps, and conversational apps at large enterprises will be built and/or run through a multiexperience development platform.” A multiexperience development platform (MXDP) is a suite that offers both front-end and back-end services for the development of an ambient experience that can be integrated across a range of devices. Although predictions state that privacy concerns may impact the level of adoption in many organizations, the trend is still expected to evolve through 2024.

Trend 3: Democratization

Democratization is about empowering everyone through access to technical and business-level expertise without extensive and costly training. The target of this trend could be anyone, from customers and business partners to professional application developers and assembly-line workers. Democratization has four key aspects: Democratization of Application Development, Democratization of Data and Analytics, Democratization of Design and Democratization of Knowledge.

It also talks about dealing with “Shadow AI”. Shadow AI is a natural outcome of democratization where people without formal training can use tools to develop their own AI-powered solutions and provide peer-to-peer support to others. Low-code or No-code application development has seen an increase due to the rising demand for rapid application development platforms. Per Gartner, by 2024, more than 65% of application development will be based on low-code development and 75% of large enterprises will be using at least four such tools for both IT application development and citizen development initiatives.

Trend 4: Human Augmentation

Human Augmentation (or “Human 2.0”) aims at enhancing human capabilities through technology. It may seem to be a new trend, but it started even before computers were introduced; it goes back to when the typewriter, the copy machine and the printing press came into use, giving humans the ability to create, copy and publish text. This trend involves combining different innovations to deliver cognitive and physical improvements as part of the human experience. Physical augmentation has several aspects, such as Sensory Augmentation, Brain Augmentation, Genetic Augmentation, and Appendage and Biological Function Augmentation. Since human augmentation will affect human lives to a very great extent, organizations must consider five major areas before adopting it: security, privacy, compliance, health impact and ethics. All types of enterprises are now examining ways of implementing human augmentation in different business use cases.

Trend 5: Transparency and Traceability

As consumers become more aware, they want guarantees about the products they consume and demand control over their personal information. With increasing AI-based decision making, concerns around digital ethics and privacy are rising, and transparency and traceability are critical in addressing them. This trend focuses on six pillars of trust: Ethics, Integrity, Openness, Accountability, Competence and Consistency. This is the age of surveillance capitalism: there are billions of endpoints collecting information about each one of us, through which it is not difficult to identify who you are, where you are, what you’re doing and even what you’re thinking! Many jurisdictions, including Europe, South Africa, South Korea and China, have ‘Right to be Forgotten’ (RTBF) legislation in place. Gartner predicts that by 2023, over 75% of large organizations will employ AI experts in behaviour forensics, privacy and customer trust to reduce brand and reputation risk, and that by 2025, 30% of government and large-enterprise contracts for the purchase of digital products and services using AI will require the use of explainable and ethical AI.

One of the most important things to note here is that the trends listed above cannot exist in isolation. For people-centric technologies to provide digital services, people need an environment where they can interact with digital spaces as a natural part of their everyday life. This brings in the concept of Smart Spaces. The next 5 trends in the Gartner report explore the technologies around Smart Spaces. IT leaders will have to analyse which combination of the above-mentioned trends, along with smart space technologies, will drive the most innovation and strategy, fitting it into their business models.

To Be Continued…         

References:

https://www.gartner.com/en/doc/432920-top-10-strategic-technology-trends-for-2020

About the Author:

Priyanka is an ardent feminist and a dog-lover. She spends her free time cooking, reading poetry and exploring new ways to conserve the environment.