“RE-ENGINEERING” BUSINESSES – THINK AI-LED STRATEGY
AI adoption across industries is galloping ahead, and the resulting benefits are increasing by the day; yet some businesses are challenged by the complexity and confusion that AI can generate. Enterprises can get stuck trying to analyse all that is possible and all that they could do through AI, when they should be taking the next step of recognizing what is important and what they should be doing for their customers, stakeholders, and employees. Discovering real business opportunities and achieving desired outcomes can be elusive. To overcome this, enterprises should constantly re-engineer their AI strategy to generate insights & intelligence that lead to real outcomes.
Re-engineering Data Architecture & Infrastructure
To derive value from data immediately, organizations need faster data analysis than traditional data management technology currently provides. With the explosion of digital analytics, social media, and the Internet of Things (IoT), there is an opportunity to radically re-engineer data architecture to provide organizations with a tiered approach to data collection, spanning real-time and historical data analyses. Infrastructure-as-a-service for AI is the combination of components that enables an architecture that delivers the right business outcomes. Developing this architecture involves designing the cluster computing power and networking, along with innovations in software that enable advanced technology services and interconnectivity. Infrastructure is the foundation for optimal processing and storage of data, and it is also the foundation of any data farm.
The new era of AI-led infrastructure is built on virtualized analytics environments, which can be thought of as the next big “V” of big data. The virtualization approach has several advantages, such as scalability, ease of maintenance, elasticity, cost savings due to better utilization of resources, and the abstraction of the external layer from the internal (back-end) implementation of a service or resource. Containers, the trending technology making headlines recently, are an approach to virtualization and cloud-enabled data centres. Fortune 500 companies have begun to “containerize” their server, data centre, and cloud applications with Docker. Containerization sidesteps many of the problems of traditional virtualization by eliminating the hypervisor and its VMs: each application is deployed in its own container, which runs on the “bare metal” of the server plus a single, shared instance of the operating system.
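To make the distinction concrete, here is a minimal sketch using the Docker SDK for Python, assuming Docker is installed and running and that a hypothetical analytics image named `analytics-service` exists locally; each application gets its own container on the shared host OS rather than a separate VM:

```python
# pip install docker
import docker

# Connect to the local Docker daemon (assumes Docker is installed and running).
client = docker.from_env()

# Each application runs in its own container; all containers share the host
# kernel, so there is no per-application hypervisor or guest OS as with VMs.
container = client.containers.run(
    "analytics-service:latest",  # hypothetical image name, for illustration
    detach=True,                 # run in the background
    name="cx-analytics",
    ports={"8080/tcp": 8080},    # expose the service port on the host
    mem_limit="512m",            # per-container resource isolation
)
print(container.short_id, container.status)
```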
AI-led Business Process Re-Engineering
The BPR methodologies of the past have significantly contributed to the development of today’s enterprises. However, today’s business landscape has become increasingly complex and fast-paced. The regulatory environment is also constantly changing. Consumers have become more sophisticated and have easy access to information on the go. Staying competitive in the present business environment requires organizations to go beyond process efficiencies, incremental improvements and enhancing transactional flow. Now, organizations need a comprehensive understanding of their business model through an objective and realistic grasp of their business processes. This entails having organization-wide insights that show the interdependence of various internal functions while taking into consideration regulatory requirements and shifting consumer tastes.
Data is the basis on which fact-based analysis is performed to obtain objective insights into the organization. In order to obtain organization-wide insights, management needs to employ AI capabilities on data that resides both inside and outside the organization. However, an organization’s AI capabilities are primarily dependent on the type, amount and quality of data it possesses.
The integration of an organization’s three key dimensions of people, process and technology is also critical during process design. The people are the individuals responsible and accountable for the organization’s processes. The process is the chain of activities required to keep the organization running. The technology is the suite of tools that support, monitor and ensure consistency in the application of the process. The integration of all these, through the support of a clear governance structure, is critical in sustaining a fact-based organizational culture and the effective capture, movement and analysis of data. Process design is then most effective when it is based on data-driven insights and when AI capabilities are embedded into the re-engineered processes. Data-driven insights are essential in gaining a concrete understanding of the current business environment, and utilizing these insights is critical in designing business processes that are flexible, agile and dynamic.
Re-engineering Customer Experience (CX) – The new paradigm
It’s always of great interest to me to see new trends emerge in our space. One such trend gaining momentum is enterprises looking to solve customer needs & expectations with what I’d describe as re-engineering customer experience. Just like everything else in our industry, changes in consumer behaviour caused by mobile and social trends are disrupting the CX space. Just a few years ago, web analytics solutions gave brands the best view into the performance of their digital business and user behaviours. Fast-forward to today, and this is often not the case. With the growth in volume and importance of new devices, digital channels and touch points, CX solutions are now just one of the many digital data silos that brands need to deal with and integrate into the full digital picture. While some vendors may now offer ways for their solutions to run in different channels and on a range of devices, these capabilities are often still a work in progress. Many enterprises today find their CX solution is another critical set of insights that must be downloaded daily into an omni-channel AI data store and then visualized to provide cross-channel business reporting.
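As a simple illustration of that daily consolidation step, here is a minimal pandas sketch, assuming hypothetical per-channel CSV exports (web, mobile, in-store) that share a `customer_id` column; the file names and columns are placeholders, not a real product’s export format:

```python
import pandas as pd

# Hypothetical daily exports from three channel-specific CX silos.
channels = {
    "web": "web_analytics.csv",
    "mobile": "mobile_app_events.csv",
    "store": "in_store_transactions.csv",
}

frames = []
for channel, path in channels.items():
    df = pd.read_csv(path)
    df["channel"] = channel  # tag each record with its source channel
    frames.append(df)

# A single omni-channel store: one table across all touch points,
# ready for cross-channel reporting and visualization.
omni = pd.concat(frames, ignore_index=True)
daily_by_channel = omni.groupby("channel")["customer_id"].nunique()
print(daily_by_channel)
```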
Re-shaping Talent Acquisition and Engagement with AI
AI is causing disruption in virtually every function, but talent acquisition is one of the more recent to get a business refresh. A new data-driven approach to talent management is reshaping the way organizations find and hire staff, while the power of talent analytics is also changing how HR tackles employee retention and engagement. The implications for anyone hoping to land a job, and for businesses that have traditionally relied on personal relationships, are extreme, but robots and algorithms will not yet completely replace human interaction. AI will certainly help to identify talent in specific searches: rather than relying on a rigorous interview process and a resume, employers are able to “mine” deep reserves of information, including your online footprint. The real value will be in identifying personality types, abilities, and other strengths to help create well-rounded teams. Companies are also using people analytics to understand the stress levels of their employees to ensure long-term productivity and wellness.
The Final Word
Based on my experiences with clients across enterprises, GCCs and start-ups, alignment among the three key dimensions of talent, process and AI-led technology within a robust governance structure is critical to effectively utilize AI and remain competitive in the current business environment. AI is able to open doors to growth & scalability through insights & intelligence, resulting in the identification of industry white spaces. It enhances operational efficiency through process improvements based on relevant and fact-based data. It is able to enrich human capital through workforce analysis, resulting in more effective human capital management. It is able to mitigate risks by identifying areas of regulatory and company-policy non-compliance before actual damage is done. An AI-led re-engineering approach unleashes the potential of an organization by putting the facts and the reality into the hands of the decision makers.
(AIQRATE, a bespoke global AI advisory and consulting firm. A first in its genre, AIQRATE provides strategic AI advisory services and consulting offerings across multiple business segments to enable clients on their AI-powered transformation & innovation journey and accentuate their decision making and business performance.
AIQRATE works closely with Boards, CXOs and Senior leaders, advising them on navigating their Analytics to AI journey with the art of the possible, or making them jumpstart to an AI rhythm with an AI@scale approach, followed by consulting them on embedding AI as core to business strategy within business functions and augmenting the decision-making process with AI. We have proven bespoke AI advisory services to enable CXOs and Senior Leaders to curate & design the building blocks of AI strategy, embed AI@scale interventions and create AI-powered organizations.
AIQRATE’s path-breaking 50+ AI consulting frameworks, assessments, primers, toolkits and playbooks enable Indian & global enterprises, GCCs, Startups, SMBs, VC/PE firms, and Academic Institutions to enhance business performance and accelerate decision making.
AIQRATE also consults with consulting firms, technology service providers, pure-play AI firms, technology behemoths & platform enterprises on curating differentiated & bespoke AI capabilities & offerings, market development scenarios & GTM approaches.
Visit www.aiqrate.ai to experience our AI advisory services & consulting offerings)
Data Driven Enterprise – Part II: Building an operative data ecosystem strategy
Ecosystems—interconnected sets of services in a single integrated experience—have emerged across a range of industries, from financial services to retail to healthcare. Ecosystems are not limited to a single sector; indeed, many transcend multiple sectors. For traditional incumbents, ecosystems can provide a golden opportunity to increase their influence and fend off potential disruption by faster-moving digital attackers. For example, banks are at risk of losing half of their margins to fintechs, but they have the opportunity to increase margins by a similar amount by orchestrating an ecosystem.
In my experience, many ecosystems focus on the provision of data: exchange, availability, and analysis. Incumbents seeking to excel in these areas must develop the proper data strategy, business model, and architecture.
What is a data ecosystem?
Simply put, a data ecosystem is a platform that combines data from numerous providers and builds value through the usage of processed data. A successful ecosystem balances two priorities:
- Building economies of scale by attracting participants through lower barriers to entry. In addition, the ecosystem must generate clear customer benefits and dependencies beyond the core product to establish high exit barriers over the long term.
- Cultivating a collaboration network that motivates a large number of parties with similar interests (such as app developers) to join forces and pursue similar objectives. One of the key benefits of the ecosystem comes from the participation of multiple categories of players (such as app developers and app users).
What are the data-ecosystem archetypes?
As data ecosystems have evolved, five archetypes have emerged. They vary based on the model for data aggregation, the types of services offered, and the engagement methods of other participants in the ecosystem.
- Data utilities. By aggregating data sets, data utilities provide value-adding tools and services to other businesses. The category includes credit bureaus, consumer-insights firms, and insurance-claim platforms.
- Operations optimization and efficiency centers of excellence. This archetype vertically integrates data within the business and the wider value chain to achieve operational efficiencies. An example is an ecosystem that integrates data from entities across a supply chain to offer greater transparency and management capabilities.
- End-to-end cross-sectorial platforms. By integrating multiple partner activities and data, this archetype provides an end-to-end service to the customers or business through a single platform. Car reselling, testing platforms, and partnership networks with a shared loyalty program exemplify this archetype.
- Marketplace platforms. These platforms offer products and services as a conduit between suppliers and consumers or businesses. Amazon and Alibaba are leading examples.
- B2B infrastructure (platform as a business). This archetype builds a core infrastructure and tech platform on which other companies establish their ecosystem business. Examples of such businesses are data-management platforms and payment-infrastructure providers.
The ingredients for a successful data ecosystem
Data ecosystems have the potential to generate significant value. However, the entry barriers to establishing an ecosystem are typically high, so companies must understand the landscape and potential obstacles. Typically, the hardest pieces to figure out are finding the best business model to generate revenues for the orchestrator and ensuring participation.
If the market already has a large, established player, companies may find it difficult to stake out a position. To choose the right partners, executives need to pinpoint the value they can offer and then select collaborators who complement and support their strategic ambitions. Similarly, companies should look to create a unique value proposition and excellent customer experience to attract both end customers and other collaborators. Working with third parties often requires additional resources, such as negotiating teams supported by legal specialists to negotiate and structure the collaboration with potential partners. Ideally, partnerships should be mutually beneficial arrangements between the ecosystem leader and other participants.
As companies look to enable data pooling and the benefits it can generate, they must be aware of laws regarding competition. Companies that agree to share access to data, technology, and collection methods restrict access for other companies, which could raise anti-competition concerns. Executives must also ensure that they address privacy concerns, which can differ by geography.
Other capabilities and resources are needed to create and build an ecosystem. For example, to find and recruit specialists and tech talent, organizations must create career opportunities and a welcoming environment. Significant investments will also be needed to cover the costs of data-migration projects and ecosystem maintenance.
Ensuring ecosystem participants have access to data
Once a company selects its data-ecosystem archetype, executives should then focus on setting up the right infrastructure to support its operation. An ecosystem can’t deliver on its promise to participants without ensuring access to data, and that critical element relies on the design of the data architecture. We have identified five questions that incumbents must resolve when setting up their data ecosystem.
How do we exchange data among partners in the ecosystem?
Industry experience shows that standard data-exchange mechanisms among partners, such as cookie handshakes, can be effective. The data exchange typically follows three steps: establishing a secure connection, exchanging data through browsers and clients, and storing results centrally when necessary.
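A minimal sketch of those three steps in Python, under stated assumptions: the partner endpoint, API key, and record shape are hypothetical, TLS (HTTPS) provides the secure connection, and a local SQLite table stands in for the central store:

```python
# pip install requests
import sqlite3
import requests

PARTNER_URL = "https://partner.example.com/api/v1/audience"  # hypothetical endpoint
API_KEY = "..."  # provisioned out of band during partner onboarding

# Step 1: establish a secure connection. HTTPS gives an encrypted,
# server-authenticated channel; certificate verification is on by default.
session = requests.Session()
session.headers.update({"Authorization": f"Bearer {API_KEY}"})

# Step 2: exchange data through the client.
response = session.get(PARTNER_URL, params={"since": "2021-01-01"}, timeout=30)
response.raise_for_status()
records = response.json()

# Step 3: store results centrally when necessary.
conn = sqlite3.connect("central_store.db")
conn.execute("CREATE TABLE IF NOT EXISTS audience (id TEXT, segment TEXT)")
conn.executemany(
    "INSERT INTO audience (id, segment) VALUES (?, ?)",
    [(r["id"], r["segment"]) for r in records],  # assumed record shape
)
conn.commit()
conn.close()
```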
How do we manage identity and access?
Companies can pursue two strategies to select and implement an identity-management system. The more common approach is to centralize identity management through solutions such as Okta, OpenID, or Ping. An emerging approach is to decentralize and federate identity management—for example, by using blockchain ledger mechanisms.
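For the centralized approach, here is a minimal sketch of validating an OpenID Connect access token with the PyJWT library; the issuer URL and audience are hypothetical, and the same pattern applies whether the provider is Okta, Ping, or any other OIDC-compliant service:

```python
# pip install pyjwt[crypto]
import jwt
from jwt import PyJWKClient

ISSUER = "https://id.example-ecosystem.com"   # hypothetical identity provider
JWKS_URL = f"{ISSUER}/.well-known/jwks.json"  # standard OIDC key endpoint
AUDIENCE = "data-ecosystem-api"               # assumed API identifier

def validate_token(token: str) -> dict:
    """Verify the signature, issuer, audience and expiry of an access token."""
    # Fetch the provider's public signing key matching the token's key ID.
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(token)
    # Decode and verify; raises jwt.PyJWTError on any failure.
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=AUDIENCE,
        issuer=ISSUER,
    )
```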
How can we define data domains and storage?
Traditionally, an ecosystem orchestrator would centralize data within each domain. More recent trends in data-asset management favor an open data-mesh architecture. Data mesh challenges the conventional centralization of data ownership within one party by using existing definitions and domain assets within each party, based on each use case or product. Certain use cases may still require centralized domain definitions with central storage. In addition, global data-governance standards must be defined to ensure interoperability of data assets.
How do we manage access to non-local data assets, and how can we possibly consolidate?
Most use cases can be implemented with periodic data loads through application programming interfaces (APIs). This approach results in a majority of use cases having decentralized data storage. Pursuing this environment requires two enablers: a central API catalog that defines all APIs available to ensure consistency of approach, and strong group governance for data sharing.
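A minimal sketch of those two enablers, assuming a hypothetical catalog format and partner URLs; a periodic job pulls from every catalogued API, so storage stays decentralized at the source while outputs remain consistently accessible:

```python
import json
import time
import requests

# Central API catalog: one registry from which all participants discover APIs.
# Hypothetical format; in practice this would live in a governed service.
CATALOG = json.loads("""
[
  {"name": "claims",  "url": "https://insurer.example.com/api/claims"},
  {"name": "pricing", "url": "https://retailer.example.com/api/pricing"}
]
""")

def periodic_load(interval_seconds: int = 3600) -> None:
    """Pull fresh data from every catalogued API on a fixed schedule."""
    while True:
        for entry in CATALOG:
            resp = requests.get(entry["url"], timeout=30)
            resp.raise_for_status()
            # Each participant keeps its own storage; we only cache outputs.
            with open(f"{entry['name']}_latest.json", "w") as f:
                json.dump(resp.json(), f)
        time.sleep(interval_seconds)
```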
How do we scale the ecosystem, given its heterogeneous and loosely coupled nature?
Enabling rapid and decentralized access to data or data outputs is the key to scaling the ecosystem. This objective can be achieved by having robust governance to ensure that all participants of the ecosystem do the following (a sketch of a self-describing data-asset manifest follows the list):
- Make their data assets discoverable, addressable, versioned, and trustworthy in terms of accuracy
- Use self-describing semantics and open standards for data exchange
- Support secure exchanges while allowing access at a granular level
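As a sketch of what these properties can look like in practice, here is a hypothetical data-asset manifest; the field names are illustrative, not an established standard:

```python
from dataclasses import dataclass, field

@dataclass
class DataAssetManifest:
    """Self-describing metadata so an asset is discoverable and trustworthy."""
    name: str                # addressable identifier within the ecosystem
    owner_domain: str        # which participant owns and serves the asset
    version: str             # versioned so consumers can pin releases
    schema: dict             # self-describing semantics (field -> type)
    exchange_format: str     # open standard used for data exchange
    access_scopes: list = field(default_factory=list)  # granular access control
    accuracy_sla: str = "unspecified"                  # trustworthiness contract

manifest = DataAssetManifest(
    name="retail.transactions.daily",
    owner_domain="retail-partner",
    version="2.3.0",
    schema={"customer_id": "string", "amount": "decimal", "ts": "timestamp"},
    exchange_format="parquet",
    access_scopes=["transactions:read"],
    accuracy_sla="99.5% of records reconciled within 24h",
)
```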
The success of a data-ecosystem strategy depends on data availability and digitization, API readiness to enable integration, data privacy and compliance—for example, General Data Protection Regulation (GDPR)—and user access in a distributed setup. This range of attributes requires companies to design their data architecture to check all these boxes.
As incumbents consider establishing data ecosystems, we recommend they develop a road map that specifically addresses the common challenges. They should then look to define their architecture to ensure that the benefits to participants and themselves come to fruition. The good news is that the data-architecture requirements for ecosystems are not complex. The priority components are identity and access management, a minimum set of tools to manage data and analytics, and central data storage. Truth be told, developing an operative data-ecosystem strategy is far more difficult than getting the technology requirements right.
Data Driven Enterprise – Part I: Building an effective Data Strategy for competitive edge
Few enterprises take full advantage of data generated outside their walls. A well-structured data strategy for using external data can provide a competitive edge. Many enterprises have made great strides in collecting and utilizing data from their own activities. So far, though, comparatively few have realized the full potential of linking internal data with data provided by third parties, vendors, or public data sources. Overlooking such external data is a missed opportunity. Organizations that stay abreast of the expanding external-data ecosystem and successfully integrate a broad spectrum of external data into their operations can outperform other companies by unlocking improvements in growth, productivity, and risk management.
The COVID-19 crisis provides an example of just how relevant external data can be. In a few short months, consumer purchasing habits, activities, and digital behavior changed dramatically, making preexisting consumer research, forecasts, and predictive models obsolete. Moreover, as organizations scrambled to understand these changing patterns, they discovered little of use in their internal data. Meanwhile, a wealth of external data could—and still can—help organizations plan and respond at a granular level. Although external-data sources offer immense potential, they also present several practical challenges. To start, simply gaining a basic understanding of what’s available requires considerable effort, given that the external-data environment is fragmented and expanding quickly. Thousands of data products can be obtained through a multitude of channels—including data brokers, data aggregators, and analytics platforms—and the number grows every day. Analyzing the quality and economic value of data products also can be difficult. Moreover, efficient usage and operationalization of external data may require updates to the organization’s existing data environment, including changes to systems and infrastructure. Companies also need to remain cognizant of privacy concerns and consumer scrutiny when they use some types of external data.
These challenges are considerable but surmountable. This blog series discusses the benefits of tapping external-data sources, illustrated through a variety of examples, and lays out best practices for getting started. These include establishing an external-data strategy team and developing relationships with data brokers and marketplace partners. Company leaders, such as the executive sponsor of a data effort and a chief data and analytics officer, and their data-focused teams should also learn how to rigorously evaluate and test external data before using and operationalizing the data at scale.
External-data success stories: Companies across industries have begun successfully using external data from a variety of sources. The investment community is a pioneer in this space. To predict outcomes and generate investment returns, analysts and data scientists in investment firms have gathered “alternative data” from a variety of licensed and public data sources, many of which draw from the “digital exhaust” of a growing number of technology companies and the public web. Investment firms have established teams that assess hundreds of these data sources and providers and then test their effectiveness in investment decisions.
A broad range of data sources are used, and these inform investment decisions in a variety of ways:
- Investors actively gather job postings, company reviews posted by employees, employee-turnover data from professional networking and career websites, and patent filings to understand company strategy and predict financial performance and organizational growth.
- Analysts use aggregated transaction data from card processors and digital-receipt data to understand the volume of purchases by consumers, both online and offline, and to identify which products are increasing in share. This gives them a better understanding of whether traffic is declining or growing, as well as insights into cross-shopping behaviors.
- Investors study app downloads and digital activity to understand how consumer preferences are changing and how effective an organization’s digital strategy is relative to that of its peers. For instance, app downloads, activity, and rating data can provide a window into the success rates of the myriad of live-streaming exercise offerings that have become available over the last year.
Corporations have also started to explore how they can derive more value from external data. For example, a large insurer transformed its core processes, including underwriting, by expanding its use of external-data sources from a handful to more than 40 in the span of two years. The effort involved was considerable; it required prioritization from senior leadership, dedicated resources, and a systematic approach to testing and applying new data sources. The hard work paid off, increasing the predictive power of core models by more than 20 percent and dramatically reducing application complexity by allowing the insurer to eliminate many of the questions it typically included on customer applications.
Three steps to creating value with external data:
Use of external data has the potential to be game changing across a variety of business functions and sectors. The journey toward successfully using external data has three key steps.
1. Establish a dedicated team for external-data sourcing
To get started, organizations should establish a dedicated data-sourcing team. Per our understanding at AIQRATE, a key role on this team is a dedicated data scout or strategist who partners with the data-analytics team and business functions to identify operational, cost, and growth improvements that could be powered by external data. This person also would be responsible for building excitement around what can be made possible through the use of external data, planning the use cases to focus on, identifying and prioritizing data sources for investigation, and measuring the value generated through use of external data. Ideal candidates for this role are individuals who have served as analytics translators and who have experience in deploying analytics use cases and in working with technology, business, and analytics profiles.
The other team members, who should be drawn from across functions, would include purchasing experts, data engineers, data scientists and analysts, technology experts, and data-review-board members. These team members typically spend only part of their time supporting the data-sourcing effort. For example, the data analysts and data scientists may already be supporting data cleaning and modeling for a specific use case and help the sourcing work stream by applying the external data to assess its value. The purchasing expert, already well versed in managing contracts, will build specialization on data-specific licensing approaches to support those efforts.
Throughout the process of finding and using external data, companies must keep in mind privacy concerns and consumer scrutiny, making data-review roles essential peripheral team members. Data reviewers, who typically include legal, risk, and business leaders, should thoroughly vet new consumer data sets—for example, financial transactions, employment data, and cell-phone data indicating when and where people have entered retail locations. The vetting process should ensure that all data were collected with appropriate permissions and will be used in a way that abides by relevant data-privacy laws and passes muster with consumers. This team will need a budget to procure small exploratory data sets, establish relationships with data marketplaces (such as by purchasing trial licenses), and pay for technology requirements (such as expanded data storage).
2. Develop relationships with data marketplaces and aggregators
While online searches may appear to be an easy way for data-sourcing teams to find individual data sets, that approach is not necessarily the most effective. It generally leads to a series of time-consuming vendor-by-vendor discussions and negotiations. The process of developing relationships with a vendor, procuring sample data, and negotiating trial agreements often takes months. A more effective strategy involves using data-marketplace and -aggregation platforms that specialize in building relationships with hundreds of data sources, often in specific data domains—for example, consumer, real-estate, government, or company data. These relationships can give organizations ready access to the broader data ecosystem through an intuitive search-oriented platform, allowing organizations to rapidly test dozens or even hundreds of data sets under the auspices of a single contract and negotiation. Since these external-data distributors have already profiled many data sources, they can be valuable thought partners and can often save an external-data team significant time. When needed, these data distributors can also help identify valuable data products and act as the broker to procure the data.
Once the team has identified a potential data set, the team’s data engineers should work directly with business stakeholders and data scientists to evaluate the data and determine the degree to which the data will improve business outcomes. To do so, data teams establish evaluation criteria, assessing data across a variety of factors to determine whether the data set has the necessary characteristics for delivering valuable insights. Data assessments should include an examination of quality indicators, such as fill rates, coverage, bias, and profiling metrics, within the context of the use case. For example, a transaction data provider may claim to have hundreds of millions of transactions that help illuminate consumer trends. However, if the data include only transactions made by millennial consumers, the data set will not be useful to a company seeking to understand broader, generation-agnostic consumer trends.
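A minimal sketch of such an assessment in pandas, assuming a hypothetical sample extract from a prospective transaction-data provider with `customer_id`, `amount`, and `birth_year` columns; fill rate and generation coverage are two of the quality indicators mentioned above:

```python
import pandas as pd

# Hypothetical sample extract supplied by the data provider for evaluation.
sample = pd.read_csv("provider_sample.csv")

# Fill rate: share of non-missing values per column.
fill_rates = sample.notna().mean()
print("Fill rates:\n", fill_rates)

# Coverage/bias check: does the sample span generations, or only millennials?
# Assumes a birth_year column; adjust to the provider's actual schema.
generation = pd.cut(
    sample["birth_year"],
    bins=[1945, 1964, 1980, 1996, 2012],
    labels=["boomer", "gen_x", "millennial", "gen_z"],
)
print("Generation coverage:\n", generation.value_counts(normalize=True))

# A simple gate: reject feeds whose key fields are too sparse for the use case.
usable = (fill_rates[["customer_id", "amount"]] > 0.95).all()
print("Passes minimum fill-rate bar:", usable)
```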
3. Prepare the data architecture for new external-data streams
Generating a positive return on investment from external data calls for up-front planning, a flexible data architecture, and ongoing quality-assurance testing. Up-front planning starts with an assessment of the existing data environment to determine how it can support ingestion, storage, integration, governance, and use of the data. The assessment covers issues such as how frequently the data come in, the amount of data, how data must be secured, and how external data will be integrated with internal data. This will provide insights about any necessary modifications to the data architecture.
Modifications should be designed to ensure that the data architecture is flexible enough to support the integration of a continuous “conveyor belt” of incoming data from a variety of data sources—for example, by enabling application-programming-interface (API) calls from external sources along with entity-resolution capabilities to intelligently link the external data to internal data. In other cases, it may require tooling to support large-scale data ingestion, querying, and analysis. Data architecture and underlying systems can be updated over time as needs mature and evolve. The final process in this step is ensuring an appropriate and consistent level of quality by constantly monitoring the data used. This involves examining data regularly against the established quality framework to identify whether the source data have changed and to understand the drivers of any changes (for example, schema updates, expansion of data products, change in underlying data sources). If the changes are significant, algorithmic models leveraging the data may need to be retrained or even rebuilt.
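As a sketch of that monitoring loop, assuming a hypothetical expected-schema registry captured when the feed was onboarded; the check flags missing columns, type drift, and new provider columns so that dependent models can be retrained when changes are significant:

```python
import pandas as pd

# Expected schema registered when the external feed was first onboarded.
EXPECTED_SCHEMA = {"customer_id": "object", "amount": "float64", "ts": "object"}

def check_feed(path: str) -> list:
    """Compare today's delivery against the registered schema; return issues."""
    df = pd.read_csv(path)
    issues = []
    for col, dtype in EXPECTED_SCHEMA.items():
        if col not in df.columns:
            issues.append(f"missing column: {col}")
        elif str(df[col].dtype) != dtype:
            issues.append(f"type drift in {col}: {df[col].dtype} != {dtype}")
    for col in df.columns:
        if col not in EXPECTED_SCHEMA:
            issues.append(f"new column from provider: {col}")  # schema update
    return issues

issues = check_feed("external_feed_latest.csv")  # hypothetical daily delivery
if issues:
    print("Source data changed; review before models consume it:", issues)
```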
Minimizing risk and creating value with external data will require a unique mix of creative problem solving, organizational capability building, and laser-focused execution. That said, business leaders who demonstrate the achievements possible with external data can capture the imagination of the broader leadership team and build excitement for scaling beyond early pilots and tests. An effective route is to begin with a small team that is focused on using external data to solve a well-defined problem and then use that success to generate momentum for expanding external-data efforts across the organization.
Redefine the new code for GCCs: Winning with AI – strategic perspectives
Global Capability Centers (GCCs) are reflections of strategic components of their parent organizations’ business imperatives. GCCs are at an inflection point, as the pace at which AI is changing every aspect of business is exponential and high-velocity. The rapid transformation and innovation of GCCs today is driven largely by their ability to position AI as a strategic imperative for their parent organizations. AI is seen as the Trojan horse to catapult GCCs to the next level of innovation & transformation. In recent times, the GCC story has entered a changing era of value and transformative arbitrage.
Most GCCs are aiming to deploy a suite of AI-led strategies to position themselves as the model template of an AI Center of Excellence. It is widely predicted that AI will disrupt and transform capability centers in the coming decades. How are Global Capability Centers in India looking at positioning themselves as the model template for developing an AI center of competence? How have the strategies of GCCs transformed with reference to the parent organization, whilst delivering tangible business outcomes, innovation & transformation for parent organizations?
Strategic imperatives for GCCs to consider in order to move up the value chain, develop an edge, and start winning with AI:
AI transformation:
Artificial Intelligence has become one of the main focus areas for GCCs in India. The increasing digital penetration of GCCs across business verticals has made it imperative to focus on AI. Hence, GCCs are upping their innovation agenda by building bespoke AI capabilities, solutions & offerings. Accelerated AI adoption has transcended industry verticals, with organizations exploring different use cases and application areas. GCCs in India are strategically leveraging one of the following approaches to drive AI penetration ahead –
- Federated Approach: Different teams within GCCs drive AI initiatives
- Centralized Approach: Focus is to build a central team with top talent and niche skills that would cater to the parent organization requirements
- Partner ecosystem: Paves a new channel for GCCs by partnering with research institutes, start-ups, accelerators
- Hybrid Approach: A mix of any two or more of the above-mentioned approaches, leveraged according to the GCC’s needs and constraints.
Ecosystem creation: Startups / Research institutes / Accelerators
One of the crucial ways GCCs can boost their innovation agenda is by collaborating with start-ups, research institutes, and accelerators. Hence, GCCs are employing a variety of strategies to build the ecosystem. These collaborations are a combination of build, buy, and partner models:
- Platform Evangelization: GCCs offer access to their AI platforms to start-ups
- License or Vendor Agreement: GCCs and start-ups enter into a license agreement to create solutions
- Co-innovate: Start-ups and GCCs collaborate to co-create new solutions & capabilities
- Acqui-hire: GCCs acquire start-ups for the talent & capability
- Research centers: GCCs collaborate with academic institutes for joint IP creation, open research, customized programs
- Joint Accelerator programs: GCCs & accelerators build joint programs for customized startup cohorts
To drive these ecosystem creation models, GCCs can leverage different approaches. Further, successful collaboration programs have a high degree of customization, with clearly defined objectives and talent allocation to drive tangible and impact driven business outcomes.
Differentiated AI Center of Capability:
GCCs are increasingly shifting to competency- and capability-creation models to reduce time-to-market. In this model, AI Center of Competence teams are aligned to capability lines of business, where the AI center of competence is responsible for creating AI capabilities, roadmaps and new value offerings in collaboration with the parent organization’s business teams. This alignment, with specific roles, gives clear visibility of business user requirements. Further, capability creation combined with parent-organization alignment helps deliver tangible value outcomes. In several cases, AI teams are building a new range of innovations around AI-based capabilities and solutions to showcase the ensuing GCC as a model template for innovation & transformation. GCCs need to conceptualize a bespoke strategy for building and sustaining an AI Center of Competence and keep it moving up the value chain with mature and measured transformation & innovation-led metrics.
AI Talent Mapping Strategy:
With the evolution from analytics and data sciences to AI, the lines between different skills are blurring. GCCs are witnessing a convergence of skills required across verticals. The strategic shift of GCCs towards the AI-center-of-capability model has led to the creation of AI, data engineering & design roles. To build skills in AI & data engineering, GCCs are adopting a hybrid approach. The skill-development roadmap for AI is a combination of build and buy strategies. The decision to acquire talent from the ecosystem or internally build capabilities is a function of three parameters: the maturity of the GCC’s existing AI capabilities in the desired or adjacent areas, the tactical nature of the skill requirement, and the availability and accessibility of talent in the ecosystem. There is a heavy inclination towards building skills in-house within GCCs, and a majority of GCCs have stressed that the bulk of future deployment in AI areas will be through in-house skill-building and reskilling initiatives. However, the talent-mapping strategy for building AI capability needs a measured approach, else it can become an Achilles heel for GCC and HR leaders.
GCCs in India are uniquely positioned to drive the next wave of growth by building high-impact AI centers of competence. There is a slew of innovative & transformative models they are working on to up the ante and trigger new customer experiences, products & services, and unleash business transformation for their parent organizations. This will not only set existing GCCs on the path to cutting-edge innovation but also pave the way for other global organizations contemplating a global center setup in India. AI is becoming the front runner in driving innovation & transformation for GCCs.
AIQRATE in 2020 … A walk to remember
“Enabling clients reimagine their decision making & accentuate the business performance with AI strategy in a transformation, innovation and disruption driven world”
In today’s fast-paced & volatile VUCA world, leaders face unprecedented challenges. They need to navigate through volatility while staying focused on strategy, business performance and culture. Artificial Intelligence is fast becoming a game-changing catalyst, a strategic differentiator, and almost a panacea for solving large, complex and unresolved problems. To be an AI-powered organization, leaders not only need to have a broad understanding of AI strategy, they need to know how and where to use it. AIQRATE advisory services and consulting offerings are designed to enable leaders and decision makers from Enterprises, GCCs, Cloud Providers, Technology players, Startups, SMBs, VC/PE firms, Public Institutions and Academic Institutions to become AI-ready, reduce the risk associated with curating and deploying AI strategy and the ensuing interventions, and increase the predictability of durable success as a leader.
In the age of the bionic enterprises, AI continues to dominate the technology & business landscape. Under the aegis of transformation, disruption and innovation, AI has several applications and impact areas which usher a new change in how we make decisions in the enterprise and personal spheres. Traditionally, human decisions are to a large extent based on intuition, gut and historical data. In the age of AI, several of our decisions will be taken by algorithms. Leveraging AI, the ability to mimic the human brain and the ensuing ability to sense, comprehend and act will significantly go up and will result in emergence of augmented intelligence in decision making. Enterprises, GCCs, SMBs, Startups and Government Institutions are attempting to harness the power of AI to change the way they do business. All these industry segments are looking at AI becoming the secret sauce behind making them gain a competitive advantage. If you have not started yet, you are already behind the competition, however large or pedigreed you might be.
So, where are you placed on your AI journey? At AIQRATE, we can guide you on your journey of understanding what AI can do for you, embedding it within your business strategy, functional areas and augmenting the decision-making process.
At AIQRATE, we are here to help you with the art of the possible with AI. Through our bespoke AI strategy frameworks, methodologies, toolkits, playbooks and assessments, we will bring seamless Transformation, Innovation and Disruption to your businesses. Leveraging our proven repository of consulting templates and artifacts, we will curate your AI strategic approach roadmap. Our advisory offerings and consulting engagements are designed in alignment with your strategic growth, vision and competitive scenarios.
We are at an inflection point where AI will revolutionize the way we do business. The paradigms of customer, products, offerings, services and competition will change dramatically; and being AI-ready will become a true differentiator. AIQRATE will be your strategic partner to help you to prepare for what’s next in order to stay relevant.
Wish you a great 2021!
Chief Executive Officer
Bangalore , India
AI led Algorithms can decide on how we need to emote, behave, react, transact or interact with an individual – Sameer with SCIKEY
In an exclusive interaction with SCIKEY, Sameer Dhanrajani, CEO at AIQRATE Advisory & Consulting, speaks about what the future of work enabled by AI will look like, its contribution to building productive teams, and the emerging AI trends to watch out for in a post-COVID scenario.
“AI-led algorithms can decide on how we need to emote, behave, react, transact or interact with an individual,” says Sameer Dhanrajani.
Sameer is a globally recognized AI advisor, business builder, evangelist and thought leader known for his deep knowledge and strategic consulting approaches in the AI space. Sameer has consulted with several Fortune 500 global enterprises, Indian corporations, GCCs, startups, SMBs, VC/PE firms, and Academic Institutions in driving AI-led strategic transformation and innovation strategies. Sameer is a renowned author, columnist, blogger and four-time TEDx speaker. He is the author of the bestselling book – AI and Analytics: Accelerating Business Decisions.
Mr Dhanrajani, you have consulted with several Fortune 500 enterprises, GCCs and start-ups in driving AI-led strategic transformation strategies. What, according to you, are the topmost strategic considerations for a start-up to weigh when managing and accelerating business in a post-COVID world?
The unprecedented times of COVID-19 have brought the aspect of decision making under consideration. This includes tactical, strategic, and operational decision making that is crucial to make the venture more sustainable. Today the use of artificial intelligence is quite high amongst organizations. It can be used by start-up ventures and other outfits to make decisions irrespective of the area that needs decision making.
Most decisions that need to be made strategically are being passed on to artificial-intelligence-enabled interventions. The algorithm makes similar decisions based on the previous decisions taken. Algorithms can decide how we need to emote, behave, react, transact or interact with the opposite individual. This advancement in AI brings the challenge for organizations to create products and services specific to each customer through hyper-personalization and micro-segmenting. However, it can also be considered an opportunity for organizations to emerge from the pandemic with newer business models and experiences for customers. Start-ups, especially, can make use of such advancements to reinvent and rejuvenate the organizational ecosystem.
You are known for your passion for Artificial Intelligence and are the author of the bestselling book – AI and Analytics: Accelerating Business Decisions. Tell us how AI can be strategically significant while building productive teams.
My experience has led me to deal with engagements across the entire value chain of HR, ranging from hiring to engagement to incentivization, all of which have leveraged AI. It is phenomenal to see how AI can help build, engage, and sustain productive teams. AI can help in hiring through the detection of emotions, facial expressions, and tone modulations of the interviewee through computer vision and image-classification techniques.
In the creation of productive teams, AI can gauge the engagement levels of an employee. It tries to look at the various interventions made by an employee regarding their attendance, participation in virtual meetings, and propensity to ask questions and engage in conversations. It also keeps in check the number of pauses, intervals, and breaks taken by an employee. Every aspect of the employee is marked to see how productive and inclusive they are, as an individual and in teams.
What are the top five AI trends to watch out for in the post-COVID scenario of the next one year?
When it comes to AI, the first trend emerging is that AI is not a tool or a technology, but it is now being touted as a strategic imperative for any organization. This means that AI strategies will become an intrinsic part and feature of every organisation.
The second trend is the democratization of AI. There is a possibility of the emergence of an AI marketplace where virtual exchanges related to business problems, demo runs etc. can be conducted. One would actually be able to figure out which algorithm is best for them in customer experience, supply chain etc.
The third trend is that the cloud will act as a catalyst for AI proliferation. Cloud providers will increasingly enable AI companies with microservice APIs and product solutions created on the go. This means that cloud-enabled organisations will have options to see various possibilities specific to their organisation when it comes to AI-specific use cases.
The fourth trend is linked to skilling. AI today is a part of a lot of course curriculums. But what is missing is the whole aspect of how it gets applied. The new courseware will focus on how AI is implemented and adopted in the organization.
The fifth and last trend is decision-making enabled by AI, which means humans will have no option but to upskill and reskill themselves to take a more rational, pragmatic and sanguine approach. So new models and new emerging realities of decision making will emerge.
How is AI powering the future of work, and what are the critical considerations for business and tech leaders, given the rapidly changing business dynamics due to COVID?
The future of work will be about AI and what we call AI plus a set of exponential technologies. This means that every aspect of our performance, interaction and responses will be gauged at a granular level through these technologies. It indicates that we will need to keep working on how current and up to date our performance is. The future of work is an ecosystem where one particular employer cannot do it all.
This means that if learning must occur through an external player, it must come through the ecosystem of co-employees and the employer. In the future, we will not be caged as mere professionals doing our job but will be encouraged to push our boundaries to explore more at work. At the same time, transformation, innovation, and disruption will be part of the future’s performance metrics. They will become a major parameter for the organization to distinguish a mediocre employee or professional from a proficient one. This is where the onus will fall on employees to ensure that they are not just doing what is being called out, but are going beyond it to create what we call value for the organisation.
SCIKEY Market Network is a Digital Marketplace for Jobs, Work Business solutions, supported by a Professional Network and an integrated Services Ecosystem. It enables enterprises, businesses, job seekers, freelancers, and gig workers around the world. With its online events, learning certifications, assessments, ranking awards, content promotion tools, SaaS solutions for business, a global consulting ecosystem, and more, companies can get the best deals in one place.
‘SCIKEY Assured,’ a premium managed-services offering by SCIKEY, delivers the best outcomes to enterprise customers globally for talent and technology solutions delivered offshore, remotely, or on-premise. We are super-proud to be working with some of the world’s most iconic Fortune 1000 brands.
Better Work. Better Business. Better Life. Better World.
Bring in Effective Data Norms
What constitutes ‘fair use’ of data is increasingly coming under scrutiny by regulators across the world. With the digital detonation that has been unleashed in the past few years, leading to a deluge of data – organisations globally have jumped at the prospect of achieving competitive advantage through more refined data mining methods. In the race for mining every bit of data possible and using it to inform and improve algorithmic models, we have lost sight of what data we should be collecting and processing. There also seems to be a deficit of attention to what constitutes a breach and how offending parties should be identified and prosecuted for unfair use.
There’s growing rhetoric that all these questions be astutely addressed through a regulation of some form. With examples of detrimental use of data surfacing regularly, businesses, individuals and society at large are demanding an answer for exactly what data can be collected – and how it should be aggregated, stored, managed and processed.
If data is indeed the new oil, we need to have a strong understanding of what constitutes the fair use of this invaluable resource. This article attempts to highlight India’s stance on triggering regulatory measures to govern the use of data.
Importance of Data Governance
Before we try to get into what data governance should mean in the Indian context, let us first look at the definition of data governance and why it is an important field of study to wrap our heads around.
In simple terms, data governance is the framework that lays down the strategy of how data is used and managed within an organisation. Data governance leaders must stay abreast of the legal and regulatory frameworks specific to the geographies that they operate in and ensure that their organisations are compliant with the rules and regulations. A lot of their effort at present is aimed at maintaining the sanctity of organisational data and ensuring that it does not fall in the wrong hands. As such, the amount of time and effort expended on ensuring that these norms are adequately adhered to is contingent upon the risk associated with a potential breach or loss of data.
In effect, a framework of data governance is intended to ensure that a certain set of rules is applied and enforced to ensure that data is used in the right perspective within an organisation.
Data Governance in Indian Context
India is rapidly moving towards digitisation. Internet connectivity has exploded in the last few years, leading to rapid adoption of internet-enabled applications — social media, online shopping, digital wallets etc. The result of this increasing connectivity and adoption is a fast-growing digital footprint of Indian citizens. Add to this the proliferation and adoption of the Aadhaar programme, and almost every citizen now has a personal digital footprint somewhere – codified in the form of data.
With a footprint of this magnitude, there is an element of risk attached. What if this data falls in the wrong hands? What if personal data is used to manipulate citizens? What protection mechanisms do citizens have against potential overreach by stewards of the data themselves? It is time we found answers to these very pertinent questions – and data-governance regulation is how we will find comprehensive answers to these impending conversations.
Perspectives for India
The pertinent departments are mulling over a collective stand that should be taken while formulating data governance norms. For one, Indian citizens are protected by a recent Supreme Court ruling that privacy is a fundamental right. This has led to a heightened sense of urgency around arriving at a legislative framework for addressing genuine concerns around data protection and privacy, as well as cybersecurity.
As a result of these concerns, the Central government recently set up a committee of experts, led by Justice BN Srikrishna, tasked with formulating data governance norms. This committee is expected to maintain the delicate balance between protecting the privacy of citizens and fostering the growth of the digital economy simultaneously. Their initial work – legal deliberations and benchmarking activity against similar legal frameworks such as GDPR (General Data Protection Regulation) – has resulted in the identification of seven key principles around which any data protection framework needs to be built. Three of the most crucial pointers include:
1. Informed Consent: Consent is deemed to be an expression of human autonomy. While collecting personal data, it is critical that users be informed adequately about the implications of how this data is intended to be used before capturing their express consent to provide this data.
2. Data Minimisation: Data should not be collected indiscriminately. Data collected should be minimal and necessary for purposes for which the data is sought and other compatible purposes beneficial for the data subject.
3. Structured Enforcement: Enforcement of the data protection framework must be by a high-powered statutory authority with sufficient capacity. Without statutory authority, any remedial measures sought by citizens over data privacy infringement will be meaningless.
Striking the right balance between fostering an environment in which the digital economy can grow to its full potential, whilst protecting the rights of citizens is extremely difficult.
With a multitude of malafide parties today seeking to leverage personal data of citizens for malicious purposes, it is crucial that the government and the legal system set out a framework that protects the sovereignty and interests of the people. By allaying fears of misuse of data, the digital economy will grow as people become less fearful and more enthusiastically contribute information where a meaningful end outcome can be achieved.
Envisioning the future of work in the AI era
The age of Artificial Intelligence is upon us. Businesses and society are now looking towards AI for transformative outcomes. Businesses specifically are investing huge amounts of money on AI technology that will not only bring in efficiencies across multiple processes, but also unlock new revenue streams that will deliver quantum bottom-line impact. With the AI transformation playing out rapidly in our personal and professional lives, we need to deeply understand what the future of work will look like in the age of AI.
Within the business organization, there is a huge need to ramp up skill-development interventions. The traditional roles of employees in an organization are rapidly changing, as they are expected to stay in step with developments in the world of AI. Business executives now need to deeply understand the potential of Artificial Intelligence and translate it into a viable roadmap for their business. Technology leaders need to take centre stage in how their organizations adopt and harness the power of AI. The CIO is fast becoming the key custodian of the most valuable resource in business today, i.e. data. We are seeing a fast proliferation of digital evangelists and transformation officers who are charged with developing a framework within which the future of the organization will operate.
Ushering the Future of Work
On a tactical level, the burning question now is how subjects such as Data Science, Artificial Intelligence and Machine Learning can be infused into the career pathways of existing employees. How can organizations build a steady pipeline of future talent with expertise in AI? Mastery of exponential technologies (AI, cloud computing, blockchain, IoT, cybersecurity etc.) will be remarkably important for both business and technical professionals. It is critical that transformation leaders and digital evangelists are well-versed in building internal capabilities that converge around the nexus of technology competencies, managing a hybrid workforce and ensuring the adoption and dispersion of AI.
For us to usher in the future of work powered by Artificial Intelligence, we need to ensure that a few key enablers come together. We need to expand the scope of executive education and the courseware that goes with it. Next, we need to seriously consider the potential impact of shorter, tactical courses. Corporations need to augment their training programs with shorter, time-boxed courseware that can deliver instant impact for the organization. Finally, we need to reimagine multiple, personalized career pathways. We need to move away from the traditional one-size-fits-all training and deliver more tailored, fit-for-purpose and relevant education to employees. Here are the three critical interventions for the business and technology leaders to execute in order to usher in the future of work that is enabled by AI.
1. Develop New Age Skills and Competencies in AI Technology
Upgrading the technology competencies and skills of business and technology leaders and their teams is the most critical first step. With the technology landscape rapidly evolving, we need to urgently upskill the present and future workforce to ensure a quality supply of talent. We need new-age coursework in computer science that develops students' abilities in subjects such as Artificial Intelligence, Machine Learning, Deep Learning, Natural Language Processing and other AI-related concepts. On a broader scale, we also need universities and colleges to improve the workforce's existing knowledge base of AI-enabling technologies such as Cloud, DevOps and Blockchain.
At present we see a decent level of advancement in computer science training and education. However, other trades within the technical arena need to be upgraded as well. By doing so, we can ensure a wholesome, future-proof education for aspirants who wish to build their careers in the world of AI. For instance, students majoring in electronics could focus on mastering AI-enabling technologies such as GPUs and Quantum Computing. Students pursuing a specialization in mechanical engineering could achieve some sophistication in the allied subjects of robotics and 3D Printing. Subject-matter experts in industrial engineering, operations and supply chain would do well to extend their skill sets to machine learning and blockchain, creating a convergence of their interest areas and the realities of the market that will equip them with the tools to succeed in the workplace of the future.
2. Reimagining the Process of Developing New Age Technology
This intervention pertains to embedding design in the process of developing AI technology and driving its user adoption. A commonly held misconception is that the design of a product or software is restricted to its look and feel. This is simply not true. As Steve Jobs once proclaimed, "Design is not just what it looks like and feels like. Design is how it works."
For AI to live up to the hype, we need to reimagine the process by which we develop new-age technology. We need to build design into the fabric of the development and engagement process to ensure the conceived idea is brought to fruition. Transformation evangelists aiming to spearhead the future of work should treat design as the creative process that aids the development of breakthrough products.
We are already seeing the inroads that design frameworks such as Human-Centered Design and Empathy-led Design are making in the technology realm. These frameworks guide not only the development process but also the user experience of the final software or hardware, by putting the user at the center of the journey.
3. Managing the ‘People’ of the Future Workforce
As I mentioned before, the understanding of traditional roles in the future of work is changing rapidly. New roles are emerging in which data custodians and engineers who run algorithms at scale develop the technology that powers the business of the future. At the macro level, we are seeing rapid changes in the staffing paradigm as well. With the gig economy in full force, team compositions are becoming more dynamic, with individuals of varied skill sets continuously augmenting teams on a need basis. Advances in technology and management typically ordain large-scale transformation in how organizations manage their workforce.
On the micro level, increased automation is requiring managers to build and scale blended teams of humans and AI. This disruption requires a paradigm shift in how the future workforce is managed. Teams will show greater diversity and be more interdisciplinary than ever before, and managing teams, careers and coaching for improved performance will require a new set of metrics. Change evangelists need to devise these metrics, which will be imperative to how the workforce of the future is managed.
New technologies will require new approaches to project management and staffing. To ensure the supply of these critical skills, we also need courses that provide education in subjects such as people management.
Our very understanding of the workplace is being rapidly disrupted. Increasingly, a convergence of the right people, processes and technology is required to unearth insights from an exponentially growing volume of data. Turning this data into actionable intelligence that powers business processes must be the focus of business and technology leaders, as well as of the educationists who build the talent pipeline for the future. Academia must intervene urgently to provide theoretical and practical training in AI subjects to both the existing workforce and the future pipeline of talent. We also need a dispersion of the soft skills that will enable and evangelize this change. With growing interest in and appreciation of technologies and platforms around Artificial Intelligence and the Digital Workplace, organizations need to ask tough questions of themselves and consider the various forces at play. With increased AI augmentation, and the transformation of processes and people that enables it, the Future of Work requires immediate attention.
Redesigning the exponential technologies landscape with AI & Blockchain fusion
AI and blockchain are two of the prime drivers in the technology space, catalyzing the pace of innovation and demonstrating radical shifts across every industry. Each comes with its own degree of technical complexity and business implications. A fusion of the two could redesign the entire technical landscape, and its human impact, from scratch.
Blockchain has its own limitations, a mix of technological constraints and cultural influences from the financial services sector, but most of them can be overcome by AI in one way or another.
The points below give a gist of the potential that can be realized at the intersection of AI and Blockchain:
Energy consumption in mining: Mining is proven to require enormous amounts of energy and to be economically burdensome. AI has excelled at optimizing energy consumption across multiple sectors, and similar results can be expected for the blockchain. AI can dramatically reduce the costs of maintaining servers and validate potential savings that lower investments in mining hardware.
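To make this concrete, here is a minimal sketch of one such optimization: fit a simple model of power draw against hash rate across a mining fleet, then flag rigs drawing well above the trend as candidates for tuning or retirement. The telemetry figures and the 3% threshold are illustrative assumptions, not real mining data.

    # Hypothetical per-rig telemetry: hash rate (TH/s) and power draw (W)
    from sklearn.linear_model import LinearRegression
    import numpy as np

    hash_rate = np.array([[95.0], [100.0], [110.0], [104.0], [98.0]])
    power_w = np.array([3250.0, 3400.0, 3900.0, 3500.0, 3600.0])

    # Fit the fleet-wide trend of power draw vs. hash rate
    model = LinearRegression().fit(hash_rate, power_w)
    expected = model.predict(hash_rate)

    # Rigs drawing well above the trend are candidates for tuning or retirement
    for rig, (actual, pred) in enumerate(zip(power_w, expected)):
        if actual > 1.03 * pred:
            print(f"rig {rig}: {actual:.0f} W vs ~{pred:.0f} W expected -> inspect")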
Federated Learning: A blockchain like Bitcoin's grows at a steady pace of roughly 1 MB every 10 minutes, and AI offers possible remedies: blockchain pruning, new decentralized learning systems such as federated learning (sketched below), or new data-sharing techniques that make the system more efficient.
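As a rough illustration of federated learning itself, the sketch below implements a bare-bones federated-averaging loop in NumPy: each node takes a gradient step on data that never leaves it, and a coordinator averages only the resulting weights. The synthetic data and learning rate are assumptions for demonstration.

    import numpy as np

    def local_step(w, X, y, lr=0.01):
        # One gradient-descent step of linear regression on a node's private data
        grad = 2 * X.T @ (X @ w - y) / len(y)
        return w - lr * grad

    rng = np.random.default_rng(0)
    # Four nodes, each holding 50 private samples that never leave the node
    nodes = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]
    w_global = np.zeros(3)

    for _ in range(20):
        local_ws = [local_step(w_global, X, y) for X, y in nodes]  # train locally
        w_global = np.mean(local_ws, axis=0)                       # share weights only

    print(w_global)  # the shared model, learned without pooling raw data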
Security: Concerns persist about the security of the layers and applications built on Blockchain (e.g., the DAO, Bitfinex, etc.). The progress machine learning has made in the last two years makes AI a solid candidate for guaranteeing secure application deployment on the blockchain, especially given the fixed structure of the system.
Blockchain-AI Data gates: Blockchain has proven its ability for record keeping, authentication and execution, while AI drives decisions by assessing and understanding patterns and datasets, ultimately engendering autonomous interaction. Combined, the two become a data gate with characteristics that will ensure seamless interaction in the near future.
Auditing of AI through blockchain: AI is seen as a black box, a complex set of calculations and algorithms that distinguishes patterns or trends, which makes it difficult for humans to govern the choices the artificial intelligence makes in yielding results. Accountability of the AI black box is seen as the biggest challenge, given community concerns about tampering with or altering the calculations for a given input, which eventually reflects in the output generated. Blockchain innovation can readily address this challenge: implementing robust auditing of these calculations on a blockchain is seen as a major driver for enhancing the credibility of business organizations and restoring trust in the reliability of the information.
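One way to picture such auditing is a hash-chained audit trail: each record of an AI decision embeds the hash of the previous record, so any retroactive alteration breaks the chain and is detectable. The sketch below is a simplified stand-in for a real blockchain (no consensus or distribution), and the loan-decision records are hypothetical.

    import hashlib, json, time

    def add_record(chain, inputs, output):
        # Each record links to the hash of the previous one (genesis uses zeros)
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        body = {"ts": time.time(), "inputs": inputs, "output": output, "prev": prev_hash}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        chain.append(body)

    def verify(chain):
        for i, rec in enumerate(chain):
            body = {k: v for k, v in rec.items() if k != "hash"}
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != rec["hash"]:
                return False  # record altered after the fact
            if i and rec["prev"] != chain[i - 1]["hash"]:
                return False  # chain link broken
        return True

    audit = []
    add_record(audit, {"age": 42, "income": 58000}, "loan_approved")
    add_record(audit, {"age": 23, "income": 12000}, "loan_denied")
    print(verify(audit))              # True: chain intact
    audit[0]["output"] = "loan_denied"  # retroactive tampering...
    print(verify(audit))              # ...is caught: False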
Leverage on Artificial Trust: The future roadmap of this fusion can lead to virtual agents that create new ledgers by themselves. Machine-to-machine interaction will be the new norm, establishing trust as a secure way to share data and coordinate decisions, along with a robust mechanism for reaching a quorum.
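As a toy illustration of the quorum idea only, the snippet below accepts a value onto a shared ledger when a strict majority of agents endorse it. Real consensus protocols (PBFT, Raft, etc.) are far richer; the agent names and threshold here are assumptions for demonstration.

    def reaches_quorum(endorsements, total_agents):
        """True if strictly more than half of all agents endorsed the value."""
        return len(endorsements) > total_agents // 2

    endorsements = {"agent_1", "agent_3", "agent_4"}      # agents that signed off
    print(reaches_quorum(endorsements, total_agents=5))   # True: 3 of 5 agree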
Machine performance monitoring and changes: Blockchain miners (companies and individuals) pour an incredible amount of money into specialized hardware components. AI can complement this investment with machine and equipment monitoring, deploying more efficient systems and retiring unproductive, power-hungry ones.
Blockchain for better information management: AI has traditionally run on an incorporated or centralized database, where there is always a chance of a mishap: information can get lost, altered, or undermined.
The fusion of blockchain and artificial intelligence can eliminate this concern. Under the umbrella of blockchain, data is decentralized and stored across different nodes or systems, restoring trust that your information is safe and unaltered. Most importantly, the information is time-stamped and sequenced, making recovery less demanding and more exact.
Some key challenges on the block: The fusion throws open technical and ethical implications arising from the interaction between these two technologies, such as the need to edit data on a blockchain and, most importantly, the risk of the duo becoming a data hoarder. Only experimentation will provide detailed answers on these lines.
In conclusion, blockchain and AI are the two sides of the technology spectrum: AI efficiently fosters centralized intelligence, while blockchain promotes decentralized applications in an open-data environment. The fusion of the two is an intelligent way to amplify positive externalities, advance mankind and, most importantly, reap the maximum potential for business needs.
Beating Back Cyber Attacks with Analytics – A Topical Perspective
The worldwide cyber attack that began last Friday, known as “WannaCry”, has highlighted the need for governments and businesses to strengthen their security infrastructure, in addition to calling attention to the need to mandate security updates and educate lawmakers about the intricacies of cyber security.
During the WannaCry attacks, hospitals had to turn away patients, and their ability to provide care was significantly impaired. Even though the threat is widely acknowledged to be real by the information security community and anyone not living under a rock, and the stakes are higher than ever, most organizations, and almost all healthcare providers, still use old-school cybersecurity technologies and retain reactive security postures.
The WannaCry ransomware moved too quickly for most security teams to respond, but a few organizations were able to spot the early indicators and contain the ransomware before the infection spread across their networks. While it wreaked havoc across the globe, there was nothing subtle about it: all the signs of highly abnormal network behavior were there, but the pace of the attack was far beyond the capacity of human teams to contain. The latest generation of AI technology enabled those few organizations to defend their networks at the first sign of threat.
Meanwhile, threats of similar, or perhaps worse, attacks have continued to surface. This was not the big one; it was a precursor of a far worse attack that will inevitably strike, and the next attack, unfortunately, will likely have no kill switch. This is an urgent call to action for all of us to finally get the fundamentals in place so we can robustly withstand this type of crisis when the next one hits.
Modern malware is now almost exclusively polymorphic and designed to spread immediately upon intrusion into a network, infecting every sub-net and system it encounters at near real-time speed. Effective defense systems must respond to these threats in real time and take an active reconnaissance posture, seeking out attacks during the infiltration phase. Defense systems that apply artificial intelligence and advanced machine-learning techniques can now detect and eradicate these new forms of malware before they become fully capable of executing a breach, but their adoption has not matched early expectations.
As of today, the vast majority of businesses and institutions have neither adopted nor installed these systems, and they remain at high risk. The risk is exacerbated by targets increasingly involved with life-or-death outcomes, such as hospitals and medical centers. New forms of ransomware and extortionware will increasingly be aimed at high-leverage opportunities like insulin pumps, defibrillators, drug-delivery systems and operating-room robotics.
Network behavioral analytics that leverage artificial intelligence can stop malware like WannaCry and all of its strains before it forms into a breach. And new strains are coming; in fact, by the time this is published, it would not surprise me to see a similar attack in the headlines.
Analytics is Turning the Tables on Security Threats
The more comprehensive, sensitive and voluminous the end-user and customer data you store, the more tempting a target you are to someone wanting to do harm. That said, the same data attracting the threat can be used to thwart an attack. Security analytics draws on all the events, activities, actions and occurrences associated with a threat or attack:
- User: authentication and access location, access date and time, user profiles, privileges, roles, travel and business itineraries, activity behaviors, normal working hours, typical data accessed, application usage
- Device: type, software revision, security certificates, protocols
- Network: locations, destinations, date and time, new and non-standard ports, code installation, log data, activity and bandwidth
- Customer: customer database, credit/debit card numbers, purchase histories, authentication, addresses, personal data
- Content: documents, files, email, application availability, intellectual property
The more log data you amass, the greater the opportunity to detect, diagnose and protect against cyber-attacks: anomalies within the data can be identified and correlated with other events falling outside expected behaviors, indicating a potential security breach. The challenge lies in analyzing large amounts of data to uncover unexpected patterns in a timely manner, and that is where analytics comes into play.
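As a minimal sketch of that idea, the snippet below trains an unsupervised anomaly detector on log-derived features (login hour, data transferred, distinct ports touched) and flags an off-hours, high-volume session. The feature values are fabricated for illustration; a real pipeline would extract them from the log sources listed above.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(1)
    # Baseline behavior: daytime logins, modest transfers, few ports
    normal = np.column_stack([
        rng.normal(13, 2, 500),    # login hour
        rng.normal(200, 50, 500),  # MB transferred
        rng.poisson(3, 500),       # distinct ports touched
    ])
    # A 3 a.m. session moving 5 GB over many ports
    suspicious = np.array([[3, 5000, 40]])

    detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
    print(detector.predict(suspicious))  # -1 flags the event as anomalous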
Leveraging Data Science & Analytics to Catch a Thief
Using data science, organizations can monitor network and user behaviors in real time, identifying suspicious activity as it occurs. Organizations can model network, user, application and service profiles to create intelligence-driven security measures capable of quickly identifying anomalies and correlating events that indicate a threat or attack (a per-user baselining sketch follows this list):
- Traffic anomalies to, from or between data warehouses
- Suspicious activity in high value or sensitive resources of your data network
- Suspicious user behaviors such as varied access times, levels, location, information queries and destinations
- Newly installed software or different protocols used to access sensitive information
- Ports used to aggregate traffic for external offload of data
- Unauthorized or dated devices accessing a network
- Suspicious customer transactions
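For the user-behavior case in particular, a per-user baseline can be as simple as learning each user's typical access hours and flagging logins far outside that window, as in the sketch below. The sample histories and the three-sigma threshold are illustrative assumptions, not production values.

    from statistics import mean, stdev

    # Hypothetical login-hour history per user
    history = {"alice": [9, 10, 9, 11, 10, 9, 10], "bob": [22, 23, 22, 21, 23, 22, 23]}
    baseline = {u: (mean(h), stdev(h)) for u, h in history.items()}

    def is_suspicious(user, login_hour, k=3.0):
        mu, sigma = baseline[user]
        # Floor sigma so users with near-constant habits don't always alert
        return abs(login_hour - mu) > k * max(sigma, 1.0)

    print(is_suspicious("alice", 10))  # False: within Alice's normal window
    print(is_suspicious("alice", 3))   # True: 3 a.m. is far from her baseline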
Analytics can be highly effective at identifying an attack before it is fully underway, or at recommending an action to counter an attack, thus minimizing or eliminating losses. Analytics brings large data sets and timely analysis of disparate events to bear against both the smallest and the largest scale attacks.
The Analytics Solution to Security Monitoring
If security monitoring is a big data problem, then it requires an analytics solution capable of analyzing large amounts of data in real time. The natural place to look for that solution is Apache Hadoop and its ecosystem of dependent technologies. But although Hadoop does a good job of performing analytics on large amounts of data, it was developed for batch analysis, not the real-time streaming analytics required to detect security threats.
In contrast, the solution for real-time streaming analytics is Apache Storm, a free and open-source real-time computation system. Storm functions similarly to Hadoop but was developed for real-time processing. It is fast and scalable, supporting not only real-time analytics but also the machine learning needed to reduce the false positives found in security monitoring. Storm is commonly found in cloud solutions supporting antivirus programs, where large amounts of data are analyzed to identify threats with quick data processing and anomaly detection.
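Storm topologies are typically written in Java, so the sketch below is not Storm code; it is a plain-Python illustration of the kind of continuous sliding-window computation such a system parallelizes across a cluster: keep a rolling window of recent traffic and alert on a sudden spike. The window size, spike multiplier and traffic figures are made-up assumptions.

    from collections import deque

    WINDOW = 60                       # keep the last 60 observations
    window = deque(maxlen=WINDOW)

    def on_event(bytes_out):
        """Called once per incoming traffic event, as a stream processor would."""
        window.append(bytes_out)
        if len(window) == WINDOW:
            avg = sum(window) / WINDOW
            if bytes_out > 10 * avg:  # sudden spike vs. recent history
                print(f"ALERT: {bytes_out} bytes vs {avg:.0f} avg -> possible exfiltration")

    # Steady traffic followed by one abrupt burst
    for event in [120] * 60 + [15000]:
        on_event(event)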
The key is real-time analysis. Big data contains the activities and events signaling a potential threat, but it takes real-time analytics to make it an effective security tool, and the statistical analysis of data science tools to prevent security breaches.
When do you need to start? – Yesterday
Yesterday would have been a good time for companies and institutions to arm themselves against this pandemic. Tomorrow will be too late.