Training Programs Can Enhance The Skills And Employability Of The Existing IT Workforce
A great deal of research and commentary has been published about the exponential growth of emerging technologies – AI, blockchain, RPA, cybersecurity, IoT, AR/VR. There is no doubt that these technologies will help catapult the next growth spurt of enterprises. Professionals across the information technology landscape are queuing up to upskill and reskill in these areas to remain relevant in the workforce of tomorrow. But for those who already have learning, skills and experience in these emerging technologies, how can they take their careers to the next level?
The flux of deployment-ready, value-generating use cases across industries suggests that cross-technology expertise spanning these emerging areas will be the next big source of career growth for incumbent professionals. A few years ago, we witnessed software developers keen to reinvent themselves as full-stack developers. High-performing technologists wanted to develop proficiency across the software architecture and development life cycle – from database to UI, and from infrastructure set-up to deployment. Similarly, IT professionals today should seek the synergistic benefits of combining areas across emerging technologies. This article focuses on the emerging technology areas that enterprises are effectively combining.
AI + Blockchain
Artificial intelligence is the set of technologies that help machines mimic human functions. Blockchain is an emergent technology paradigm that helps build a distributed, immutable sequence of financial events and transactions. With a strong uptick in the spread of blockchain use cases, enterprises are also looking for a robust way to surface potential fraud and other anomalous events in real-time. Fraud attempts and security threats to blockchain systems often unfold at very high speed and require immediate analysis to ensure that perceived anomalies are rooted out. Using AI, specifically machine learning, we can rapidly parse a log of events to find anomalous situations and flag them in real-time, protecting the integrity of the blockchain.
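To make this concrete, here is a minimal, illustrative sketch of statistical anomaly detection over a transaction log – plain Python with invented numbers, flagging amounts that deviate sharply from the rest. A real fraud-detection system would use richer features and a trained machine learning model; this only shows the shape of the idea.

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Flag indices of transaction amounts more than `threshold`
    standard deviations away from the mean of the log."""
    mu = mean(amounts)
    sigma = stdev(amounts)
    return [i for i, a in enumerate(amounts) if abs(a - mu) > threshold * sigma]

# A hypothetical log of transfer amounts with one suspicious outlier
log = [102, 98, 101, 99, 100, 97, 103, 5000]
print(flag_anomalies(log))  # the outlier at index 7 is flagged
```

In practice such a check would run continuously against the incoming event stream, with flagged events routed to analysts or automated responses.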
AI + IoT
The Internet of Things is the network of physical devices that exchange data. This very definition makes the case for combining artificial intelligence and IoT plainly clear. IoT-enabled sensors are usually a source of voluminous data – depending on the use case – which is increasingly sent to a central controlling server in real-time, rather than in batches. Picking out key inferences from voluminous data sent in real-time by numerous sensors is a task that is again handed over to AI systems – typically machine learning systems. While IoT systems can ably sense, transmit and store data, ML systems are required to make sense of the data, advise whether any action needs to be taken, and potentially even suggest the best action to take.
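As an illustration, a toy version of this sense-and-flag loop can be written in a few lines – hypothetical temperature readings checked against a rolling baseline. Production systems would use trained models and proper streaming infrastructure; the readings, window and tolerance here are all invented.

```python
def rolling_alerts(readings, window=5, tolerance=10.0):
    """Compare each new sensor reading against the mean of the
    previous `window` readings; collect an alert whenever it
    deviates by more than `tolerance` units."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if abs(readings[i] - baseline) > tolerance:
            alerts.append((i, readings[i]))
    return alerts

# Simulated temperature stream with one abrupt spike
temps = [21.0, 21.5, 21.2, 20.8, 21.1, 21.3, 45.0, 21.0]
print(rolling_alerts(temps))  # [(6, 45.0)]
```

The same pattern – compare new data against learned normal behavior, act only on deviations – underlies far more sophisticated ML-driven IoT monitoring.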
IoT + Smart Cities
While it may seem that the Smart City is a sub-segment of IoT, it is actually the other way around: IoT is only one of the components powering a smart city. In addition to IoT, the Smart City stack would typically include cloud (for running processes and storing data), artificial intelligence (for data analysis and learning) and an element of urban planning (for deciding the what and how of a Smart City design). Combining knowledge of IoT with these ancillary areas can help IoT professionals become valuable, irreplaceable resources in this fast-growing technology area.
AI + Behavioral Sciences
This final combination may sound surprising, but one of the most valuable and high-impact groupings of skills might just be the combination of data science and behavioral science. While AI and data science can provide the what ('What happened?' and 'What should we do now?') of a business scenario, behavioral sciences inform the how. Consider Amazon, which offers numerous examples of behavioral sciences and AI coming together. Its recommendation engine uses artificial intelligence to answer the what of 'what are other people buying'. But the insight that such recommendations drive continued stickiness on the website, showcase the breadth of SKUs available, and promote product bundling and larger cart sizes is a clear behavioral-sciences intervention – and it contributes massively to the success of the AI application.
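The 'what are other people buying' question can be illustrated with a simple co-occurrence count – a deliberately simplified sketch with invented order data, not Amazon's actual engine, which relies on far more sophisticated collaborative filtering.

```python
from collections import Counter

def also_bought(orders, item, top_n=2):
    """Count how often other items appear in the same order as
    `item`, and return the most frequent companions."""
    counts = Counter()
    for order in orders:
        if item in order:
            counts.update(i for i in order if i != item)
    return [name for name, _ in counts.most_common(top_n)]

# Invented order history -- not real purchase data
orders = [
    ["phone", "case", "charger"],
    ["phone", "case"],
    ["phone", "charger"],
    ["phone", "case"],
    ["laptop", "mouse"],
]
print(also_bought(orders, "phone"))  # ['case', 'charger']
```

The behavioral-science layer then decides how and where those companions are surfaced – bundles, carousels, cart-page nudges – which is what turns a raw prediction into larger cart sizes.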
If you are a professional already conversant with one of the emerging technology areas, combining it with another can be hugely beneficial. IT professionals should strongly consider leveraging synergies across multiple technology areas – which can help them become better-rounded, high-value practitioners in an ever-evolving technology landscape.
Getting Started With A Career In Artificial Intelligence: Quick Primer
The last few years have seen Artificial Intelligence capture the imagination of corporate executives and catapult into the mainstream of the business world. With a myriad, ever-expanding set of applications, AI promises a quantum leap in enterprise efficiency, profitability, and competitiveness. Thanks to decreasing storage costs and increasingly efficient algorithms running atop chipsets more powerful than ever before, AI is witnessing a huge surge in interest and applicability. As companies rush to co-opt AI into their processes, practitioners of this technology are in high demand – demand that easily outstrips high-quality supply. With demand soaring and supply struggling to catch up, it is natural for professionals of the current and future workforce to ask: how does one get started with a career in AI?
It is important to set some context. Before answering how one can get started, it is important to first define AI. A simple yet comprehensive working definition is: the ability of machines to mimic human intelligence and functions. Going one step downstream, building a truly artificially intelligent machine means equipping it with the ability to sense and comprehend 'stimuli' in its environment, identify and weigh options for responding to those stimuli, perform the chosen action, and continuously learn from the impact of that action in a way that informs future decision-making.
Parsing this definition further, AI happens at the intersection of data (represented as the stimulus provided and the feedback loops for learning), mathematics (represented through models which weigh up decision-points and payoffs for each prospective action) and computer science (the technical and logical backbone that governs the flow of data and codifies potential action points). These are the three key ingredients of building powerful AI – and the three areas aspirants to this industry need to master.
Let us double-click on these three areas to understand their criticality to AI systems, and how the workforce can build competencies in each area.
While we can split hairs over the appropriate terminology (some prefer to call it Data Science, while others call it Data Engineering – depending on how teams are structured), it is important to focus more on the nature of the skill required in the AI arena.
Data skills encompass the entire range of tasks associated with data management for AI – the collection, sorting, storage, and extraction of data for meaningful use. It is data that fuels the growth of an AI application, and therefore the ability to sense incoming data, identify patterns therein and make informed decisions is a crucial building block for a career in AI.
Given the criticality of core data skills, it is not surprising to see data-literate employees – analytics professionals and data engineers – try their hand at reinventing their careers in this domain. Those without a background in either discipline should start with courses in business analytics, to understand how businesses slice and dice data to inform their decision-making. Those with some background in computer science should upskill in data engineering, i.e. how to leverage emerging concepts in database management to improve the storage, management, and extraction of data feeding AI applications in the most efficient manner. Alternatively, computer engineers could also learn business analytics to understand the applications and implications of data for business decisions.
Put simply, numeracy is the ability to work with numbers and mathematical concepts. This is the second key ingredient for a successful career in AI. As I previously mentioned, a key building block of AI is the ability to weigh multiple options, probabilities, and payoffs in order to take the most optimal decision. These are essentially the mathematical concepts of inference, probability, decision trees and game theory – and fine-tuning these skills is a critical part of building a great career in AI.
Developing advanced numeracy skills is a natural option for those who are mathematically inclined and formally educated in the field. Those without such an education can rely on the numerous online courses that teach statistics and probability before moving on to more advanced concepts. The takeaway from your education in numeracy should be the ability to formulate optimal pathways to decisions: identifying and accurately scoring multiple options, suggesting responses, and continuously refining the mathematical model through a feedback loop based on the results of the responses delivered.
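A tiny worked example of this kind of reasoning – comparing actions by expected payoff. The probabilities and payoffs below are purely illustrative; real decision models are far richer, but the underlying arithmetic is the same.

```python
def best_option(options):
    """Pick the option with the highest expected payoff, where each
    option maps to a list of (probability, payoff) outcome pairs."""
    def expected(payoffs):
        return sum(p * v for p, v in payoffs)
    return max(options, key=lambda name: expected(options[name]))

# Illustrative numbers: expected values are 0.6*100 + 0.4*(-50) = 40
# for launching now, versus 0.9*30 + 0.1*0 = 27 for waiting.
options = {
    "launch_now": [(0.6, 100), (0.4, -50)],
    "wait":       [(0.9, 30), (0.1, 0)],
}
print(best_option(options))  # launch_now
```

Feeding observed results back into those probabilities over time is exactly the feedback loop described above.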
The final piece is to ramp up existing computer science skills to align with the needs of AI application development. There are two sub-areas at play here, namely – conceptualizing the logic (algorithms) and writing the language (code). Computer science provides the fundamental backbone required for improving the scalability and resilience of AI applications. It dictates how the data is operationalized and provides the logical base for mathematical models to process the data.
Python and R are two widely accepted languages in the field of AI. Because much of the existing development in this domain has been done in these two languages, they offer rich libraries that are a key starting point for AI applications. Those with a strong inclination toward and education in programming are well advised to pick up online courses that provide hands-on skills in these two languages. Computer scientists well-versed in both can also consider expanding into numeracy skills – the two work well in tandem and together offer much better job opportunities in AI.
Like AI itself, a career in AI requires a commitment to continuous learning. This field, like any other emerging field, is evolving rapidly, with new models and applications coming to light almost every day. While mastery of the above three skills is a good start, staying constantly updated is essential to staying ahead of the curve. One way to do that is to follow research papers published in the field. It is equally important to keep an eye on the business end and stay updated on emerging use cases.
It Is Time To Shape The Future Of Education
Technology proliferation and changing socio-economic factors are ushering in tumultuous change in the old paradigm of work. Today, with the advent of the 'gig economy' – a collective talent marketplace of an independent workforce working on recurring short-term assignments – we are at the definitive cusp of a new workforce reality: working professionals will not only change jobs but make multiple career switches, while being expected to continuously unlearn and relearn skills along the way.
The future of work is here. It is time to shape the future of education.
According to multiple studies, success in the gig economy will center on three competencies, which I call the 3 C's – Creativity, Curiosity, and Collaboration. While children are naturally curious and creative, it is more important than ever for academia to nurture and sharpen these two qualities – while adding the core competency of collaboration – by embedding them in teaching methods.
I strongly believe that education will undergo a fundamental change in the time to come. Here's my take on the future of education:
According to a World Economic Forum report, there is enormous potential to improve the social and emotional skills of students by incorporating play into their education – which in turn can boost their collaborative skills and drive curiosity. Developing these skills will require three types of games:
Role-playing Games – creating a narrative arc through a sequence of events and giving students a variety of options for interacting with the game through their characters. Role-playing games also allow students to explore multiple paths and revisit previously explored choices and experiences.
Strategy Games – multiple students partaking in a quest to manage the strategic planning and deployment of scarce resources.
Sandbox Games – focusing on open-ended exploration, resourcefulness and initiative among a group of players working toward a shared goal.
It may no longer be productive to attend 3–4 years of graduate school followed by post-graduate education. In the gig economy, students and corporates will unlock the shared benefits of skills-centric learning, followed by a stint in the workplace, before going back to school to acquire new skills. This will reduce the time and cost of learning while helping students apply their skills in the workforce and gain much-needed hands-on experience. Seeing their classroom learnings in action will also spur their curiosity to learn more and do more in the future.
Increased Mobility Between Institutes
While our generation uses MOOCs to supplement our education, MOOCs will become mainstream for future generations. MOOCs are a wonderful complement to the natural curiosity of students, while helping institutions extend their curricula into subjects they currently lack the capacity to address. MOOCs will also become more social and collaborative, encouraging students to learn with each other and improve their overall performance.
We will also see a rise in Virtual Reality (VR) and robots in the classroom. VR will help create more immersive learning experiences for students, stoking their natural curiosity to learn. Robots, on the other hand, will take the scut work off teachers' plates – providing input on skills assessments, personalized curriculum pathways, and attention tracking – allowing teachers to focus more on coaching and mentoring.
The future of education will define how our next generations shape up and succeed in the workplace. It is critical that we understand the value of developing the 3C’s – Creativity, Curiosity, and Collaboration – from an early age so that our next generation can achieve their full potential and value in the workforce.
How Rise of Exponential Technologies – AI, RPA, Blockchain, Cybersecurity will Redefine Talent Demand & Supply Landscape
The current boom in exponential technologies is causing strong disruption in the talent availability landscape, with traditional, more mechanical roles being wiped out, paving the way for huge demand for learning- and design-thinking-based skills and professions. The World Economic Forum said in 2016 that 60% of children entering school today will work in jobs that do not yet exist.
While there is a risk to jobs due to these trends, the good news is that a huge number of new jobs are getting created as well in areas like AI, Machine Learning, Robotic Process Automation (RPA), Blockchain, Cybersecurity, etc. It is clearly a time of career pivot for IT professionals to make sure they are where the growth is.
AI and Machine Learning upending the traditional IT Skill Requirement
AI and Machine Learning will create new demand for skills to guide their growth and development. These emerging areas of expertise will likely be technical or knowledge-intensive fields. In the near term, competition for workers in these areas may change how companies focus their talent strategies.
At a time when demand for data scientists and engineers is projected to grow 39% by 2020, employers are seeking out leaders who can work effectively with technologists to ask the right questions and apply the insights to solve business problems. Business schools are hence launching more programs to equip graduates with the skills they need to succeed. Toronto's Rotman School of Management, for example, last week launched a nine-month program that provides recent college graduates with advanced data management, analytical and communication skills.
According to the Organization of Economic Cooperation and Development, only 5-10% of labor would be displaced by intelligent automation, and new job creation will offset losses.
The future will increase the value of workers with a strong learning ability and strength in human interaction. On the other hand, today’s highly paid, experienced, and skilled knowledge workers may be at risk of losing their jobs to automation.
Many occupations that might appear to require experience and judgment — such as commodity traders — are being outdone by increasingly sophisticated machine-learning programs capable of quickly teasing subtle patterns out of large volumes of data. If your job involves distracting a patient while delivering an injection, guessing whether a crying baby wants a bottle or a diaper change, or expressing sympathy to calm an irate customer, you needn’t worry that a robot will take your job, at least for the foreseeable future.
Ironically, the best qualities for tomorrow’s worker may be the strengths usually associated with children. Learning has been at the centre of the new revival of AI. But the best learners in the universe, by far, are still human children. At first, it was thought that the quintessential preoccupations of the officially smart few, like playing chess or proving theorems — the corridas of nerd machismo — would prove to be hardest for computers. In fact, they turn out to be easy. Things every dummy can do like recognizing objects or picking them up are much harder. And it turns out to be much easier to simulate the reasoning of a highly trained adult expert than to mimic the ordinary learning of every baby. The emphasis on learning is a key change from previous decades and rounds of automation.
According to Pew Research, 47% of all employment opportunities will be occupied by machines within the next two decades.
What types of skills will be needed to fuel the development of AI over the next several years? These prospects include:
- Ethics: The only clear “new” job category is that of AI ethicist, a role that will manage the risks and liabilities associated with AI, as well as transparency requirements. Such a role might be imagined as a cross between a data scientist and a compliance officer.
- AI Training: Machine learning will require companies to invest in personnel capable of training AI models successfully and then managing their operation – requiring deep expertise in data science, and often an advanced business degree.
- Internet of Things (IoT): Strong demand is anticipated for individuals to support the emerging IoT, which will require electrical engineering, radio propagation, and network infrastructure skills at a minimum, plus specific skills related to AI and IoT.
- Data Science: Current shortages for data scientists and individuals with skills associated with human/machine parity will likely continue.
- Additional Skill Areas: Related to emerging fields of expertise are a number of specific skills, many of which overlap various fields of expertise. Examples of potentially high-demand skills include modeling, computational intelligence, machine learning, mathematics, psychology, linguistics, and neuroscience.
In addition to its effect on traditional knowledge workers and skilled positions, AI may influence another aspect of the workplace: gender diversity. Men hold 97 percent of the 2.5 million U.S. construction and carpentry jobs. These male workers stand more than a 70 percent chance of being replaced by robotic workers. By contrast, women hold 93 percent of registered nurse positions, and their risk of obsolescence is vanishingly small: 0.009 percent.
RPA disrupting the traditional computing jobs significantly
RPA is not true AI. RPA uses traditional computing technology to drive its decisions and responses, but it does this on a scale large and fast enough to roughly mimic the human perspective. AI, on the other hand, applies machine and deep learning capabilities to go beyond massive computing to understand, learn, and advance its competency without human direction or intervention — a truly intelligent capability. RPA is delivering more near-term impact, but the future may be shaped by more advanced applications of true AI.
In 2016, a KPMG study estimated that 100 million global knowledge workers could be affected by robotic process automation by 2025.
The first reaction would be that in the back office and the middle office, all roles currently handling repetitive tasks would become redundant. 47% of all American job functions could be automated within 20 years, according to a 2013 report from the Oxford Martin School.
Indeed, India's IT services industry is set to lose 6.4 lakh (640,000) low-skilled positions to automation by 2021, according to U.S.-based HfS Research. This is mainly because there are a large number of non-customer-facing, low-skill roles in countries like India, with a significant amount of "back office" processing and IT support work likely to be automated and consolidated across a smaller number of workers.
Automation threatens 69% of jobs in India and 77% in China, according to World Bank research.
Job displacement would be the eventual outcome; however, several other situations and dimensions need to be factored in. Effective automation with the help of AI should create new roles and opportunities hitherto not experienced. Those who currently possess traditional programming skills must rapidly acquire new capabilities in machine learning and develop an understanding of RPA and its integration with multiple systems. Unlike traditional IT applications, planning and implementation can be done in small batches over shorter spans of time, so software developers will have to reorient themselves.
For those entering the workforce for the first time, there will be demand for talent with traditional programming skills along with the skills to develop or customise RPA frameworks. For those joining business process outsourcing functions, it will be important to develop capability in data interpretation and analysis, as entry-level recruitment will increasingly be for such skills and not just for communication or transaction-handling skills.
Blockchain – A Blue Ocean of a New Kind of Financial Industry Skillset
A technology as revolutionary as blockchain will undoubtedly have a major impact on the financial services landscape. Many herald blockchain for its potential to demystify the complex financial services industry while reducing costs and improving transparency, easing the regulatory burden on the industry. But despite its potential role as a precursor to extending financial services to the unbanked, many fear that its effect on the industry may have more cons than pros.
30–60% of jobs could be rendered redundant by the simple fact that, using blockchain, people are able to share data securely through a common record.
Industries including payments, banking, security and more will all feel the impact of the growing adoption of this technology. Jobs potentially in jeopardy include those involving tasks such as processing and reconciling transactions and verifying documentation. Profit centers that leverage financial inefficiencies will be stressed. Companies will lose their value proposition and a loss of sustainable jobs will follow. The introduction of blockchain to the finance industry is similar to the effect of robotics in manufacturing: change in the way we do things, leading to fewer jobs, is inevitable.
Nevertheless, the nature of such jobs is likely to evolve. While blockchain creates an immutable record that is resistant to tampering, fraud may still occur at any stage in the process – but it will be captured in the record and thus easily detected. This is where we can predict new job opportunities: there could be a whole class of professions around encryption and identity protection.
So far, the number of jobs created by the industry appears to exceed the number of available professionals qualified to fill them, but some aren't convinced this trend will continue. Still, study of the potential impact of blockchain tech on jobs has been largely qualitative to date. Aite Group released a report that found the largest employers in the blockchain industry each employ about 100 people.
The Best Practices For Internet Of Things Analytics
In most ways, Internet of Things analytics are like any other analytics. However, the need to distribute some IoT analytics to edge sites, and to use some technologies not commonly employed elsewhere, requires business intelligence and analytics leaders to adopt new best practices and software.
Analytics vendors face certain prominent challenges in building an IoT capability. IoT analytics use most of the same algorithms and tools as other kinds of advanced analytics, but a few techniques occur much more often in IoT analytics, and many analytics professionals have limited or no expertise in them. Analytics leaders are struggling to understand where to start with Internet of Things (IoT) analytics; they are not even sure what technologies are needed.
The advent of IoT also leads to the collection of raw data on a massive scale. IoT analytics that run in the cloud or in corporate data centers are the most similar to other analytics practices. The major differences appear at the "edge" – in factories, connected vehicles, connected homes and other distributed sites. The staple inputs for IoT analytics are streams of sensor data from machines, medical devices, environmental sensors and other physical entities. Processing this data efficiently and in a timely manner sometimes requires event stream processing platforms, time series database management systems and specialized analytical algorithms. It also requires attention to security, communication, data storage, application integration, governance and other considerations beyond analytics. Hence it is imperative to evolve toward edge analytics and distribute the data processing load accordingly.
Hence, some IoT analytics applications have to be distributed to "edge" sites, which makes them harder to deploy, manage and maintain. Many analytics and data science practitioners lack expertise in streaming analytics, time series data management and the other technologies used in IoT analytics.
Some visions of the IoT describe a simplistic scenario in which devices and gateways at the edge send all sensor data to the cloud, where the analytic processing is executed, and there are further indirect connections to traditional back-end enterprise applications. However, this describes only some IoT scenarios. In many others, analytical applications in servers, gateways, smart routers and devices process the sensor data near where it is generated — in factories, power plants, oil platforms, airplanes, ships, homes and so on. In these cases, only subsets of conditioned sensor data, or intermediate results (such as complex events) calculated from sensor data, are uploaded to the cloud or corporate data centers for processing by centralized analytics and other applications.
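A simplified sketch of this edge-side conditioning: forward a periodic heartbeat plus any limit breach, instead of shipping the full sensor stream upstream. The limit, reporting interval and readings below are invented for illustration; a real gateway would apply far richer filtering and event detection.

```python
def condition_at_edge(readings, limit=80.0, every=10):
    """Edge-side conditioning: keep every Nth reading as a
    heartbeat, plus any reading that breaches a limit, so only
    a small subset is uploaded to the cloud."""
    upload = []
    for i, r in enumerate(readings):
        if r > limit or i % every == 0:
            upload.append((i, r))
    return upload

# 25 routine readings with one breach worth reporting immediately
raw = [70.0] * 25
raw[13] = 95.0
print(condition_at_edge(raw))  # [(0, 70.0), (10, 70.0), (13, 95.0), (20, 70.0)]
```

Here 25 raw readings collapse to four uploaded records – the kind of reduction that makes constrained wide-area links workable.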
The design and development of IoT analytics — the model building — should generally be done in the cloud or in corporate data centers. However, analytics leaders need to distribute runtime analytics that serve local needs to edge sites. For certain IoT analytical applications, they will need to acquire, and learn how to use, new software tools that provide features not previously required by their analytics programs. These scenarios consequently give us the following best practices to be kept in mind:
Develop Most Analytical Models in the Cloud or at a Centralized Corporate Site
When analytics are applied to operational decision-making, as in most IoT applications, they are usually implemented in a two-stage process. In the first stage, data scientists study the business problem and evaluate historical data to build analytical models, prepare data discovery applications or specify report templates. The work is interactive and iterative.
A second stage occurs after models are deployed into operational parts of the business. New data from sensors, business applications or other sources is fed into the models on a recurring basis. If it is a reporting application, a new report is generated, perhaps every night or every week (or every hour, month or quarter). If it is a data discovery application, the new data is made available to decision makers, along with formatted displays and predefined key performance indicators and measures. If it is a predictive or prescriptive analytic application, new data is run through a scoring service or other model to generate information for decision making.
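The two stages can be sketched with a deliberately trivial model – a least-squares line fitted offline on historical data (stage one) and then frozen for runtime scoring of new records (stage two). Real deployments would use proper ML tooling and model registries; the data here is invented.

```python
def fit(xs, ys):
    """Stage 1 (offline): fit a least-squares line through
    historical (x, y) points, returning (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def score(model, x):
    """Stage 2 (runtime): apply the frozen model to a new input."""
    slope, intercept = model
    return slope * x + intercept

model = fit([1, 2, 3, 4], [2, 4, 6, 8])   # historical data, fitted centrally
print(score(model, 5))                     # 10.0 -- scored at runtime
```

The key design point is the separation: fitting is interactive and data-hungry, so it runs centrally; scoring is cheap and repetitive, so it can run wherever decisions are made, including at the edge.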
The first stage is almost always implemented centrally, because model building typically requires data from multiple locations for training and testing purposes. It is easier, and usually less expensive, to consolidate and store all this data centrally. It is also less expensive to provision advanced analytics and BI platforms in the cloud or at one or two central corporate sites than to license them for multiple distributed locations.
The second stage — calculating information for operational decision making — may run either at the edge or centrally in the cloud or a corporate data center. Analytics are run centrally if they support strategic, tactical or operational activities that will be carried out at corporate headquarters, at another edge location, or at a business partner’s or customer’s site.
Distribute the Runtime Portion of Locally Focused IoT Analytics to Edge Sites
Some IoT analytics applications need to be distributed so that processing can take place in devices, control systems, servers or smart routers at the sites where sensor data is generated. First, this ensures the edge location stays in operation even when the corporate cloud service is down. Second, wide-area communication is generally too slow for analytics that support time-sensitive industrial control systems.
Third, transmitting all sensor data to a corporate or cloud data center may be impractical or impossible if the volume of data is high or if reliable, high-bandwidth networks are unavailable. It is more practical to filter, condition and do analytic processing partly or entirely at the site where the data is generated.
Train Analytics Staff and Acquire Software Tools to Address Gaps in IoT-Related Analytics Capabilities
Most IoT analytical applications use the same advanced analytics platforms and data discovery tools as other kinds of business applications. The principles and algorithms are largely similar. Graphical dashboards, tabular reports, data discovery, regression, neural networks, optimization algorithms and many other techniques found in marketing, finance, customer relationship management and advanced analytics applications also cover most aspects of IoT analytics.
However, a few aspects of analytics occur much more often in the IoT than elsewhere, and many analytics professionals have limited or no expertise in these. For example, some IoT applications use event stream processing platforms to process sensor data in near real time. Event streams are time series data, so they are stored most efficiently in databases (typically column stores) that are designed especially for this purpose, in contrast to the relational databases that dominate traditional analytics. Some IoT analytics are also used to support decision automation scenarios in which an IoT application generates control signals that trigger actuators in physical devices — a concept outside the realm of traditional analytics.
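A minimal illustration of the windowed aggregation an event stream processor performs over time series data – fixed time buckets averaged before results move downstream. The timestamps and window size are invented; real platforms handle out-of-order events, watermarks and much higher volumes.

```python
from collections import defaultdict

def window_averages(events, window_s=60):
    """Bucket (timestamp_seconds, value) events into fixed time
    windows and average each bucket -- a basic stream aggregation."""
    buckets = defaultdict(list)
    for ts, value in events:
        buckets[ts // window_s].append(value)
    return {w: sum(v) / len(v) for w, v in sorted(buckets.items())}

# Simulated sensor events spanning two one-minute windows
stream = [(0, 10.0), (30, 12.0), (61, 20.0), (90, 22.0)]
print(window_averages(stream))  # {0: 11.0, 1: 21.0}
```

Storing these window keys and aggregates in a column-oriented time series store, rather than a row-oriented relational database, is exactly the specialization the paragraph above describes.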
In many cases, companies will need to acquire new software tools to handle these requirements. Business analytics teams need to monitor and manage their edge analytics to ensure they are running properly and determine when analytic models should be tuned or replaced.
Increased Growth, if not Competitive Advantage
The huge volume and velocity of IoT data will undoubtedly put new levels of strain on networks, and the increasing number of real-time IoT apps will create performance and latency issues. It is important to reduce end-to-end latency for machine-to-machine interactions to single-digit milliseconds. Following these best practices for IoT analytics will ensure increased efficiency of output at reduced cost. That alone may not be sufficient to define a competitive strategy, but as more and more players adopt IoT as mainstream, the race will be to scale and grow as quickly as possible.