One of the most critical responsibilities of an IT professional is to handle incoming alerts efficiently and effectively. Doing so helps keep the environment threat-free and reduces the chances of system outages. Not all incoming alerts are critical; an alert can pop up in a window for a user to act on, blocking the underlying webpage. Settings can also be configured for automatic alert resolution, where an alert is closed automatically after a set number of days.
Can automation manage system alerts?
Many companies are gradually incorporating automation into the management of system alerts. The age-old approach of monitoring systems for internal and external alerts is not effective at streamlining the actual process of managing the incoming stream. Here, IT process automation (ITPA) can take incident management to a whole new level. Automation, working with monitoring tools, can identify, analyze and prioritize incoming alerts while sending notifications to fix the issue. These notifications can be customized according to the recipient's preferred channel. It is also worth mentioning that automated workflows can be created to open, update and close tickets in the service desk, minimizing human intervention while resolving issues electronically.
Integration of a monitoring system with automation
Automation of system alerts follows the workflow below. It greatly improves incident management, reducing human intervention and refining the quality of monitoring.
The monitoring system detects an incident within the IT infrastructure and triggers an alert.
The alert is picked up by the automation software, which generates a trouble ticket in the service desk.
The affected users are notified via their preferred method of communication.
The ITPA tool notifies the network admin to address the issue and recover the service.
The service ticket is updated automatically as the issue progresses.
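The steps above can be sketched as a small alert-to-ticket pipeline. This is only an illustrative model with hypothetical types and names (Alert, Ticket, the example addresses), not the API of any specific ITPA product.

```typescript
// Hypothetical sketch of an automated alert-to-ticket workflow.
type Severity = "low" | "medium" | "high";

interface Alert { id: string; source: string; message: string; severity: Severity; }
interface Ticket { id: string; alertId: string; status: "open" | "in_progress" | "resolved"; notified: string[]; }

let nextTicket = 1;

// Step 2: automation software turns the alert into a service-desk ticket.
function openTicket(alert: Alert): Ticket {
  return { id: `T-${nextTicket++}`, alertId: alert.id, status: "open", notified: [] };
}

// Steps 3-4: notify affected users and the network admin via their preferred channel.
function notify(ticket: Ticket, recipients: string[]): Ticket {
  return { ...ticket, notified: [...ticket.notified, ...recipients] };
}

// Step 5: the ticket is updated automatically as the issue is worked and resolved.
function updateStatus(ticket: Ticket, status: Ticket["status"]): Ticket {
  return { ...ticket, status };
}

// Example run for a single high-severity alert:
const alert: Alert = { id: "A-42", source: "db-server-1", message: "disk usage 95%", severity: "high" };
let ticket = openTicket(alert);
ticket = notify(ticket, ["app-team@example.com", "netadmin@example.com"]);
ticket = updateStatus(ticket, "resolved");
```

In a real deployment, each step would call the monitoring tool, the service desk and the notification channels; the point here is only the hand-off order between them.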
Benefits of automation to manage system alerts
Relying on a manually performed process, especially while dealing with critical information in a workflow, can be difficult. In such a scenario, automated monitoring of critical data in business systems such as accounting, CRM, ERP or warehousing can improve consistency. It can also recognize significant or critical data changes and immediately trigger notifications for them. With this 360-degree visibility of critical information, decisions can be made much faster, which in the long run can forestall serious crises. It also improves the company's overall performance and customer service, and reduces the financial risk arising from anomalies and security threats. Hence, automation of system alerts can effectively reduce response and resolution times. It can also lessen system downtime and reduce MTTR (mean time to repair).
BPA platform’s role to manage system alerts
The business process automation (BPA) platform enables multi-recipient capabilities so that notifications can be sent to employees across different verticals. This increases their visibility into real-time information relevant to their organizational role. The platform also provides escalation capabilities, where a notification is sent to higher management if an alert is not addressed on time.
For large-scale organizations, the number of alerts spotted by detection tools is growing over time. This has pushed IT enterprises to automate security control configurations and implement responsive security analysis tasks. Through automation of security controls and processes, a new firewall rule can be created or deleted automatically based on alerts: once a threat is detected, an automated response is triggered. We can conclude that automation can manage system alerts efficiently and effectively, and a pre-built workflow often helps to jump-start the automation of alert handling.
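The alert-driven firewall-rule idea above can be sketched as a simple mapping from alert to rule. The alert kinds, IP and rule shape below are hypothetical stand-ins; a real system would push the rule to an actual firewall API.

```typescript
// Hypothetical sketch: deriving a firewall block rule from a security alert.
interface SecurityAlert { sourceIp: string; kind: "port_scan" | "brute_force" | "benign"; }
interface FirewallRule { action: "deny"; ip: string; reason: string; }

function ruleForAlert(a: SecurityAlert): FirewallRule | null {
  if (a.kind === "benign") return null; // nothing to block
  // Threat detected: automated response is a deny rule for the offending source.
  return { action: "deny", ip: a.sourceIp, reason: a.kind };
}

const rule = ruleForAlert({ sourceIp: "203.0.113.7", kind: "brute_force" });
```

The inverse path, deleting the rule when the alert is resolved, would follow the same pattern in reverse.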
There has been gigantic growth in AIOps over the last two years. It has successfully transitioned from an emergent category to an inevitability. Companies have adopted AIOps to automate and improve IT operations by applying big data and machine learning (ML). Adoption of these technologies has also compelled IT operations to adapt to multi-cloud infrastructure. According to Infoholic Research, the AIOps market is expected to grow at a CAGR of 33.08% during the forecast period 2018–2024.
What is AIOps?
AIOps stands for Artificial Intelligence for IT Operations. Combining big data and ML, an AIOps platform improves IT operations and automates or augments tasks including availability tracking, event correlation, performance monitoring and IT service management. Most of these underlying technologies are well defined and mature.
AIOps data originates from log files, metrics, monitoring tools, helpdesk ticketing and other sources. The platform sorts, manages and assimilates these data to provide insight into problem areas. The goal of AIOps is to analyze data and discover patterns that can predict potential incidents in the future.
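The core of pattern discovery over metrics can be illustrated with a toy z-score anomaly detector: samples far from the historical mean get flagged. Real AIOps platforms use far richer models; this only shows the underlying statistical intuition, on made-up CPU readings.

```typescript
// Toy anomaly detector: flag samples whose z-score exceeds a threshold.
function zScores(samples: number[]): number[] {
  const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
  const variance = samples.reduce((a, b) => a + (b - mean) ** 2, 0) / samples.length;
  const std = Math.sqrt(variance) || 1; // guard against a constant series
  return samples.map(v => (v - mean) / std);
}

function anomalies(samples: number[], threshold = 3): number[] {
  return zScores(samples)
    .map((z, i) => ({ z, i }))
    .filter(({ z }) => Math.abs(z) > threshold)
    .map(({ i }) => i); // indices of anomalous samples
}

// Hypothetical CPU% samples: steady around 40, one spike to 98.
const idx = anomalies([40, 41, 39, 42, 40, 98, 41, 40], 2.5);
```

An AIOps pipeline would run something like this continuously per metric and feed the flagged indices into event correlation and incident prediction.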
Focus areas of AIOps
AIOps enables open access to data without organizational silos getting in the way.
AIOps upgrades data-handling ability, which in turn broadens the scope of data analysis.
It has a unique ability to stay aligned with organizational goals.
AIOps increases the scope of risk prediction.
It also reduces response time.
Impact of AI in IT operations
Capacity planning: AIOps can help in understanding workloads and planning configurations appropriately, leaving no room for guesswork.
Resource utilization: AIOps enables predictive scaling, where the auto-scale feature of cloud IaaS adjusts itself based on historical data.
Storage: AIOps helps in storage activity through disk calibration, reconfiguration and allocation of new storage volumes.
Anomaly detection: It can detect anomalies and critical issues faster and more accurately than humans, reducing potential threats and system downtime.
Threat management: It helps analyze breaches in both internal and external environments.
Root-cause analysis: AIOps is effective at root-cause analysis, through which it reduces response time and creates a remedy once the issue is located.
Forecasting outages: Outage prediction is essential for the growth of IT operations. In fact, industry reports expect the market for forecasting outages through AIOps to grow from $493.7 million to $1.14 billion between 2016 and 2021.
Future innovation: AIOps has played a key role in automating a major chunk of IT operations. It frees resources to focus on crucial work aligned with strategy and organizational goals.
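The predictive-scaling idea under "Resource utilization" can be sketched as choosing an instance count from the historical peak load for the same hour of day, plus headroom. The capacity-per-instance and headroom figures are illustrative assumptions, not recommendations.

```typescript
// Hypothetical predictive-scaling sketch based on historical load per hour.
interface LoadSample { hour: number; load: number; } // load in requests/sec

function instancesFor(hour: number, history: LoadSample[],
                      perInstance = 100, headroom = 1.2): number {
  // Peak load previously seen at this hour of day (0 if no history).
  const peak = Math.max(0, ...history.filter(h => h.hour === hour).map(h => h.load));
  // Scale to cover peak plus headroom; always keep at least one instance.
  return Math.max(1, Math.ceil((peak * headroom) / perInstance));
}

const history: LoadSample[] = [
  { hour: 9, load: 250 }, { hour: 9, load: 310 }, { hour: 3, load: 40 },
];
const morning = instancesFor(9, history); // 310 * 1.2 = 372 req/s -> 4 instances
const night = instancesFor(3, history);   // 40 * 1.2 = 48 req/s -> 1 instance
```

A real auto-scaler would also smooth over noise and react to live metrics; the sketch only shows how history drives the forecast.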
Problems AIOps solves
The common issues AIOps solves to enable IT operations’ adoption of digitization are as follows:
It can gain access to large data sets across environments while maintaining data reliability for comprehensive analysis.
It simplifies data analysis through automation empowered by ML.
Through accurate prediction mechanism, it can avoid costly downtime and improve customer satisfaction.
Through implementation of automation, manual tasks can be eliminated.
AIOps can improve teamwork and workflow activities between IT groups and other business units.
Peeping into the future
An AIOps platform acts as a foundation stone for projecting an organization's future endeavors. It uses real-time analysis of data to provide insights that inform business decisions. Successful implementation of AIOps depends on well-chosen key performance indicators (KPIs). It can also deliver predictive and proactive IT operations by reducing failures and the time spent on detection, investigation and resolution.
Artificial Intelligence for IT Operations (AIOps) is rapidly keeping pace with digital transformation. Over the years, there has been a paradigm shift in enterprise applications and IT infrastructure. With a mindset to enhance the flexibility and agility of business processes, organizations are readily adopting cloud platforms alongside their on-premise software. Implementation of technologies like AIOps in hybrid environments has helped organizations gauge their operational challenges and reduce operational costs considerably.
In fact, Gartner predicts that by 2022, 40% of medium and large-scale enterprises will adopt artificial intelligence (AI) to increase IT productivity.
AIOps Market forecast
According to Infoholic Research, the AIOps market is expected to reach approximately $14 billion by 2024, growing at a CAGR of 33.08% between 2018 and 2024. Companies providing AIOps solutions to enhance IT operations management in 2019 include BMC Software, IBM, GAVS Technologies, Splunk, FixStream, Loom Systems and Micro Focus. By the end of 2019, the US alone is expected to contribute over 30% of the growth in AIOps, helping the global IT industry reach over $5 trillion by the end of the year. Research conducted by Infoholic also found that 60% of organizations have implemented AIOps to reduce noise alerts and enable real-time root cause analysis.
Changes initiated by enterprises to adopt AIOps
2019 will be the year to reveal the true value of AIOps through its applications. By now, organizations have realized that context and efficient integrations with existing systems are essential to successfully implement AIOps.
Since AIOps needs to operate on a large amount of data, it is essential that enterprises absorb data from reliable yet disparate sources, which can then be contextualized for use in AI and ML applications. For this process to work seamlessly, data must be stored in modern data lakes so that it is free from traditional silos.
Maintaining data accuracy is a constant struggle; to overcome this complexity, 2019 will see technology partnerships between companies to meet customer demands for better application programming interfaces (APIs).
Automation of menial tasks
Organizations are trying to automate menial tasks to increase agility by freeing up resources. Through automation, organizations can explore a wide range of opportunities in AIOps that will increase their efficiency.
Streamlining of people, process and tools
Although multi-cloud solutions provide flexibility and cost-efficiency, without proper monitoring tools they can be challenging to manage. Hence, enterprises are trying to streamline their people, processes and tools to create a single, silo-free overview and benefit from AIOps.
Use of real-time data
Enterprises are trying to ingest and use real-time data for event correlation and immediate anomaly detection since, at the current industrial pace, stale data is of little value to the market.
Usage of self-discovery tools
Organizations are introducing self-discovery tools to overcome the market's shortage of data scientists and of IT personnel with the coding skills to monitor the process. Self-discovery tools can operate without human intervention.
Between 2018 and 2024, the global AIOps market value of real-time analytics and application performance management is expected to grow at a rapid pace. Currently, only 5% of large IT firms have adopted AIOps platforms, owing to lack of knowledge and assumptions about cost-effectiveness; this percentage is expected to reach 40% by 2022. Companies like CA Technologies, GAVS Technologies, Loom Systems and ScienceLogic have designed tools to simplify AIOps deployment, and sizable progress in the AIOps market is anticipated over the next three years.
“There is no reason and no way that a human mind can keep up with an artificial intelligence machine by 2035,” stated futurist Gray Scott. Cognitive automation (CA) is a subcategory of artificial intelligence (AI) technologies that imitates human behavior. The combined efforts of robotic process automation (RPA) and cognitive technologies such as natural language processing, image processing, pattern recognition and speech recognition have eased the automation of processes that once required humans. The best part of CA solutions is that they come pre-trained to automate certain business processes, so they do not need the intervention of data scientists or purpose-built models to operate. In fact, a cognitive system can make new connections in a system without supervision, using fresh structured and unstructured data.
Future of CA
CA is evolving rapidly, with increasing investments in cognitive applications and software platforms. Market research indicates that approximately $2.5 billion has been invested in cognitive-related IT and business services, with a 70% rise in such investments expected by 2023. The focus areas where CA has gained momentum are:
Quality checks and system recommendations
Diagnosis and treatment recommendations
Customer service automation
Automated threat detection and prevention
Fraud analysis and investigation
Difference between normal automation and CA
There is a basic difference between normal IT automation and CA technologies. Consider a use case where a customer, while filling in an e-form to open a bank account, leaves a few sections blank. Normal IT automation will detect this, flag the form red and reject it as incomplete, requiring human intervention to fix the issue. CA, in the same situation, will auto-correct the issue without any human intervention. This increases operational efficiency, reduces the time and effort the process takes and improves customer satisfaction.
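The contrast in the bank-form use case can be sketched as two tiny handlers. Everything here is hypothetical (the form shape, the customer record, the lookup): the point is only that the rule-based path rejects, while the cognitive-style path fills the gap from data the system already holds.

```typescript
// Illustrative contrast: rule-based rejection vs. cognitive-style auto-correction.
interface AccountForm { name: string; city?: string; }

// Normal IT automation: a blank field means flag and reject.
function ruleBased(form: AccountForm): "accepted" | "rejected" {
  return form.city ? "accepted" : "rejected";
}

// Stand-in for knowledge the cognitive system has learnt about customers.
const knownCustomers: Record<string, { city: string }> = {
  "Jane Doe": { city: "Pune" },
};

// Cognitive-style automation: fill the missing field from known records.
function cognitive(form: AccountForm): { status: "accepted" | "rejected"; form: AccountForm } {
  if (!form.city && knownCustomers[form.name]) {
    return { status: "accepted", form: { ...form, city: knownCustomers[form.name].city } };
  }
  return { status: ruleBased(form), form };
}

const incomplete: AccountForm = { name: "Jane Doe" }; // city left blank
```

The same incomplete form is rejected by the first handler and repaired and accepted by the second, which is the operational difference the paragraph describes.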
Enterprises’ need for CA
As McKinsey has noted, 45% of human activities in IT enterprises can be replaced by automation. Tasks involving high volumes of data require more time to complete; CA can prove worthy in such situations and reshape processes efficiently. Businesses are becoming more complex with time, and enterprises face many challenges daily: ensuring customer satisfaction, guaranteeing compliance, staying competitive, increasing efficiency and improving decision making. CA helps address these challenges in an all-encompassing manner. CA can improve efficiency by 30–60% in email management and quote processing. It ensures an overall improvement in operational scalability, compliance and quality of business, and it reduces turnaround time (TAT) and error rates, thus impacting enterprises positively.
Benefits of CA in general
The collaboration between RPA and CA has multiplied enterprises' capacity to operate successfully and reap benefits, to the extent that research reveals enterprises achieving ROI of up to 300% within a few months. The benefits enterprises can enjoy by adopting CA are:
It improves quality by reducing downtime and improving smart insights.
It improves work efficiency and enhances productivity with pattern identification and automation.
Cognitive computing and autonomous learning can reduce operational cost.
Faster processing improves business performance and boosts employee satisfaction and engagement, which in turn supports employee retention.
It increases business agility and innovation with provisioning of automation.
As part of CA, natural language processing (NLP) is used in cognitive computing. It gives systems the capacity to communicate more effectively and resolve critical incidents, which greatly increases customer satisfaction.
Enterprises using CA for their benefit:
A leading IT giant combined a cloud automation service with cognition to cut server downtime by 50% over the last two years. It also reduced TAT through the auto-resolution of more than 1,500 server tickets every month, and critical incidents fell by 89% within six months of the cognitive collaboration.
An American technology giant introduced a virtual assistant as one of its cognitive tools. It could understand twenty-two languages and handle service requests without human intervention, easing the process of examining insurance policies for clients, helping customers open bank accounts and helping employees learn company policies and guidelines.
A leading train service in the UK uses a virtual assistant for everything from processing refunds to handling customer queries and complaints.
A software company in USA uses cognitive computing technology to provide real-time investment recommendations.
Cognitive computing technology used in the media and entertainment industries can extract information such as a user's age and gender, company logos and the presence of certain personalities, and can locate profiles and additional information using media asset management systems. This helps in answering queries and adds a hint of emotion and understanding when dealing with a customer.
Secondary research reveals that the cognitive robotic process automation (CRPA) market will witness a CAGR of 60.9% during 2017–2026. The impact CA has on enterprises is remarkable, and it is an important step in the cognitive journey. CA can continuously learn and initiate optimization in a managed, secure and reliable way, leveraging operational data to fetch actionable insights. Hence, we can conclude that enterprises stand to gain considerably from cognitive automation.
Research suggests cybercrime is the biggest threat to every company globally, and one of the biggest challenges humanity will face in the coming decades. Looking at the voluminous increase in cybercrime, it is evident that cyber security is paramount for computers, processes and people. Research indicates that the cost of cybercrime will reach an estimated $6 trillion annually by 2021. The annual cybercrime report observes that cybercrime will have a huge impact, with hacking activities increasingly run by organized crime gangs by 2021. Securing medical devices alone could cost $65 billion by 2021. Growing DDoS attacks, zero-day exploits, kill chains and ransomware are turning last year's predictions into reality.
Understanding the concept of cyber security
Cyber security is the practice of protecting and recovering devices, networks and programs from any type of cyber-attack that poses a threat to organizations, employees and consumers. Cyber-attacks target sensitive information and misuse or destroy it, thus impacting business. A robust cyber security system depends on technology with multiple layers of protection and on users who make smart cyber defense choices. The cost of cybercrime includes the following:
Damage and destruction of data
Financial data theft
Intellectual property theft
Cybercrimes affecting the world economy
The Yahoo hack affected approximately 3 billion user accounts
The Equifax breach affected roughly 145.5 million customers
The average cost of a data breach in the US is $7.91 million
Areas of vulnerability
The new software code produced each year is quite vulnerable to attack. Research suggests that global digital content will grow from 4 zettabytes to 96 zettabytes by 2020, snowballing the probability of cyber-attacks. The deep web is roughly 5,000 times larger than the surface web; a portion of the deep web known as the dark web cannot be accessed by search engines, and its concealment helps cybercrime flourish. It is predicted that by the end of 2019 a business will fall victim to a ransomware attack every 14 seconds. According to a leading IT company's global incident response and recovery team, it is impossible to monitor automated cybercrime like ransomware manually, as cyber security demands focus and dedication. Implantable medical devices (IMDs) such as implantable cardioverter defibrillators (ICDs), deep brain neurostimulators, pacemakers and insulin pumps are hackable. In the manufacturing industry, compromises like CryptoLocker have caused a lot of damage, and the KRACK attack has amplified the cyber risk of wireless routers. Education is another domain that lacks cyber security and remains vulnerable: from online applications to endpoint security and patching cadence, hackers can easily exploit the weaknesses of educational institutions.
Need for cyber security plan
The transformation from potential threat to actual attack leaves enterprises no choice but to plan their cyber security. Cybercrime is no longer a hobby; it is a serious profession.
Perpetrators are not amateur hackers but professionals, often with more experience than the average IT employee.
The threat involved is no longer simple disruption; there is complex strategy involved, sometimes with anti-national sentiments behind it.
There is a need for proactive defense rather than reactive defense.
A cyber strategy offers accountability to business stakeholders.
Genuine need for cyber security
Cyber-attacks fall into three major categories: confidentiality, availability and integrity. The first deals with identity theft and misuse of bank account or credit card information. The second aims at blocking access to one's own data and information for a ransom. The third can target both individuals and enterprises, intercepting sensitive information to undermine credibility. To stay protected against cyber-attacks, files should be backed up frequently, people should be educated not to open links or attachments from unknown senders, devices should be kept updated, and only https:// URLs should be trusted.
Looking at the future – need for cyber security
It is no longer a question of whether cyber security needs to be implemented; it is a mandate to protect customers' data and fight cybercrime. In fact, cybercrime is creating unparalleled damage to both public and private organizations, demanding higher security budgets. It is essential to understand that malware is easily accessible to anyone who plans to become a cyber attacker. Looking at market predictions, global spending on cyber security is expected to exceed $1 trillion cumulatively over the next five years, so we can anticipate a 12 to 15% increase in the cyber security market by 2021. As per the Palo Alto Networks Research Center, by the end of this year the demand for cyber security professionals will rise to 6 million globally. Training employees to detect and defend against threats is imperative, and enterprises are expected to invest $10 billion in such training by 2027. In fact, one study confirmed that 90% of cybercrimes start with phishing emails luring their recipients; if employees are trained to deal with such emails, a more secure environment can be created. Market research also forecasts that approximately 20 million connected cars will have built-in software-based security technology by 2020, and the U.S. Bureau of Labor Statistics projects a 37% increase in information security analysts by 2022.
Looking at the pace at which our world is moving towards digitization, one has to admit that network analytics will play an important part in shaping how IT operates in future. Network analytics for an enterprise is complex; the AI and automation technologies in use help achieve intelligent and effective future IT operations.
Network analytics improves the user experience in IT operations by analyzing network data. It compares and correlates data to address a problem or trend, managing IT operations by drawing on the following data inputs:
Real network traffic generated by client.
Synthetic network traffic created by virtual clients.
Metrics from infrastructure.
Application program interface (API) from application server.
Scope of network analytics
A user can face poor network performance or disruption in service due to an OS problem, a Wi-Fi or LAN issue, DHCP or WAN problems, or application failure. Locating the actual cause of an interruption is essential for the smooth functioning of IT operations. Network analytics operates with the help of big data analytics, cloud computing and machine learning to examine data and create a holistic perspective. Proactive IT operations led by predictive insights are claimed to reach up to 90% data accuracy, and findings can be presented visually to develop an elaborate understanding. Here, network analytics plays an important role in redefining IT operations.
Network analytics uses analytical tools such as Sisense, Azure, R Open and GoodData for a deeper understanding of issues and to locate the source of an error, making IT operations seamless. Sisense is claimed to process data 10 times faster; Azure's 100 modules per experiment and 10 GB of storage space are cost-effective; GoodData allows a 360-degree overview for customer insights.
Earlier, fixing a network issue was relatively simple. Now, with the increasing use of virtual and mobile devices and cloud computing, detecting and fixing an issue within a network has become complex. Without network analytics, IT Ops would struggle to withstand such disruption.
There has been huge diversification lately in the fields of hardware, operating systems, applications and services. Understanding network problems within these landscapes can be challenging. Network analytics plays an important role here by easing the task through user performance management (UPM).
Network analytics also minimizes access-network issues in IT operations, from getting Wi-Fi access to authentication, obtaining IP addresses and resolving DHCP requests.
Network analytics tools can help reduce network traffic through changes in facilities, and can use network event correlation to understand the impact of bandwidth latency on devices and the customer experience.
Network analytics assists a great deal in network capacity planning and deployment decisions, improving network ROI by up to 15% as per market research.
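The event-correlation idea mentioned above can be sketched as clustering alerts that hit the same device within a short time window, so one incident surfaces instead of many raw alerts. Device names, timestamps and the window size below are all illustrative.

```typescript
// Sketch of network event correlation: group same-device events within a window.
interface NetEvent { device: string; at: number; msg: string; } // at = seconds

function correlate(events: NetEvent[], windowSec = 60): NetEvent[][] {
  const byDevice = new Map<string, NetEvent[][]>();
  const sorted = [...events].sort((a, b) => a.at - b.at);
  for (const e of sorted) {
    const groups = byDevice.get(e.device) || [];
    const last = groups[groups.length - 1];
    // Extend the latest group if this event is close enough in time...
    if (last && e.at - last[last.length - 1].at <= windowSec) last.push(e);
    else groups.push([e]); // ...otherwise start a new incident group.
    byDevice.set(e.device, groups);
  }
  const out: NetEvent[][] = [];
  byDevice.forEach(gs => gs.forEach(g => out.push(g)));
  return out;
}

const groups = correlate([
  { device: "switch-1", at: 0, msg: "link flap" },
  { device: "switch-1", at: 20, msg: "high latency" },
  { device: "router-9", at: 25, msg: "bgp reset" },
  { device: "switch-1", at: 500, msg: "link flap" },
]);
```

Here four raw events collapse into three incident groups: the two early switch-1 events correlate into one, while the router-9 event and the much later switch-1 event stand alone.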
Difference between monitoring and analytics network solution
To appreciate the impact network analytics has on IT Ops, it is essential to understand the difference between monitoring and analytics solutions. Monitoring refers to passively collecting and interpreting data and sharing potentially actionable information with the network manager. Hence, it focuses on spotting problems without fixing them.
Analytics is more prescriptive: recorded historical data is understood, learnt from and analyzed, revealing patterns to act on. Data collected from Wi-Fi, devices, applications and the WAN creates trends that inform IT operations.
Along with pinpointing the area of concern, advanced analytics tries to automate solutions to the detected problem. Advanced network analytics helps determine whether the issue lies with a client operating system, an application, network services or Wi-Fi access. This enhances the scope of IT Ops, providing insights that improve infrastructure and take overall operations to the next level. The new generation of network analytics tools can reduce outages, guide system and application upgrades, improve customer experience and simplify IT operations.
Benefits of network analytics in IT ops
Network analytics can help IT Ops analyze requirements and strike a balance so that available resources are optimally utilized to enhance network performance and lower the cost structure of IT Ops.
Network analytics helps with data-mining insights for identifying revenue opportunities, enabling data-driven, action-oriented IT operations.
Network analytics can help in capacity planning, where both resources and services can be calculated in advance for apt provisioning.
Impact of network analytics in brief
Network analytics can predict future downtime, allowing necessary action to be taken in time. It also surfaces the root cause of a problem so that teams can remediate faster, reportedly reducing MTTR by up to 95%. This reduces organizational disruption and operational costs while increasing customer satisfaction.
Angular is a great framework, well suited to developing large apps that aim for the highest performance on the web. But sometimes, as developers, we end up doing things that result in a poorly performing app. This blog covers Angular-specific best practices for the best load-time and runtime performance.
Load Time Performance
Ahead-Of-Time (AOT) Compilation: In contrast to JIT (Just-In-Time) compilation, where compilation happens in the browser, AOT (also called offline compilation) compiles the code during the build process, removing much of the processing overhead from the client browser. With angular-cli, just specify the "aot" flag (if the "prod" flag is present, the "aot" flag is not required) and AOT will be enabled.
Tree-shaking: This is the process of removing unused code thus resulting in smaller build size. On your angular-cli, Tree-Shaking is enabled by default.
Uglify: In this process the code size is reduced through various code transformations such as mangling, removal of whitespace and removal of comments. With angular-cli, specify the "prod" flag to perform uglification.
Prod flag: For production builds, specify the "prod" flag in an angular-cli application. It enables various build optimizations, like AOT, uglification, removal of source maps and service workers (if enabled), producing a much smaller build.
Build-optimizer flag: If you are using angular-cli, make sure you specify the "build-optimizer" flag for your production build. It disables the vendor chunk and results in even smaller code.
Lazy loading: Lazy loading is the mechanism whereby, instead of loading the complete app, we load only the modules required at the moment, thereby reducing the initial load time. In simple words, it doesn't load what you don't need.
Updating Angular and angular-cli: Updating your Angular and angular-cli regularly gives you the benefit of many performance optimizations, bug fixes, new features, security etc.
RxJS 6: RxJS 6 makes the whole library more tree-shakable thereby reducing the final bundle size. RxJS is a library for reactive programming which uses Observables, to compose asynchronous or callback-based code.
Updating Third Party Packages: Make sure you regularly update your third-party packages. Newer package versions often contain performance improvements, including smaller sizes and other build-time optimizations (e.g. RxJS 6). Regular updates also bring bug fixes, security-vulnerability fixes and package-compatibility fixes.
Compressing images: It's a good idea to compress images without losing much quality, saving bytes transferred over the network and improving load time. Many tools are available for this; for example, the Visual Studio Code extension TinyPNG can compress JPEG and PNG images without losing much quality.
Remove unused fonts: Removing unused fonts can save a few more bytes over the network.
Slow DNS and SSL: Sometimes your DNS and SSL provider could be the reason for slow load time. So make sure the DNS and SSL are fast and configured properly.
Run Time Performance
Change Detection: By default, on each asynchronous event, Angular performs dirty checking by running change detection for the whole component tree. Such dirty checking can be computationally heavy for medium to large apps. You can drastically reduce this by setting "ChangeDetectionStrategy" to "OnPush". By setting the "OnPush" change detection strategy we sign a contract with Angular that obliges us to work with immutable objects.
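The immutability contract behind OnPush can be shown without any Angular machinery: instead of mutating state, produce a new object, so a simple reference comparison (which is what OnPush relies on for its inputs) can see the change. The State shape below is just an example.

```typescript
// Immutable-update pattern that OnPush change detection depends on.
interface State { count: number; items: string[]; }

function addItem(state: State, item: string): State {
  // Return a fresh object and a fresh array; never mutate the old ones.
  return { ...state, count: state.count + 1, items: [...state.items, item] };
}

const before: State = { count: 1, items: ["a"] };
const after = addItem(before, "b");
// `after !== before`, so an OnPush component receiving this state as an @Input
// would be checked and re-rendered, while `before` itself stays untouched.
```

Had `addItem` mutated `before` in place, the reference would be unchanged and an OnPush component bound to it would silently skip the update.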
Detach Change Detector: We can completely detach the component from change detection thereby giving a developer the control to inform Angular as to when and where to perform the change detection.
trackBy: Manipulating the DOM is an expensive task, which becomes very evident when rendering long lists of items, usually via the *ngFor directive. By default, *ngFor identifies object uniqueness by reference. If the object reference changes while updating the content, Angular removes the related DOM node completely and recreates it, even though only a small part of the DOM node actually changed. This issue can be solved by using trackBy.
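A trackBy function is just a plain function returning a stable identity per item; in a template it is wired up as `*ngFor="let u of users; trackBy: trackByUserId"`. The User shape here is an example.

```typescript
// A trackBy function gives Angular a stable key per item, so DOM nodes are
// reused when the array is replaced with fresh (but equivalent) objects.
interface User { id: number; name: string; }

function trackByUserId(index: number, user: User): number {
  return user.id; // identity by id, not by object reference
}

// Two refreshes of the same data: different object references, same trackBy key,
// so Angular would keep the existing DOM node instead of recreating it.
const firstLoad: User[] = [{ id: 1, name: "Ada" }];
const refreshed: User[] = [{ id: 1, name: "Ada" }];
```

Without trackBy, `firstLoad[0] !== refreshed[0]` would be enough for Angular to tear down and rebuild the list item.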
Pure Pipes: In the “@Pipe” decorator you can set the “pure” flag to true. This flag indicates that the pipe does not depend on any global state and is side-effect free. It enables Angular to cache the output for each set of input parameters the pipe has been invoked with, and to reuse those values instead of recomputing them. In many cases this eliminates a large number of duplicate operations and hugely improves performance.
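A pure pipe's transform() depends only on its inputs, which is what makes the caching safe. The sketch below keeps the `@Pipe` decorator in a comment so it runs outside Angular; the pipe name and discount logic are illustrative:

```typescript
// A pure pipe: transform() is a pure function of its arguments, so Angular
// can skip re-running it until an input reference changes.
//
//   @Pipe({ name: "discount", pure: true })
class DiscountPipe {
  transform(price: number, percent: number): number {
    // Apply a percentage discount, rounded to cents.
    return Math.round(price * (1 - percent / 100) * 100) / 100;
  }
}

const pipe = new DiscountPipe();
const sale = pipe.transform(19.99, 25); // → 14.99
```

In a template this would be used as `{{ price | discount:25 }}`, and Angular would only recompute it when `price` changes.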
Avoid complex computations in the template: Avoid doing complex calculations in the HTML template (e.g. calling a component method inside the template); instead, leverage pure pipes, which take advantage of Angular’s caching and avoid duplicate operations. Where a pipe is not possible, pre-calculate the values and bind them directly instead of calling a component method in the template.
Unsubscribing Observables: Observables can create memory leaks, so it is better to unsubscribe from them when they are no longer needed. However, you don’t have to unsubscribe from every observable you use. Explicit unsubscription is only required when a subscription is created inside a component that is destroyed before the observable completes.
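The usual pattern is to store the subscription and tear it down in ngOnDestroy. The sketch below uses a tiny Subscription stand-in that mimics rxjs semantics so it stays self-contained; in a real app you would store rxjs Subscription objects the same way (the component name and interval source are illustrative):

```typescript
// Minimal stand-in for rxjs Subscription, to keep the sketch runnable
// without Angular or rxjs installed.
class Subscription {
  closed = false;
  constructor(private teardown: () => void) {}
  unsubscribe(): void {
    if (!this.closed) { this.closed = true; this.teardown(); }
  }
}

class TickerComponent /* implements OnInit, OnDestroy */ {
  sub?: Subscription;
  ticks = 0;

  ngOnInit(): void {
    // In rxjs: this.sub = interval(1000).subscribe(() => this.ticks++);
    const timer = setInterval(() => this.ticks++, 1000);
    this.sub = new Subscription(() => clearInterval(timer));
  }

  ngOnDestroy(): void {
    this.sub?.unsubscribe(); // without this, the interval outlives the component
  }
}
```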
Observable share() operator: If you have subscribed to an observable in multiple locations/components, each subscription will re-execute the producer even though the data is identical. You can avoid this duplicate processing across subscriptions by using the “share()” operator.
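The semantics can be shown in miniature without rxjs: without sharing, every subscriber re-executes the producer; with sharing, one execution feeds everyone. In rxjs this would be `const data$ = source$.pipe(share());` (the producer and its return value below are hypothetical):

```typescript
// share() semantics in miniature.
let producerRuns = 0;
const fetchConfig = (): string => {  // hypothetical expensive producer
  producerRuns++;
  return "config-data";
};

// Unshared: two subscribers, two producer executions.
const a = fetchConfig();
const b = fetchConfig();
// producerRuns is now 2

// Shared: the first call runs the producer; later calls reuse its result.
let cached: string | undefined;
const sharedFetch = (): string => cached ?? (cached = fetchConfig());
const c = sharedFetch();
const d = sharedFetch(); // no extra producer run
// producerRuns is now 3: one run serves both shared "subscribers"
```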
Have you ever wondered about your CEO’s hobby? Ask him or her next time you meet. Don’t be shy and ask executives in your organization what they like to do in their free time. Don’t be surprised if you hear about sailing, dog sledding, mountaineering and rock climbing, marathon running, hiking and paddling. Seems weird, doesn’t it? We probably imagine CEOs always in meetings or at their desks, studying reports, thinking about strategy and making difficult decisions. How do extreme sports and outdoors activities fit into this picture?
I pondered this question until last year, when I joined a local hiking and backpacking group. A year of hiking, hundreds of miles in the woods and about 40 nights under the stars taught me the most important lesson of all: to find a solution to a problem you need to disconnect from the problem. And the best way to relax your mind, reflect and meditate is through physical activity, preferably far from civilization, traffic and cell reception.
Here are seven things about being outdoors that changed my life.
Motion and outdoors unlock creativity and innovation. My first backpacking trip took me to the high ridges of the Smoky Mountains along the border of Tennessee and North Carolina. One day I was hiking alone on a long stretch of the Appalachian Trail, thinking about two things that had bothered me for months. I was alone on the trail, high on the ridge, with a 30-pound pack on my back. Usually thinking about the two problems had felt like circling the drain or having a black hole in my chest, so I tried to get them out of my mind and simply put one foot in front of the other, breathe in and out, be a part of a beautiful fall day. Then suddenly it hit me. In one moment, I knew the answer to both of my issues; it just came to me out of the blue… Later I had more moments like this, always in the woods, hiking alone, not really thinking about anything in particular. A few ideas that I had on the trail changed my life, pivoted my career and gave me a new purpose. Nothing like that ever happened at my desk, in front of the computer or a TV screen.
Trying new things. Being outdoors teaches you not to be afraid of the unknown. You kind of get addicted to the unknown – new places, new trails, new gear, new activities. It’s hard to count all the things I did this year for the first time in my life: camped on a beach, built a fire, paddled a kayak on a river in high water, used a pit toilet. I went to several national parks and countless state parks and nature preserves, and witnessed beauty that made my heart melt. During the same period of time I also initiated ambitious projects, got a new job and started learning Spanish.
Taking risks. There’s blind fear of the unknown and there’s risk analysis. The first time I had to climb a boulder I was terrified. I didn’t trust my body to push itself up the slick surface, didn’t know how to stick my foot in a crevice or pull myself up with my hands and arms. What if I fall? How dangerous is the drop?… The woods are full of terrors. Bears, snakes, yellow jackets. Slippery rocks, rushing rivers, cold rain and harsh sun, water sources contaminated with bacteria, screeching sounds in the dark. Eventually you learn to curb your fears, understand the risks and prepare for emergencies. Hikers say: there’s no such thing as bad weather, just bad gear. So you have the right gear, you learn about your local flora and fauna, you watch out for snakes and bears, and then when you see them, it is something you brag about to your hiking friends. Being in the woods taught me to be careful but also enjoy the adrenaline rush. At work I ask myself “what is the worst that can happen to this project?”, and then do whatever it takes to prevent the worst case scenario from happening, while still pushing forward to a goal.
Achieving goals. Reaching the top, finishing that 12 mile loop in the dark in cold rain, bushwhacking to that waterfall, doing something you would never imagine you could do. The experience of accomplishing something that seemed impossible is priceless. It makes you confident, helps you believe in yourself and trust your instincts.
Solitude. The outdoors can be a place to be quiet and alone, thinking your thoughts or not thinking at all. Sometimes I get into a zone similar to meditation in which I allow my mind to wander while still staying in the moment and being alert to sounds, smells and views. “Hike your own hike” is a motto of the utmost freedom and independence. Being on your own, even for a couple of hours during the day, allows you to get back to who you really are, know what you really want, and find out what your own “hike” looks like.
Meeting new people. But the outdoors is also where you can meet all sorts of people. Friends and passersby, hikers and tourists. You share stories, food and campfire warmth. You tell your life story and learn about others. You let people help you, teach you and be your mentors, and then, in turn, you do the same for others. Hiking taught me humility and respect for other people, their lives, their challenges and their strengths, as well as patience and acceptance of their weaknesses and flaws. I am inspired by people I met on the trail, especially some badass women who became my mentors and my personal heroes.
Putting things in perspective. The world is huge. Being on the trail and moving at a walking speed of 2 miles per hour brings us closer to our real place in the world. We are much smaller than some things, like trees or mountains, but also bigger and stronger than most living creatures. Seeing the immense, infinite beauty from up close puts our problems, emotions, struggles, ambitions, dreams and plans into the right perspective too. They don’t disappear or become insignificant, but you get a more accurate sense of their importance by comparing them to other people and other problems.
I am not a CEO, but being outside has empowered me, helped me focus, make important decisions and come up with creative ideas. I am stronger, and I have much fewer fears. I am confident in myself and know that I can do anything, and anything is possible.
So I think now I understand why a CEO would get into sailing. It’s not about spending a ton of money on expensive fancy sailboats. It’s about risk and adrenaline, challenge and achievement, planning and learning. It’s about strength and empowerment, tranquility and relaxation, beauty, and wonder, and awe.
In our fast-paced world we want everything supreme and uber. We demand so much of technology, and that sometimes means that, with the rapid speed at which technology advances to match our needs, there is very little time to conduct research on how it can affect humanity.
5G is meant to be the brilliant answer to connectivity, with high data rates, reduced latency, energy savings, reduced costs and improved system capacity. With its ‘Release 16’ in April 2020, it is due for submission to the International Telecommunication Union (ITU) as a candidate IMT-2020 technology.
Technologists are excited about 5G, and businesses surveyed say they expect 5G to provide an improved, never-seen-before customer experience, with about 47% believing it is an opportunity for more creative formats. On the consumers’ side, the survey suggests that the majority are indeed very excited, as 5G is expected to provide hugely improved, speedier connectivity for wearables. (“Data from the Consumer Technology Association (CTA) from July 2018 shows that wearables are increasingly becoming part of everyday life. Three in 10 US consumers own wireless earbuds, 25% use wearable fitness trackers and 18% own a smartwatch. Even VR headsets are making their way into US households, at 11% penetration.”) And about 85% of consumers believe that 5G will benefit AR- and AI-powered work and experiences.
So, along with all that excitement about the advancements 5G would bring us, there is also plenty of news that the technology has not been tested enough. As much as we love technology, it is our responsibility to research it well enough to understand the potential damage and harm it can cause us.
At any point in time, our phones are constantly emitting electromagnetic waves, whether we are receiving calls or notifications or not. And 5G, which would operate at up to 60 GHz and require many more towers than earlier generations (because its higher frequencies can only travel so far), is going to cause a surge of electromagnetic waves, right? Apparently, in 5G technology, energy will be deposited in the skin; it means that our skin will do all the absorption, BUT they do not know how RFCMS is going to affect human skin. A group of scientists, doctors and concerned citizens recently got together to call for an urgent stop to the deployment of 5G, reports Michele Greenstein. They called it ‘an experiment on humanity’, and claimed that it should be seen as ‘criminal’ under international law.
The expected economic boost from it is $500 billion and 3 million new jobs, but something about those small cell towers cropping up all over the place to support 5G makes me wary, especially when I read that we don’t fully understand the extent of their effects on our health.
‘Is 5G dangerous to our health?’ is a question that is getting asked a lot these days, and having combed through the web, I am getting a lot of conflicting answers:
What happens when we approach a stage where the wavelength of the energy reaches the dimensions of a biological structure? Paul Ben-Ishai, PhD, of the Department of Physics, Ariel University, Israel, and the Institute for Advanced Studies at Hebrew University, Jerusalem, analyzed the reaction of human skin to 3G energy waves – in the experiment they measured the hands of people as they sat calmly. Then these people were made to run around the campus in order to increase their stress levels and perspire a lot. They then dried their hands and measured them again. What they found was that the sweat gland can absorb electromagnetic energy between 75 and 110 GHz; by making the sweat gland work more, the absorption could also change. See what that could mean with the energy waves from 5G?
Dariusz Lesszczynski, MSc, DSc, PhD, Chief Editor of Radiation and Health and Adjunct Professor at the University of Helsinki, Finland, says: ‘What we do not have is the slightest idea how humans respond to RFCMS.’ In 5G technology, energy will be deposited in the skin, meaning the skin will take all the absorption; the other organs will not really be affected, except indirectly through responses from within the skin. Apparently, they do not know how human skin is going to react to RFCMS. Our sweat ducts are a strong absorber of 5G radiation, as this professor points out.
There is a very urgent need to evaluate the health effects of 5G, including skin physiology. He recommends evaluating whether 5G exposure increases the risk of melanomas and/or epitheliomas. Up until now we know that our sweat ducts are very strong absorbers of 5G radiation, but there have been very few studies on the health effects of 5G radiation. As of now they don’t know one way or the other whether this will pose a serious health risk, just that the research into its long-term effects is not yet conclusive.
Back in 2011, the International Agency for Research on Cancer issued a press release classifying ‘radiofrequency electromagnetic fields as possibly carcinogenic to humans, based on an increased risk for glioma, a malignant type of brain cancer, associated with wireless phone use’ (Press release No. 208, dated 31 May 2011). IARC Director Christopher Wild said, “given the potential consequences for public health of this classification and findings, it is important that additional research be conducted into the long-term, heavy use of mobile phones….”
In early 2018, SBWire reported: “the world’s largest animal study on cell tower radiation confirms cancer link – scientists call on the World Health Organization International Agency for Research on Cancer to re-evaluate the carcinogenicity of cell phone radiation after the Ramazzini Institute (Italy) and US Government studies report finding the unusual cancers.”
“Teton Village, WY (SBWIRE) – Researchers with the renowned Ramazzini Institute (RI) in Italy announce that lab animals exposed to environmental levels of cell tower radiation in a large-scale lifetime study developed cancer. A $25 million study of much higher levels of cell phone radiofrequency (RF) radiation, from the US National Toxicology Program (NTP), has also reported finding the same unusual cancer, called schwannoma of the heart, in male rats treated at the highest dose. In addition, the RI study of cell tower radiation also found increases in malignant brain (glial) tumors in female rats and precancerous conditions, including Schwann cell hyperplasia, in both male and female rats.”
“Such findings of effects at very low levels are not unexpected,” stated Devra Davis PhD, MPH, President of EHT, pointing to a Jacobs University replication animal study published in 2015 that also found very low levels of RFR promoted tumor growth. “This study confirms an ever-growing literature and provides a wake-up call to governments to enact protective policy to limit exposures to the public and to the private sector to make safe radiation-free technology available.”
On the other hand, the UK Health Protection Agency (now part of Public Health England) says: ‘in view of the widespread use of wi-fi in schools, we have conducted the largest and most comprehensive measurement studies to assess exposures of children to radiofrequency electromagnetic fields from wireless computer networks.’ The agency concluded that radiofrequency exposures were well below recommended maximum levels and that there was “no reason why wi-fi should not continue to be used in schools and in other places.”
5G would usher in the Internet of Things: everything from super-smart homes to remote surgeries to autonomous vehicles would be achieved with this near-instantaneous speed. But 5G isn’t going to come cheap, even if they talk about cost reduction, which is what we common consumers expect. The technology can bring latency down from 50 milliseconds to less than 1 millisecond, which would mean downloading an entire season of Game of Thrones in HD in 30 seconds; yet for AT&T to go live in 12 cities on December 18, 2018, it took a whopping amount. Though the service is not fully operational, AT&T will charge users $70 per month for 15 gigabytes.
Senate Bill 637 – Veteran internal medicine physician Sharon Goldberg, MD, recently dropped a bombshell about the dangers of 5G tech. She said: “Wireless radiation has biological effects. The PubMed and peer-reviewed literature state that these effects are seen in all life forms – plants, animals, insects, microbes. In humans we have clear evidence of cancer now; there is no question. We have evidence of DNA damage, cardiomyopathy, which is the precursor of congestive heart failure, and neuropsychiatric effects. 5G is not a conversation about whether or not these risks exist; they clearly do. 5G is a conversation about unsustainable healthcare expenditures. We have been sitting on the evidence for EMR and chronic disease for decades, and now we are seeing all these epidemics appearing. Diabetes is the first epidemic; the statistics are very scary: one in three American children will become diabetic in their lifetime, and for Hispanic females, the number is one in two. What does all this have to do with wireless radiation? Wireless radiation and other electromagnetic fields, such as magnetic fields and electricity, have been clearly associated with elevated blood sugar and diabetes.”
Dr. Sharon Goldberg goes on to say that the closer you live to a cell tower, the higher your blood glucose, based on hemoglobin A1C measurements. So the idea of small networks placed close to people’s homes is, scientifically, very dangerous. She reveals that ‘the way you create a model of diabetes in rats in the lab is by exposing them to 2.4 gigahertz, and this is not even long-term exposure.’ Costs aside, the diabetes epidemic affects the kidneys and causes chronic kidney disease; end-stage renal disease is the worst complication of diabetes and leads to hemodialysis.
Hemodialysis is an automatic qualification for Medicare, and renal failure is reported to account for 7% of all Medicare expenditure in the United States.
Fear mongering or not, what I find is that researchers are not fully sure, and that lack of definitive findings is what makes me a little paranoid. At the breakneck speed at which technology is advancing, we don’t find the time for thorough research, and we now have to worry about the electromagnetic pollution that comes off these high-frequency waves. In comparison, the light traveling through fiber-optic cables seems a much safer alternative, with no such pollution or potential for harm.
With over 5.7 million views of Simon Sinek’s iconic YouTube video, “Start with Why – How Great Leaders Inspire Action,” the conversation around understanding one’s “why” has grown louder, and more emphasis has been placed on the concept, in the nine years since the video was first posted.
Many have personal reasons for diving deep into the question “What is my why?”, oftentimes trying to discover their true personal passion. On a larger scale, progressive technology companies are also exploring this question in an effort not only to discover their own why as an organization, but also the why of their customers.
Understanding why your customers buy from you the first time, and, even more importantly, why they continue to buy from you, is critical for long-term success.
Fall in love with your customer, not your product
It’s human nature for leaders in a company to fall in love with their product. After all, they hire brilliant minds to develop these innovative and exciting products; they spend countless hours and dollars to ensure that their products are cutting edge and better than their competitors’ products; and sales teams work tirelessly to ensure that their target audiences are keenly aware of the features and functions of the product that make it distinctive and desirable.
But imagine the benefits of shifting that love from the product to the person who actually purchases the product.
The Value is in the Why
In a meeting of a Customer Advisory Board I recently facilitated, one particular CIO spoke passionately about his why, and further, why this should matter to the host company.
In his comments, he said he didn’t care so much about the features and functions of the product, but rather, how the team who developed the product can ensure that he can spend the weekend with his family uninterrupted; how this organization he counts on can ensure that he won’t get a frantic call in the middle of the night; how they can help him keep his job.
As Simon Sinek says, “Value is not determined by those who set the price. Value is determined by those who choose to pay it.”
What is the value of uninterrupted time with family or job security?
By confirming rather than assuming you know why your customers buy from you, you have the added advantage of providing the exact kind of value for which the people who purchase your products and services are willing to pay a premium.
In an article published on Inc.com, Katlin Smith, Founder and CEO of Simple Mills, pointed to the following reasons customers buy from her company:
Identity – “People make purchases that fit who they are or who they aspire to be (or both).”
Value – “Don’t assume that what matters to one, matters to all.”
Experience – “It’s easy to forget that stores and products are an experience – one that many consumers enjoy.”
Connectivity/Community – “This can be very subtle, where purchasing your products simply makes the customer feel part of something larger.”
Quality – “If making things easier for your customer requires you to chop away at your product, don’t do it.”
Need – “Find this in your customer. Make it a priority.”
Nowhere in this list do you see any mention of the features or functions of what Simple Mills sells.
So now you understand the importance of getting to the customer’s true reasons for purchasing the products and/or services you provide. What’s next?
The obvious answer is that you just need to ask your customers, which is 100% correct, but perhaps not as easy as it sounds.
Getting to the real reasons your customers purchase from you requires these four elements:
An insatiable curiosity – getting to the root of what drives your customers rather than taking an answer at face value. This requires the ability to ask thought-provoking and relevant questions.
Time – This is a process that can’t be rushed. It may necessitate multiple conversations. Patience will serve you well.
Trust – There must be a trusting relationship in place before you can expect a customer to open up fully in sharing their true why. (i.e. “Help me not lose my job.”)
A system – In order to fully leverage the power of getting to your customers’ why, you need an approach and a system that will help you consistently accomplish numbers 1 through 3. As noted in the example above, the powerful insights came from a Customer Advisory Board, a proven system for taking time to build trust and ask meaningful questions of the people who purchase what you sell.
The Five Whys
You may be familiar with the methodology known as the “5 Whys.”
5 Whys is an iterative interrogative technique used to explore the cause-and-effect relationships underlying a particular problem. The primary goal of the technique is to determine the root cause of a defect or problem by repeating the question “Why?” Each answer forms the basis of the next question. The “5” in the name derives from an anecdotal observation on the number of iterations needed to resolve the problem. (Source: Wikipedia)
I suggest that the same technique could be used for getting to the “root why.”
You: Why do you buy from us, Mr. Customer?
Mr. Customer: Well, I guess I just really like your product.
You: Why do you like our product?
Mr. Customer: Because it’s stable.
You: Why does that stability matter to you?
Mr. Customer: Because if everything’s working, my boss is happy.
You: Why does it matter that your boss is happy?
Mr. Customer: Because when she’s happy and everything is working, I get a good review.
You: Why does getting a good review matter to you?
Mr. Customer: Because it means I feel secure in keeping my job.
Imagine the power of obtaining this insight and how it might change the direction of the conversations and relationships with your customers.
Betsy Westhafer is the CEO of The Congruity Group, a Customer Advisory Board consultancy based in the US. She is also the author of the #1 best seller “ProphetAbility – The Revealing Story of Why Companies Succeed, Fail, or Bounce Back,” available on Amazon.