AI and its impact on app competitiveness

AI in mobile tech world

This is the era of the fourth industrial revolution, in which technology without artificial intelligence (AI) is unimaginable. With its global acceptance, AI has permeated every sphere of human life, including the mobile tech world. Research indicates that AI is rapidly gaining popularity: tech giants like Baidu and Google have already spent between $20 billion and $30 billion on AI, much of it to improve IT operations. Segments like healthcare, education, finance and IT ops are investing heavily in AI; however, the prominence of AI in the mobile tech world deserves special mention.

Importance of AI in mobile apps

The focus of AI is to develop intelligent machines that think, work and learn from experience like humans. When AI joined hands with machine learning, the analysis of visual inputs such as gestures, objects and faces became seamless. For example, an iPhone app powered by AI can enhance perception, apply reasoning and even solve problems.

Deployment of AI in mobile apps

When it comes to developing mobile apps, AI uses the modest process of trial and error to learn a solution. Various attempts are made until the appropriate solution is found; that solution is then stored for future use as a reference point for similar circumstances. Alongside this, mobile app developers are also focusing on drawing appropriate inferences to enhance the interaction process, which helps users reach predefined solutions for various device problems.
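A minimal Python sketch of this loop, with hypothetical fix names and a placeholder success check (nothing here is a real device API):

```python
# Sketch of the trial-and-error loop described above: attempt candidate
# fixes in order, and cache the first one that succeeds so similar
# problems can be resolved immediately next time. All fix names and
# checks are hypothetical illustrations.

known_solutions = {}  # problem signature -> fix that worked before

def try_fix(problem, fix):
    # Placeholder success check; a real app would apply the fix and
    # verify the device state afterwards.
    return fix in problem["responds_to"]

def resolve(problem, candidate_fixes):
    signature = problem["signature"]
    if signature in known_solutions:          # reference point from earlier
        return known_solutions[signature]
    for fix in candidate_fixes:               # trial and error
        if try_fix(problem, fix):
            known_solutions[signature] = fix  # store for future use
            return fix
    return None

problem = {"signature": "wifi-drop", "responds_to": {"toggle-radio"}}
first = resolve(problem, ["clear-cache", "toggle-radio", "reboot"])
second = resolve(problem, [])  # answered from the cache, no retries
```

The second call returns straight from the cache, mirroring how a stored solution becomes the reference point for similar circumstances.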

Examples of AI apps

The following existing apps provide an enriched user experience:

  • Replika is an advanced AI app for iPhone that covers several aspects of a user’s life. The app can hold conversations with the user like a real person.
  • The Aipoly app can identify three objects in a single second.
  • Cortana can assess relevant information, sort it and deliver services efficiently, such as scheduling meetings, sending emails, tracking events, and sharing updates and reminders.
  • Personal assistants like Siri became popular with their voice interfaces. Siri assists with phone and text actions, provides information about weather and currency, schedules events, sets reminders and delivers an engaging experience.
  • The My Starbucks Barista app enables customers to place their orders simply by telling the app.
  • Taco Bot, launched by Taco Bell, recommends personalized menus based on user-specific purchase trends.

Technologies empowering apps

To create AI-empowered apps, developers must choose an appropriate platform and build features with end-user preferences in mind. Technologies that improve app performance and competitiveness include:

  1. A speech-to-text (STT) and text-to-speech (TTS) engine converts voice to text messages and vice versa.
  2. Tagging helps the app analyze users’ requirements.
  3. A noise reduction engine eliminates white noise, improving voice command accuracy.
  4. Voice biometrics and recognition act as authentication, strengthening security.

Impact of AI on app competitiveness

Innovation has led end users to expect better performance from mobile apps. Retail giants like eBay and Amazon have already proved the worth of AI in mobile apps. AI-enabled apps engage their users and strategically secure the brand, enhancing productivity and reducing errors. The underlying algorithms adjust the app and form more meaningful, context-rich prospects to keep end users engaged. AI-aided chatbots on mobile devices use standard messaging tools and voice-activated interfaces, which reduces data collection time and simplifies tasks. User-specific personalization also helps with mundane or repeatable tasks. AI-enabled apps have even had a great impact in the healthcare industry, where reliability, predictability, consistency, quality and patient safety have all improved.

AI in app market based on geography

The following geographical areas indicate the extensive impact of AI on mobile apps:

  • North America
  • South America
  • Europe
  • Asia Pacific
  • Middle East and Africa 

Conclusion

We can conclude that AI has a dramatic impact on the transformation and competitiveness of mobile apps. As per market research, this competition is set to increase further by 2020, since more organizations globally are investing in AI for revenue improvements and cost reductions. Deployment rates among different industry verticals have surged exponentially over the past few years.

Smarting from too much Smart!

By Bindu Vijayan

With smart homes, we have given ourselves the distinct advantage of being able to do old tasks in new, smarter ways. Gartner reports that twenty-five billion connected devices are expected to be in use by 2021, up from 14.2 billion this year. MarketsandMarkets reports that the smart home market will be worth $151.4 billion by 2024.

Our interactions with appliances, electronics, cars, lighting and what not might actually be turning us over to a new enemy: complexity. I want my blinds down before I reach home from work; my four dogs, who used to go berserk at the soft purr of those blinds going down at 4 every evening, have gotten used to it; and to me the ‘smart aspects’ of our everyday lives mean sheer technological nirvana, an infinite level of convenience. In fact, it has often caught me thinking: did we give ourselves the advantage of added aesthetics when we chose connectivity and automation? If the same task of drawing the window blinds were done by someone at home, there would be that rush of impatience, a quick stab of action, the whoosh and tumble of those wooden slats, very different clicks from the ‘gentle’ hum of smart blinds!

Technology is hugely impressive: we automate everything we fancy, we build businesses and then cannibalize them at the knock of a newer promise. But as I move up the age pyramid, I realize just how difficult it can get for the elderly, for those who are not tech-savvy, and even for the segment of the current digital generation who are not technical; they first have to learn how to use the technology.

When we read that Hollywood residences spend as much as $1.8M on burglar alarms, door locks and outdoor lighting, we think they can afford it, so why not. But when I went searching for a modest installation to smart-protect my home with just basic burglar alarms and locks, a $3,000 entry-level option had me smarting. And the implications are far more than just financial.

These advanced security systems, regarded as one of the major advantages of smart homes, run the risk of being hacked; we are adopting connected technologies faster than we should, and not enough time and effort is spent on securing them. Security cameras, alarm systems and motion sensors have become easy targets for hackers, so breaking in has also become a lot easier.

When we look at how it has changed our lifestyle, with smart door access, smart thermostats, smart plugs, turning lights on without leaving our beds, heating dinners, mowing the lawn, switching on our favorite music, everything done without having to move around, we can’t blame our children for their sedentary lifestyle and the ever-looming threat of obesity with its series of cardiovascular diseases.

Talk about privacy: that seems to be the recurring smart home threat time and time again. All those video feeds from the various cameras that go into a security system are particularly vulnerable.

Here is an excerpt from a recent report, a chilling example of the despair smart technology can cause –  “Arjun and Jessica Sud routinely use a baby monitor to keep tabs on their 7-month-old’s bedroom. Last month, they heard something chilling through the monitor: A deep male voice was speaking to their child.

“Immediately I barge into the room because I’m like, ‘Oh my God, maybe someone got in there,’” said Arjun Sud, 29. “The moment I walk in, it’s quiet.”


The couple grabbed their son, now fully awake, and headed downstairs. When they passed their Nest thermostat, normally set around 72 degrees, they noticed it had been turned up to 90. Then, the voice was back, coming through the speaker in a downstairs security camera. And this time, it was talking to them.

The voice was rude and vulgar, using the n-word and cursing, he said. At first, he yelled back. But then, Sud composed himself and stared into the camera.

“He was like, ‘Why are you looking at me? I see you watching me,’ ” Sud said. “That’s when I started to question him back.”

The family’s Nest cameras and thermostat had been hacked. As the couple felt, all the expensive devices they had invested in to safeguard the family had been maliciously turned against them!

Similarly,  a family in Houston reported hearing sexual expletives from the baby monitor in their infant’s room. When they turned on the lights, the Nest camera in the room activated. And then a voice told them to turn off the lights and if they didn’t, it threatened that their infant would be kidnapped!

Nest is a Google-owned company, yet several Nest users across the US have reported similar incidents. And a large section of the world would think Google would secure its products well enough!

There are too many gizmos being manufactured with not enough attention to security. Karl Sigler, threat intelligence manager at SpiderLabs, a team of ethical hackers at the Chicago-based cybersecurity company Trustwave, says, “One reason smart home devices may be vulnerable to hacking is that they are often developed by vendors who know how to manufacture a standard appliance, but aren’t as well-versed in how to securely connect it to the internet. The devices are also developed with convenience in mind, and manufacturers are sensitive about security steps that consumers may interpret as frustrating or a hassle.”

With these devices being used within intimate confines such as our homes, most consumers who are not tech-savvy don’t quite grasp the consequences of not securing them adequately, and that is one of the biggest challenges today. My mother thought I was rambling until she heard about the fire caused by a smart toaster, or the food that can spoil when a refrigerator gets hacked!

Here is another chilling example: recently, a Las Vegas casino’s high-roller database was accessed through a smart thermometer in a fish tank! ‘Interesting Engineering’ reported, “They then found the high-roller database and then pulled that back across the network, out the thermostat, and up to the cloud.”

More recently, Pen Test Partners posted a video online showing how two well-known car alarm brands could be hijacked and controlled by an attacker through just a smartphone. These alarms are reportedly installed on some 3 million vehicles worldwide!

The smart market is getting wider. Walgreens is planning a line of ‘smart coolers’ with cameras that scan shoppers’ faces to gather demographic information. The technology can also do ‘iris tracking’, to tell retailers which displays were looked at most. The company plans to install them in stores across Chicago, San Francisco and New York. Retailers want information about their buyers, segmented by age, gender, income and so on, for target marketing. The AI system analyzes faces through micro-measurements, for example the distance between the lips and nose, and then estimates the age and gender of the person who opened the cooler door. What these coolers do is analysis, not facial recognition. Facial recognition in public is apparently outlawed in Illinois under BIPA, the Biometric Information Privacy Act. (Facebook and Google have had to fight class-action suits under the law.)

The RSA 2019 conference brought to attention that there are very serious security issues with smart home gadgets and devices, and that the threat is systemic and widespread. Presenters gave demos on how to hack alarms, children’s dolls and their GPS tracking watches, among other threats. A video was released showing how to hack third-party smart car alarms and carjack moving vehicles!

As consumers, we are advised to watch the IP addresses accessing our smart home devices: every computer that accesses a device has a unique numerical label that should show up in the log. We should make sure the software on our devices is regularly updated so that it carries the latest security patches. It is often difficult to notice when a smart device has been compromised; it may just get slower, reboot automatically or become unresponsive.

https://www.marketsandmarkets.com/Market-Reports/smart-homes-and-assisted-living-advanced-technologie-and-global-market-121.html?gclid=EAIaIQobChMI3M_zvabZ4QIVxIBwCh1RyABeEAAYASAAEgJmg_D_BwE

https://www.detroitnews.com/story/business/2019/02/12/smart-home-devices-like-nest-thermostat-hacked/39049903/

https://interestingengineering.com/a-casinos-database-was-hacked-through-a-smart-fish-tank-thermometer

The Three Levels of Listening

by Betsy Westhafer

~Adapted from the book, “ProphetAbility – The Revealing Story of Why Companies Succeed, Fail, or Bounce Back” by Tony Bodoh and Betsy Westhafer

We hear through our paradigms. Test this within your own company by asking a cross-section of employees and leaders to read the same customer story and then tell you what the customer’s challenges were. Their answers will depend in part on:

  • their role in the organization
  • the KPIs that matter to their success
  • their time horizon of reference, and
  • their level of authority to resolve an issue.

Those who deal with transactions, like a retail clerk or customer service agent, will focus on the present and the specific customer in the story. The C-suite will focus on what it means for their department’s operations. The CEO is focused on what this means for the customer base now and into the future.

Having an awareness of this will help you see three different levels at which your team is listening and trying to address challenges. These levels are tactical, operational and strategic. We’ve been warned repeatedly about how siloed analysis and action between departments can lead to poor customer experiences, higher operational costs and failure to execute coordinated efforts. However, most organizations have blind spots that keep them from seeing how their tactics, operations and strategies are not synchronized, or worse, how they are sabotaging each other.

Tactical listening is typically a bottom-up (starting with the front-line employee) approach where the customer makes an inquiry or provides feedback to the company through an email, chat or voice conversation, survey, online review, or social media comment. Companies may monitor and respond to these communications, but most companies limit the number of “official” channels they will respond to and through. Regulatory and privacy issues are often a driver of such decisions, but not always. It is typically at this tactical level that we find voice of the customer programs (VoC) implemented with the goal of capturing, categorizing, reporting and analyzing customer ratings and comments. Most of the challenges detected in tactical listening are issues created by the features of the product or service, or the failure to deliver on a promised benefit.

These VoC programs vary dramatically in their success, primarily because they are implemented as a tactical response tool. Goals include: 1) tracking how many times a particular issue was raised by customers and the average sentiment score of these mentions; 2) reducing the number of times the issue is mentioned; and 3) improving the average sentiment score of these mentions. These tactical goals often create employee behavior that seems irrational to the operational and strategic thinkers in the company, and they can be self-sabotaging. One particular case that comes to mind is that of a Fortune 500 company that removed its customer service telephone number from the website and declared a success when the volume of phone complaints decreased. This type of shortsightedness is far too common.

One bright spot we do see is the emergence and growth of customer success teams and programs. These teams are often found in companies with a subscription-based revenue model because they realize that their customer’s success has an immediate and direct impact on their top and bottom lines. Software-as-a-Service (SaaS) companies have led the way in this arena. The reason we see this as promising stems from their approach to pushing out the horizon. They don’t wait for the customer to have a problem and then react. They are using a variety of data sources to detect when customers may be on the path toward a problem or simply not taking the best path to achieve success. Then, they proactively step in to provide some education or advice.

Operational listening is focused on understanding the challenges customers have while doing business with the company. While there are a number of causes, some of the most common include siloed data, information and communications. Think of the challenges you have when moving from one channel to another while working with a vendor. The sales team promising more than can be delivered, or failing to share their specific promises with the fulfillment team, are also common issues. For large companies, we find that the silos between product lines or divisions can cause significant operational challenges.

Your customers are typically not experts in your business, nor should you expect them to be. This is one of the big problems with an operationally oriented VoC program. You may have teams looking only at one product or feature who never step back to consider the whole relationship the client has with you, or how your products and service teams interact in ways that cause challenges for the customer.

Strategic listening is focused on understanding the challenges customers have within their life, profession or business that the company may or may not be attempting to resolve. The perennial example of this is the product development research conducted for Procter & Gamble’s Swiffer product. Swiffer is essentially a disposable wet wipe on a stick for cleaning floors. P&G could have commissioned a thousand different tactical or operational listening studies and never have discovered the new product opportunity. The team had to go beyond what customers were complaining about related to their existing products, features and benefits (tactical). They had to look beyond the expertise they had as a company in existing product lines like floor detergents (operational). The team needed to step back to understand how they could reposition the company to solve a problem that their customers did not even realize they had: cleaning their house was dirty work. They did this by watching how women cleaned their homes. There are several accounts of the Swiffer story available online if you are not familiar with it. One that we like is “A Chain of Innovation: The Creation of Swiffer” by Harry West.

It is rare, but not impossible, to find strategic opportunities in bottom-up research. The rarity is due to the limited visibility the front line has of strategic matters, KPIs focused on tactical success, and a time horizon that is laser-focused on the present. It is far more common to find strategic opportunities using top-down (CEO directly involved) research using methods that engage the C-suite of B2B or B2B2C customers.

Every CEO needs to be aware of the levels of research their company is involved in. They need to take responsibility for championing research at all levels and for initiating both bottom-up and top-down research. Missing out on these will create a scenario in which competitors can pose significant threats.

Betsy Westhafer is the CEO of The Congruity Group, a Customer Advisory Board consultancy based in the US. She is also the co-author of the #1 Best Seller, “ProphetAbility – The Revealing Story of Why Companies Succeed, Fail, or Bounce Back,” available on Amazon.

Mechanism of Modern Cascading Style Sheets (CSS)

By Sangavi Rajendran

In this article, you will learn about the mechanics and the practical uses of CSS that you will find valuable as a developer. To effectively use Cascading Style Sheets, you must keep in mind how cascading works within the browser.

  1. Beware of Margin Collapse

Unlike most other properties, vertical margins collapse when they meet. This means that when the bottom margin of one element touches the top margin of another, only the bigger of the two survives. There are ways to overcome this behavior, but it’s better to just work with it and apply margins in one direction only, ideally margin-bottom.
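For example, the gap between the two elements below is 30px, the larger of the two margins, not 50px (class names are illustrative):

```css
/* Adjacent vertical margins collapse: only the larger one survives. */
.first  { margin-bottom: 20px; }
.second { margin-top: 30px; }    /* gap between .first and .second: 30px */

/* Safer convention: apply margins in one direction only. */
p { margin-bottom: 20px; }
```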

  2. Use Flexbox For Layouts

The flexbox model exists for a reason. Floats and inline-blocks work, but they are essentially tools for styling documents, not websites. Flexbox is specifically designed to make it easy to create any layout exactly the way it was imagined. The set of properties that comes with the flexbox model gives developers lots of flexibility, and once you get used to them, doing any responsive layout is a piece of cake. Browser support is almost perfect, so there shouldn’t be anything stopping you from going full flexbox.
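For example, a responsive row of equally sized cards that wraps on narrow screens takes only a few declarations (class names are illustrative):

```css
.cards {
  display: flex;
  flex-wrap: wrap;   /* items drop to a new row when space runs out */
  gap: 16px;
}
.cards > .card {
  flex: 1 1 200px;   /* grow, shrink, 200px base width */
}
```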

  3. Do a CSS Reset

Although the situation has greatly improved over the years, there is still some variation in how different browsers behave. The best way to resolve these issues is to apply a CSS reset that sets universal default values for all elements, allowing you to start from a clean style sheet that delivers the same result everywhere. Libraries like normalize.css and ress do this very well, correcting all envisioned browser inconsistencies. If you don’t want to use a library, you can apply a more basic CSS reset yourself by zeroing out margins and paddings. This may seem a bit harsh, but nullifying margins and paddings makes laying out elements much easier, as there are no default spaces between them to take into account. The box-sizing: border-box; property is another good default, which we will talk about more in the next tip.
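A minimal hand-rolled reset along those lines:

```css
/* Zero out default spacing and make sizing predictable everywhere. */
* {
  margin: 0;
  padding: 0;
  box-sizing: border-box;
}
```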

  4. Border-box for All

Most beginners don’t know about the box-sizing property, but it’s important. The best way to understand it is through its two possible values:

content-box (default) – If we set a width/height on an element, that is the size of its content only; all paddings and borders are added on top of that. E.g. a <div> with a width of 100px and padding of 10px will take up 120 pixels (100 + 2*10).

border-box – Paddings and borders are included in the width/height. A <div> with width: 100px; and box-sizing: border-box; will always be 100 pixels wide no matter what paddings or borders are added.

Setting border-box on all elements makes everything much easier to style, since you don’t have to do math all the time.
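A widely used way to apply it everywhere, including pseudo-elements:

```css
*, *::before, *::after {
  box-sizing: border-box;  /* paddings and borders count toward width/height */
}
```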

  5. Images as Background

When adding images to your design, especially if it is going to be responsive, use a <div> tag with the background CSS property instead of an <img> element. This may seem like more work for nothing, but it actually makes it much easier to style images properly, keeping their original size and aspect ratio, thanks to background-size, background-position, and other properties. A small drawback of this technique is that the web accessibility of your page takes a slight hit, as the images won’t be crawled properly by screen readers and search engines. The issue can be resolved with object-fit, but it doesn’t yet have full browser support.
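For example (the image path is illustrative):

```css
/* A responsive image rendered as a background: the box keeps its own
   dimensions and the image is cropped to fit without distortion. */
.hero {
  width: 100%;
  height: 300px;
  background-image: url("photo.jpg");
  background-size: cover;
  background-position: center;
}
```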

  6. Write Better Comments

CSS might not be a programming language, but its code still needs to be documented. Some simple comments are all it takes to organize a style sheet and make it more accessible to all. For larger sections of the CSS, such as major components or media queries, use a stylized comment and leave a couple of new lines after it. Remember that CSS doesn’t have single-line // comments, so when commenting something out you still need to use the /* */ syntax.

  7. Everyone Loves kebab-case

Class names and ids should be written with a hyphen (-) when they contain more than one word. CSS is case-insensitive, so camelCase is not an option. Underscores weren’t always supported (they are now), which made dashes the default convention. When it comes to naming, you may also consider BEM, which follows a set of principles that add consistency and provide a component-based approach to development.

  8. Don’t Repeat Yourself

The values of most CSS properties are inherited from the element one level up in the DOM tree, hence the name Cascading Style Sheets. Take the font property, for example: it is almost always inherited from the parent, so you don’t have to set it again separately for each and every element on the page. Simply add the font styles that will be most prevalent in your design to the <html> or <body> element and let them trickle down. You can always change the styles for a given element later. What we are saying is to avoid repetition and use inheritance as much as possible.
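In practice, that means something like:

```css
/* Set the prevailing typography once; most elements inherit it. */
body {
  font: 16px/1.5 "Helvetica Neue", Helvetica, Arial, sans-serif;
  color: #333;
}

/* Override only where the design deviates. */
blockquote { font-style: italic; }
```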

  9. CSS Animations with transform

Don’t animate elements by directly changing their width and height, or left/top/bottom/right. It’s preferable to use the transform property: it provides smoother transitions and makes your intentions easier to understand when reading the code. Transform and all of its many functions (translate, rotate, scale, etc.) have near-universal browser compatibility and can be used freely.
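For example, a hover lift effect animated entirely with transform:

```css
.card {
  transition: transform 0.3s ease;
}
.card:hover {
  transform: translateY(-4px) scale(1.02);  /* moved and scaled, no layout work */
}
```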

  10. Em, Rem, and Pixel

There is a lot of confusion about whether to use em, rem, or px values for setting the size of elements and text. The truth is, all three options are viable and have their pros and cons.

em – The value of 1em is relative to the font-size of the direct parent. Often used in media queries, em is great for responsiveness, but it can get really confusing to trace the exchange rate of ems into pixels for each element (1.25em of 1.4em of 16px = ?).

rem – Relative to the font-size of the <html> element, rem makes it really easy to scale all headings and paragraphs on the page. Leaving <html> at its default font size and setting everything else in rem is a great approach accessibility-wise.

px – Pixels give you precision but don’t offer any scaling when used in responsive designs. They are reliable, easy to understand, and provide a good visual connection between value and actual result.

Most of the time, em and rem can save you a lot of work, particularly when building responsive pages.
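A small example of the rem approach:

```css
html  { font-size: 16px; }     /* the single scaling knob */
h1    { font-size: 2rem; }     /* 32px */
p     { font-size: 1rem; }     /* 16px */
small { font-size: 0.875rem; } /* 14px */
```

Changing the root font-size rescales every rem-based value proportionally.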

  11. Validate

Validating CSS is not as important as validating HTML or JavaScript code, but running your code through a CSS linter can still be very useful. It will tell you if you’ve made any mistakes, warn you about bad practices, and give you general tips for improving the code.

Just like minifiers and autoprefixers, there are plenty of free validators available:

Online tools: W3 Validator, CSS Lint

Text editor plugins: Sublime Text, Atom

Conclusion

It is easy to get overwhelmed by the many different methodologies in modern CSS; think of them as possible tools to reach for when you face a sufficiently complex CSS codebase. You now have what you need to identify an effective visual layout and bring it to fruition, and by following these techniques you can avoid browser compatibility issues and many other common CSS fixes.

BigData Testing – Challenges, Processes & Best Practices

By Manoranjitham Vetriveeran

Big Data testing is a trending topic in the software industry; its various properties, such as volume, velocity, variety, variability, value, complexity and performance, pose many challenges. At the click of a button, we generate megabytes of data.

Testing such large collections of various data types ranging from tables to texts and images is a challenge.

In this article, we will look at what Big Data is, its characteristics and importance, the processes involved, the different aspects of testing it, the challenges, how to secure the data, and best practices.

Big Data – What is it?

Big Data is a collection of data sets so large and complex that they are difficult to process, do not fit well into tables, and respond poorly to manipulation by SQL.

Suppose we have a 100 MB doc which is difficult to send, or a 100 MB image which is difficult to view, or a 100 TB video which is difficult to edit. In any of these instances, we have a Big Data problem. Thus, Big data can be system specific.

Big Data is not only about the size of the data; it is also characterized by the 4 V’s:

  • Volume (scale of data)
  • Velocity (analysis of streaming data in microseconds)
  • Variety (different forms of data)
  • Veracity (certainty of data)

Big Data comes in different sizes and formats. Hence, the three different categories:

  • Structured
  • Unstructured
  • Semi-structured data

What are the aspects of Big Data Testing?

Strong test data and a robust QA environment are required to ensure error-free processing of data. Some of the key aspects of Big Data testing are as follows.

Big Data testing can be performed in two ways.

Functional Testing

Functional testing is performed to identify data issues caused by coding errors or node configuration errors, while non-functional testing focuses on performance bottlenecks and validates the non-functional requirements. Functional testing should include the four phases below.

Validation of pre-Hadoop processing

Data is extracted from various sources such as web logs, social media and RDBMS, and uploaded into HDFS (Hadoop Distributed File System; this article assumes the Hadoop ecosystem, but the approach applies to other Big Data ecosystems as well). Here we need to ensure the data is extracted properly and uploaded into the correct HDFS location, and validate the file partitioning and replication across data nodes.
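As a simple illustration of this stage, a record-count reconciliation between the source extract and the HDFS partitions might look like the following sketch (file contents and partition layout are hypothetical, and a real check would read from HDFS itself):

```python
# Minimal reconciliation check for the pre-Hadoop stage: confirm that
# the record count of the source extract matches the combined record
# count of the partitioned files loaded into HDFS. Data and layout
# below are hypothetical stand-ins for real files.

def count_records(lines):
    # Blank lines are not records.
    return sum(1 for line in lines if line.strip())

source = ["u1,click", "u2,view", "u3,click", ""]
hdfs_partitions = [
    ["u1,click", "u2,view"],   # part-00000
    ["u3,click"],              # part-00001
]

source_count = count_records(source)
loaded_count = sum(count_records(p) for p in hdfs_partitions)
assert source_count == loaded_count, "record counts diverged during load"
```

The same idea extends to checksums or per-key totals when a plain row count is too coarse.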

Validation of MapReduce Process

Test the business logic first on a single node and then on a set of nodes, to ensure valid generation of the “key-value” pairs. Validate the aggregation and consolidation of data after the reduce operation, then compare the generated output against the input files to ensure the output meets all the requirements.
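The essence of this check can be illustrated with a tiny in-memory word count: run the map and reduce steps, then compare the reduced key-value output against counts computed directly from the input.

```python
# Tiny in-memory MapReduce (word count) illustrating how the reduced
# "key-value" output can be validated against the raw input.
from collections import defaultdict
from itertools import groupby

def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield (word, 1)          # emit key-value pairs

def reduce_phase(pairs):
    shuffled = sorted(pairs)         # shuffle/sort: group pairs by key
    return {key: sum(v for _, v in group)
            for key, group in groupby(shuffled, key=lambda kv: kv[0])}

lines = ["big data big", "data test"]
result = reduce_phase(map_phase(lines))

# Validation: the aggregated output must match counts taken directly
# from the input files.
expected = defaultdict(int)
for line in lines:
    for word in line.split():
        expected[word] += 1
assert result == dict(expected)
```

On a cluster the same comparison is done between reducer output files and totals derived from the source data.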

Validation of Extract-Transform-Load Process

This is the last stage of testing, where the data generated by the previous stage is first unloaded and then loaded into the target repository system. Inspect the data aggregation to ensure there is no data corruption and that the data is loaded correctly into the target system.

Reports Validation

Report validation ensures that reports contain the required data and that all indicators are displayed correctly.

Non-Functional Testing

Performance Testing

This is performed to obtain metrics on response time, data processing capacity and speed of data consumption. It is conducted to assess the limiting conditions that cause performance problems.

Verify data storage at the different nodes and test the JVM parameters. Also test the values for connection timeout and query timeout.

Failover testing

Verify seamless processing of data in case of data node failure, and validate the recovery process when switching to other data nodes.

Big Data Testing – Challenges

Understanding the data and its impact on the business is the real challenge. Dealing with unstructured data drawn from sources such as tweets, text documents and social media posts is also among the biggest challenges.

Testers need to understand the business rules and the statistical correlations between different subsets of data. Major attention areas in Big Data testing include:

  • Data security and Scalability of the data storage media.
  • Performance issues and the workload on the system due to huge data volumes.

Big Data Testing – Best Practices / quick to-do list

To overcome these challenges, quality assurance and testing professionals must move ahead to understand and analyze them in real time. Testers must be capable of handling data structure layouts, processes, and data loads.

  • Testers must avoid a sampling approach. It may look easy and scientific, but it is risky. It’s better to plan load coverage at the outset and to consider deploying automation tools to access data across the various layers.
  • Testers need to derive patterns and learning mechanisms from drill-down charts and aggregate data.
  • Good experience with programming languages definitely helps with validation of the map-reduce process.
  • Ensure the right-time incorporation of changes in requirements. This calls for continuous collaboration and discussions with stakeholders.

How to Secure the Big Data

Big data applications work on data from different sources and data travels extraordinarily fast across the globe, so security testing is another important aspect.

Once we’ve got the data, the difficulty lies not only in analyzing the massive data lake to extract key insights; the security of this large volume of data must also be ensured while developing the application.

From a security point of view, there are numerous risks, such as unauthorized access, privilege escalation, lack of visibility and many more.

  • Unauthorized access can put sensitive and highly confidential data at risk of theft and loss. We must have a centralized control over big data.
  • Over-privileged accounts raise insider threats. Administrators shouldn’t have complete access to Hadoop clusters; rather than full access, they should be restricted to the specific commands and actions their role requires.
  • Organizations must establish trust by using Kerberos Authentication while ensuring conformity to predefined security policies.
  • To overcome NoSQL injection, start by hashing (or at least encrypting) passwords, and ensure end-to-end protection by encrypting data at rest with strong algorithms.
  • To detect unauthorized file modifications by malicious server agents, a technique called Secure Untrusted Data Repository (SUNDR) can be used.
  • In addition to antivirus software, organizations should use trusted certificates and connect only to trusted devices on their network by using a mobile device management solution.
  • The incorporation of Hadoop nodes, clusters, applications, services, and users into Active Directory permits IT to give users granular privileges based on their job function.
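
One of the points above, hashing passwords rather than storing them readably, can be sketched with Python’s standard library (a minimal salted-PBKDF2 example; production systems should prefer a dedicated scheme such as bcrypt or scrypt):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=100_000):
    """Derive a salted PBKDF2-SHA256 hash; store salt and iterations with it."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, stored_digest, iterations=100_000):
    """Recompute the hash and compare in constant time."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(digest, stored_digest)
```

The random per-user salt defeats precomputed rainbow tables, and the constant-time comparison avoids leaking information through timing.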

However, there is a solution to every possible problem that emerges. All we need to do is identify the effective and suitable solution.

Conclusion

Big data is advancing rapidly and is going to transform how we live, how we work, and how we think. To be successful, testers must learn the components of the Big data ecosystem from scratch. Applying the right test strategies and following best practices will help ensure quality software testing. The idea is to improve big data testing quality, which helps identify defects in the early stages and reduces overall cost.

Smart Dust – Microelectromechanical systems (MEMS)

By Saviour Nickolas Derel J

Imagine a world where wireless devices are as small as a grain of salt. These miniaturised devices have sensors, cameras and communication mechanisms to transmit the data they collect back to a base for processing. Today, you no longer need to imagine it: microelectromechanical systems (MEMS), often called motes, are real and they very well could be coming to a neighbourhood near you. Whether this fact excites you or strikes fear in you, it’s good to know what it’s all about.

What can smart dust do?  

Outfitted with miniature sensors, MEMS can detect everything from light to vibrations to temperature. They are tiny, dust-sized devices with extraordinary capabilities. With an incredible amount of power packed into a small size, MEMS combine sensing, an autonomous power supply, computing and wireless communication in a space that is typically only a few millimetres in volume. Being so small, these devices can stay suspended in an environment just like a particle of dust and are very hard to detect.

Smart Dust is useful for monitoring real-world phenomena without disturbing the original process; a mote can:

  • Collect data including acceleration, stress, pressure, humidity, sound and more from sensors
  • Process the data with what amounts to an onboard computer system
  • Store the data in memory
  • Wirelessly communicate the data to the cloud, a base or other motes

Working principle of Smart Dust

Smart Dust motes are run by microcontrollers, which contain tiny sensors for recording various types of data. Timers trigger these sensors, which do the job of collecting the data. The data obtained is stored in memory for further interpretation, and it can also be sent to the base control stations.
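
The sense-store-transmit cycle described above can be modelled in a few lines (a toy simulation; a real mote runs this on a microcontroller with a hardware timer driving each cycle):

```python
class Mote:
    """Toy model of a smart dust mote: sense, store, transmit."""

    def __init__(self, sensor):
        self.sensor = sensor  # callable returning one reading
        self.memory = []      # on-board storage

    def tick(self):
        """Timer-driven cycle: take a reading and store it in memory."""
        self.memory.append(self.sensor())

    def transmit(self):
        """Send stored readings to the base station and clear memory."""
        payload, self.memory = self.memory, []
        return payload
```

Buffering readings in memory and transmitting in batches mirrors how real motes conserve their tiny power budget, since radio (or CCR) transmission costs far more energy than sensing.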

A Corner Cube Retroreflector (CCR) has three mutually orthogonal planar mirrors. Any incident light within a solid angle is reflected back in the direction of the incident light. By moving one of the mirrors, the CCR can be used to modulate the incident ray of light at kilohertz rates. The microfabricated CCR includes an electrostatic actuator that can deflect one of the mirrors at a kilohertz rate, so an external light source can be transmitted back as a modulated signal at kilobits per second. It can transmit to the base station only when the CCR body diagonal happens to point directly towards it, within a few tens of degrees.

A passive transmitter can be made more omnidirectional by employing several CCRs oriented in different directions, though at the expense of increased dust-mote size.

Practical applications of smart dust

The potential of smart dust to collect information about any environment in incredible detail could impact plenty of things in a variety of industries from safety to compliance to productivity. It’s like multiplying the internet of things technology millions or billions of times over. Here are just some of the ways it might be used:

  • Monitor crops on an unprecedented scale to determine watering, fertilisation and pest-control needs.
  • Monitor equipment to facilitate more timely maintenance.
  • Identify weaknesses and corrosion prior to a system failure.
  • Enable wireless monitoring of people and products for security purposes.
  • Measure anything that can be measured, nearly anywhere.
  • Enhance inventory control with MEMS to track products from manufacturing facility shelves to boxes to palettes to shipping vessels to trucks to retail shelves.
  • Possible applications for the healthcare industry are immense from diagnostic procedures without surgery to monitoring devices that help people with disabilities interact with tools that help them live independently.
  • Researchers at UC Berkeley published a paper about the potential for neural dust, an implantable system to be sprinkled on the human brain, to provide feedback about brain functionality.

Disadvantages of smart dust

There are still plenty of concerns with wide-scale adoption of smart dust that need to be sorted out. Here are a few disadvantages of smart dust:

  • Privacy concerns – Many that have reservations about the real-world implications of smart dust are concerned about privacy issues. Since smart dust devices are miniature sensors, they can record anything that they are programmed to record. Since they are so small, they are difficult to detect. Your imagination can run wild regarding the negative privacy implications when smart dust falls into the wrong hands.
  • Control – Once billions of smart dust devices are deployed over an area it would be difficult to retrieve or capture them if necessary. Given how small they are, it would be challenging to detect them if you weren’t made aware of their presence. The volume of smart dust that could be engaged by a rogue individual, company or government to do harm would make it challenging for the authorities to control if necessary.
  • Cost – As with any new technology, the cost to implement a smart dust system that includes the satellites and other elements required for full implementation is high. Until costs come down, it will be technology out of reach for many.

What should you do to prepare?

According to researchers, the smart dust vision of monitoring every element of our earth will be highly beneficial to humankind. Certain organizations have already installed sensors in parking areas and on highways in San Francisco. The sensors are integrated with magnetometers to sense whether a large metal object is sitting on a specific spot, indicating the presence of a car.

The entities that have led the development of smart dust technology since 1992, and large corporations such as General Electric, Cargill, IBM, Cisco Systems and more that have invested in research into smart dust and viable applications, believe this technology will be disruptive to economies and our world.

At this moment many of the applications for smart dust are still in the concept stage. In fact, Gartner listed smart dust technology for the first time in its Gartner Hype Cycle in 2016. While the technology has forward momentum, there’s still quite a bit to resolve before you will see it impacting your organisation. However, it’s important to pay attention to its trajectory of growth, because it’s no longer the fodder of science fiction. We might not know when it will progress to the point of wide-scale adoption, but we certainly know it’s a question of when rather than if.