Journal ID : TRKU-10-08-2021-11475
[This article belongs to Volume - 63, Issue - 06]

Title : BIG DATA TESTING - CHALLENGES AND BEST PRACTICES

Abstract :

In recent times, the term ‘Big Data’ has been under the limelight due to its exponential increase in relevance and importance to small, medium, large and very large companies. Industries in divergent sectors such as education, health, agriculture and telecommunications are all leveraging the power of Big Data to enhance business prospects and improve customer experience. Testing such highly volatile data, which is unstructured and generated from myriad sources such as web logs, radio frequency identification (RFID) tags and sensors embedded in devices, is quite challenging. To derive maximum benefit in information processing and decision making from big data, the data must be of acceptable quality and reasonably usable in terms of interoperability, relevance and accuracy. However, such data quality can only be guaranteed if the systems from which the data are harvested are adequately tested, to ensure that their output exhibits minimum big data quality standards and characteristics. The methods adopted include test criteria and test cases that account for the volatility of big data and its underlying characteristics, which include, but are not limited to, Volume, Velocity, Veracity and Variety, together with tools for testing big data such as Hadoop and MapReduce. One of the most challenging endeavours for a tester is keeping pace with the changing dynamics of the industry. This work discusses the inherent challenges faced in big data testing and the respective best practices that can be adopted to enhance big data quality and accuracy.
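As a minimal sketch of the kind of MapReduce output validation the abstract alludes to, the following plain-Python example simulates a word-count job locally and checks its output against independently derived expected counts; the mapper, reducer and expected figures are hypothetical illustrations, not taken from the article, and no real Hadoop cluster is assumed.

```python
import unittest
from collections import defaultdict

def mapper(line):
    """Emit (word, 1) pairs for each word in an input line."""
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    """Sum the partial counts collected for one word."""
    return word, sum(counts)

def run_job(lines):
    """Simulate the shuffle phase locally, then reduce each key."""
    grouped = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            grouped[word].append(count)
    return dict(reducer(w, c) for w, c in grouped.items())

class WordCountJobTest(unittest.TestCase):
    def test_output_matches_expected_counts(self):
        # Process validation: the job's output must equal counts
        # computed independently from the same source data.
        lines = ["big data testing", "big data quality"]
        self.assertEqual(
            run_job(lines),
            {"big": 2, "data": 2, "testing": 1, "quality": 1},
        )

if __name__ == "__main__":
    unittest.main()
```

The same validate-against-expected-output pattern scales to cluster jobs: the test fixture holds a small, hand-checked input sample, and the assertion compares the job's aggregated output with the known-correct result.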

