Most code in current big data projects, and most of the code you are going to write, is JVM based (mostly Java and Scala). There is certainly a ton of R, Python, shell, and other languages in use, but for this tutorial we will focus on JVM tools.

The great thing about that is that Java and Scala static code analysis tools will work for analyzing your code. JUnit tests are great for testing the core logic and making sure you isolate functionality from Hadoop- and Spark-specific interfacing.

General Java Tools for Testing

Testing Hadoop (A Great Overview)

Example:

I have a Hive UDF written in Java that I can test via JUnit to ensure that the main functionality works.

(See: UtilTest)

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class UtilTest {

    /**
     * Test method for
     * {@link com.dataflowdeveloper.deprofaner.ProfanityRemover#fillWithCharacter(int, java.lang.String)}.
     */
    @Test
    public void testFillWithCharacterIntString() {
        // Filling five times with "X" should produce "XXXXX"
        assertEquals("XXXXX", Util.fillWithCharacter(5, "X"));
    }
}

As you can see this is just a plain old JUnit test, but it is one step in the process of making sure you can test your code before it is deployed. Jenkins and other CI tools are also great at running JUnit tests as part of their continuous build and integration process.

A great way to test your application is against a small Hadoop cluster or a simulated one. Testing against a Sandbox downloaded to your laptop works well too; see the mini-cluster sketch below.

Testing Integration with a Mini-Cluster
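As a minimal sketch of the mini-cluster approach, Hadoop's hadoop-minicluster test artifact provides MiniDFSCluster, which runs an in-process NameNode and DataNode; the path and file name below are just placeholders:

import static org.junit.Assert.assertTrue;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class MiniClusterTest {

    private MiniDFSCluster cluster;
    private FileSystem fs;

    @Before
    public void setUp() throws Exception {
        // Spins up an in-process NameNode and DataNode; no real cluster needed
        cluster = new MiniDFSCluster.Builder(new Configuration()).build();
        fs = cluster.getFileSystem();
    }

    @After
    public void tearDown() {
        cluster.shutdown();
    }

    @Test
    public void testWriteAndRead() throws Exception {
        // Placeholder path; write an empty file and verify it exists in HDFS
        Path path = new Path("/tmp/minicluster-test.txt");
        fs.create(path).close();
        assertTrue(fs.exists(path));
    }
}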

Testing Hbase Applications
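HBase ships HBaseTestingUtility for the same purpose. A minimal sketch, assuming the HBase 1.x client API and the HBase test jars; the table, column family, and values are placeholders:

import static org.junit.Assert.assertEquals;

import org.apache.hadoop.hbase.HBaseTestingUtility;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class HBaseMiniClusterTest {

    private HBaseTestingUtility utility;

    @Before
    public void setUp() throws Exception {
        utility = new HBaseTestingUtility();
        // Starts an in-process HBase (with HDFS and ZooKeeper) just for this test
        utility.startMiniCluster();
    }

    @After
    public void tearDown() throws Exception {
        utility.shutdownMiniCluster();
    }

    @Test
    public void testPutAndGet() throws Exception {
        // Placeholder table "t" with a single column family "f"
        Table table = utility.createTable(TableName.valueOf("t"), Bytes.toBytes("f"));
        Put put = new Put(Bytes.toBytes("row1"));
        put.addColumn(Bytes.toBytes("f"), Bytes.toBytes("q"), Bytes.toBytes("v"));
        table.put(put);
        byte[] value = table.get(new Get(Bytes.toBytes("row1")))
            .getValue(Bytes.toBytes("f"), Bytes.toBytes("q"));
        assertEquals("v", Bytes.toString(value));
    }
}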

Testing Apache NiFi Processors
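NiFi's nifi-mock module provides a TestRunner for exercising a processor without a running NiFi instance. In this sketch, MyProcessor and its REL_SUCCESS relationship are stand-ins for your own processor class:

import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;
import org.junit.Test;

public class MyProcessorTest {

    @Test
    public void testOnTrigger() {
        // MyProcessor is a placeholder for your own processor class
        TestRunner runner = TestRunners.newTestRunner(new MyProcessor());
        // Queue a FlowFile with sample content and run the processor once
        runner.enqueue("sample content".getBytes());
        runner.run();
        // Expect exactly one FlowFile routed to the success relationship
        runner.assertAllFlowFilesTransferred(MyProcessor.REL_SUCCESS, 1);
    }
}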

Testing Apache NiFi Scripts

Testing Oozie

Testing Hive Scripts

Testing Hive UDF

Use org.apache.hive.pdk.HivePdkUnitTest and org.apache.hive.pdk.HivePdkUnitTests in your Hive plugin so that it is included in unit tests.
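A minimal sketch of how those annotations look on a UDF class, following the pattern of the PDK example plugin; the function name, table, data path, and expected result here are placeholders:

import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hive.pdk.HivePdkUnitTest;
import org.apache.hive.pdk.HivePdkUnitTests;

@Description(name = "fill_with_character",
    value = "_FUNC_(n, str) - returns str repeated n times")
@HivePdkUnitTests(
    // Placeholder table and data file; the PDK runs these statements around the cases
    setup = "CREATE TABLE onerow (s STRING); "
          + "LOAD DATA LOCAL INPATH 'data/onerow.txt' OVERWRITE INTO TABLE onerow;",
    cleanup = "DROP TABLE IF EXISTS onerow;",
    cases = {
        @HivePdkUnitTest(
            query = "SELECT fill_with_character(5, 'X') FROM onerow;",
            result = "XXXXX")
    })
public class FillWithCharacter extends UDF {
    public String evaluate(Integer n, String str) {
        if (n == null || str == null) {
            return null;
        }
        StringBuilder builder = new StringBuilder();
        for (int i = 0; i < n; i++) {
            builder.append(str);
        }
        return builder.toString();
    }
}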

Testing Pig Scripts
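PigUnit lets you run a Pig script in local mode and assert on an alias's output. A minimal sketch, where wordcount.pig, the aliases, and the data are hypothetical:

import org.apache.pig.pigunit.PigTest;
import org.junit.Test;

public class WordCountPigTest {

    @Test
    public void testWordCount() throws Exception {
        // wordcount.pig is a hypothetical script that groups and counts words
        PigTest test = new PigTest("src/test/resources/wordcount.pig");
        String[] input = { "hello", "world", "hello" };
        String[] expected = { "(hello,2)", "(world,1)" };
        // Feeds 'input' into the alias "lines" and checks the alias "counts"
        test.assertOutput("lines", input, "counts", expected);
    }
}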

Testing Apache Spark Applications
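For Spark, running against a local[*] master inside a JUnit test keeps the logic testable without a cluster. A minimal sketch using the Spark Java API:

import static org.junit.Assert.assertEquals;

import java.util.Arrays;

import org.apache.spark.api.java.JavaSparkContext;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class SparkLocalTest {

    private JavaSparkContext sc;

    @Before
    public void setUp() {
        // local[*] runs Spark in-process using all available cores
        sc = new JavaSparkContext("local[*]", "SparkLocalTest");
    }

    @After
    public void tearDown() {
        sc.stop();
    }

    @Test
    public void testDistinctCount() {
        // Two distinct values in ("a", "b", "a")
        long distinct = sc.parallelize(Arrays.asList("a", "b", "a")).distinct().count();
        assertEquals(2L, distinct);
    }
}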

Testing Apache Storm Applications
