Big Data Engineer

Brief Description: Developers in this role will work directly with analysts in a purposeful, mission-focused effort to develop big data analytic tools aimed at turning raw data into meaningful intelligence for the warfighter. The Big Data Engineer will work within a small software development team to build scalable, predictable, high-quality, high-performance tools that process, correlate, and disseminate large volumes of data for specific mission needs using open-source technology. The developer will work in an Extreme Programming environment to quickly automate and refine existing mission processes. Development will occur in short Agile cycles, in spaces co-located with analysts. For the right candidate, this is an opportunity to join a senior big data development team and learn new techniques in state-of-the-art cloud environments while contributing significantly with an existing technical skill set. Some telework is allowed.

Description: Be part of a software development team building big data prototypes on AWS in support of a critical mission requirement. Use critical thinking and a variety of software tools and languages to tackle new challenges each day. Interact in a collaborative, rapid-feedback environment to deliver process improvements to the end user. Strong collaboration and communication skills are required; frequent interaction between analysts and engineers is key to this opportunity. The ideal candidate has experience collecting and processing large volumes of data in various formats, both structured and unstructured, and the ability to understand and translate technical jargon for mission users.

Required knowledge, skills, and abilities:
- Experience in full-stack application development using Java or Groovy (mid-level and senior positions available)
- Experience interrogating various data stores, such as Accumulo, MongoDB, MySQL, and Oracle, a plus
- Experience with one or more scripting languages, such as Python or Perl, a plus
- Experience with search technologies such as Elasticsearch, Solr, and Lucene a plus
- Experience sharing and disseminating search results using tools such as AngularJS, jQuery, Grails, Java, and REST web services a plus
- Experience creating MapReduce jobs to process large amounts of data with Hadoop and related components (HDFS, HBase, Hive, Sqoop, Kafka, Storm, etc.) a plus; Cloudera experience is also a plus
- Knowledge of Amazon Web Services (AWS)
- Experience creating data models to translate data in various formats for ingestion into an enterprise data store
- Familiarity with Extreme Programming techniques; proven experience with the technique is a plus
- Experience maintaining continuous test and deployment environments to support rapid application development
- Strong knowledge of software best practices for enhancing the security posture of an application

Credentials and Experience:
- US citizenship required
- Bachelor of Science in Computer Science or a related major from an accredited university
- Minimum of two years' experience in a technical environment exhibiting the knowledge, skills, and abilities above
- Awareness of cloud technologies

Desired Skills:
- The selected candidate will be submitted for a TS/SCI; an active TS with SCI is nice to have

Get on board! Evans & Chambers is a progressive IT solutions provider located in Washington, DC. We offer excellent salaries and benefits in a fun and energized work environment. We are committed to rewarding goal-oriented professionals who enjoy meeting challenges head-on.

http://www.evanschambers.com