As a Senior Data Engineer in the Aero Services organization, you will be responsible for designing, building, and maintaining our Big Data platform and the massive data stores on which our data analytics applications and tools run. Key to the growth of the Aero Services organization will be our ability to monetize critical data produced by our suite of Aerospace products.

Main Responsibilities:
- Design, develop, and maintain a Big Data platform that is fault-tolerant and scalable
- Design, develop, and maintain cross-platform ETL processes, and maintain dimensions and reference lookup dictionaries
- Develop guidelines, standards, and processes to ensure the highest data quality and integrity in the data stores residing on the Big Data platform
- Participate in setting strategy and standards through data architecture and implementation, leveraging big data and analytics tools and technologies
- Work closely with data scientists and product managers to understand their data requirements for existing and future data analytics applications
- Work with IT and data owners to understand the types of data collected in various databases and data warehouses, and define the migration strategy for moving existing data into the Big Data platform

Additional Attributes:
- Keen business acumen to recognize and recommend cost-effective, scalable platform solutions that best meet our business needs
- Ability to execute projects using an agile approach in a multi-disciplinary, matrixed environment
- Comfortable working in a dynamic research and development environment with several concurrent projects
- Enjoys exploring and learning new technologies

You Must Have:
- Bachelor's degree in computer science, IT, engineering, or another relevant field, with a minimum of 5 years of data management experience
- Minimum of 3 years of experience designing, deploying, and supporting Big Data systems and solutions
- Minimum of 3 years of experience migrating data from data sources (MS SQL, Oracle, MySQL, etc.) into a Hadoop platform using Hadoop frameworks (Spark, Hive, Pig, Sqoop, Flume, etc.)
- Minimum of 2 years of experience with scripting languages (Perl, Python, Java, etc.)
- Minimum of 2 years of experience with NoSQL solutions (HBase, Cassandra, MongoDB, CouchDB, etc.) and managing unstructured data
- Must be a US citizen due to contractual requirements

We Value:
- Master's or PhD degree in computer science, IT, engineering, or a relevant field
- Certification in Hadoop and other big data tools and technologies
- Experience with open source data processing frameworks
- Experience with data management on public cloud hosting services
- Experience with predictive analytics
- Experience with web analytics and managing social media data streams
- Experience with Agile software development methodology
- Ability to work in a fast-paced, ambiguous environment
Honeywell's innovative technologies are making our world cleaner, more sustainable, more secure, more connected, more energy efficient, and more productive.
Honeywell invents and manufactures technologies that address some of the world's most critical challenges around energy, safety, security, productivity and global urbanization. We are uniquely positioned to blend physical products with software to support connected systems that improve homes, buildings, factories, utilities, vehicles and aircraft, and that enable a safer, more comfortable and more productive world. Our solutions enhance the quality of life of people around the globe and create new markets and even new industries.
Location: Georgia, United States
Salary: $10k - $15k