Job Description:
We are seeking a passionate data engineer to develop a robust, scalable data model and optimize the consumption of data sources required to ensure accurate and timely reporting for Amazon's businesses.
You will share ownership of the technical vision and direction for advanced reporting and insight products. You will work with top-notch technical professionals developing complex systems at scale, with a focus on sustained operational excellence.
Key Responsibilities:
* Design, implement, and support a platform providing secured access to large datasets.
* Interface with tax, finance, and accounting customers, gathering requirements and delivering complete BI solutions.
* Collaborate with Finance Analysts to recognize and help adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
* Model data and metadata to support ad-hoc and pre-built reporting.
* Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards that drive key business decisions.
* Tune application and query performance using profiling tools and SQL.
* Analyze and solve problems at their root, stepping back to understand the broader context.
* Learn and understand a broad range of Amazon's data resources, and know when, how, and which of them to use, and which not to use.
* Keep up to date with advances in big data technologies and run pilots to design a data architecture that scales with growing data volumes on AWS.
* Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
* Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.
BASIC QUALIFICATIONS
* 1+ years of data engineering experience
* Experience with SQL
* Experience with data modeling, warehousing, and building ETL pipelines
* Experience with one or more query languages (e.g., SQL, PL/SQL, MDX, HiveQL, Spark SQL)
* Experience with one or more scripting or programming languages (e.g., Python, Scala, KornShell)
PREFERRED QUALIFICATIONS
* Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
* Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage