- Professional
- Office in Memphis
Job Description:
The Data Engineer is responsible for designing, developing, documenting, and supporting standardized and/or customized solutions and reporting from Evolve’s various data sources.
Main Job Tasks & Responsibilities:
- Plan data sourcing, integration, transformation, and extraction processes by developing common definitions of sourced data.
- Design common keys in physical data structure.
- Establish data integration and mapping specifications; examine data applications, data models, and data warehouse schemas.
- Determine best-fit data interchange methods.
- Assess middleware tools for data integration, transformation, and routing; develop project scope and specifications.
- Identify factors that negatively impact integration.
- Forecast resource requirements and establish delivery timetables.
- Design and build complete data engineering solutions to ingest, move, transform, deliver, and present data for operational use. Engineer for resiliency, performance, and observability.
- Validate development work by creating and executing test plans and scenarios covering data design, tool design, data extract/transform, networks, and hardware.
- Present data in a way that is most meaningful and readily accessible to the requestor using a combination of application solutions and system reports.
- Evaluate requirements and select the right tools, including Databricks notebooks, Azure Data Factory pipelines, T-SQL stored procedures, and other means as appropriate, to fulfill the requirements efficiently and effectively.
- Demonstrate outstanding communication skills to translate requests and accurately meet users' actual information and deadline needs.
- Educate requestors on appropriate and desirable parameters to ensure they get the information they need.
- Collaborate with end users to gather requirements and ensure proper testing and validation. Provide ongoing support to end users on data solutions.
- Work closely with IT and InfoSec to ensure compliance with security requirements.
Education & Experience:
- Bachelor’s degree with 5-7 years of relevant working experience in data engineering, especially in the Big Data and Microsoft/Azure space.
- Training and direct experience in data engineering using tools such as Databricks, PySpark, Python, Parquet, T-SQL, Azure Data Factory, SQL Server Agent, Azure Synapse, Power BI, and Excel.
- Expert-level knowledge of and extensive working experience with Microsoft SQL Server, including writing SQL and T-SQL queries and stored procedures.
- Working knowledge of database architecture and design, data warehousing, data extraction, data modeling, and analysis.
- Extensive working experience with the Azure environment is a must, including Azure tools such as Azure Data Factory, Logic Apps, Function Apps, and Azure Storage.
- Experience with NoSQL databases such as Azure Cosmos DB and MongoDB is a plus.
Key competencies:
- Communicate effectively, verbally and in writing, with technical and non-technical users.
- Able to handle detail-oriented work while meeting schedules and deadlines.
- Ad hoc reporting, requirements analysis, database management, business intelligence tools, and data modeling.