Data Scientist Intern at Plymouth Rock
Plymouth Rock · Woodbridge, United States · Hybrid
- Junior
- Optional office in Woodbridge
DESCRIPTION
As a Data Scientist Intern, you will work on cutting-edge analytical and data engineering projects that drive measurable business impact across pricing, underwriting, marketing, and claims.
This internship is ideal for a technically curious, motivated problem-solver who wants hands-on data science experience.
RESPONSIBILITIES
- Support the design, construction, and optimization of robust data pipelines to enable machine learning and analytical modeling.
- Contribute to the design and implementation of data and ML workflows using orchestration tools such as Dagster, Airflow, or similar frameworks.
- Help implement data quality checks, validation routines, and monitoring for automated data workflows.
- Assist in organizing and managing internal GitHub repositories to standardize ML project structures and best practices.
- Collaborate with data scientists and engineers to automate the ingestion, transformation, and delivery of data for model development.
- Contribute to initiatives migrating analytical processes into cloud-based data lake architectures and modern platforms such as AWS or Snowflake.
- Develop reusable and well-tested code to support analytical pipelines and internal tools using Python and SQL.
- Conduct data mining, cleansing, and preparation tasks to build high-quality analytical datasets.
- Participate in model development, including data profiling, model training, validation, and interpretation.
- Build and evaluate predictive models that enhance profitability through improved segmentation and estimation of insurance risk.
- Assist in studies evaluating new business models for customer segmentation, retention, and lifetime value.
- Collaborate with business leaders to translate insights into operational improvements and cost efficiencies.
QUALIFICATIONS
- Currently pursuing or recently completed a Master’s in Data Science, Computer Science, Statistics, Economics, or related field.
- Proficiency in Python (Pandas, NumPy, Scikit-learn, XGBoost, or PyTorch) and SQL.
- Understanding of data engineering concepts, ETL/ELT workflows, and machine learning deployment.
- Exposure to workflow orchestration tools (e.g., Airflow, Dagster, Prefect) and Git/GitHub for collaborative development.
- Familiarity with Docker, CI/CD pipelines, and infrastructure-as-code tools such as Terraform preferred.
- Knowledge of AWS cloud services such as S3, Lambda, EC2, or SageMaker a plus.
- Experience with common modeling techniques (e.g., GLM, tree-based models, Bayesian statistics, NLP, deep learning) through coursework or projects.
- Strong analytical, communication, and problem-solving skills.
- A self-starter mindset, with attention to detail and enthusiasm for learning new technologies.
SALARY RANGE
The pay for this position is $35 per hour.
ABOUT THE COMPANY
The Plymouth Rock Company and its affiliated group of companies write and manage over $2 billion in personal and commercial auto and homeowner’s insurance throughout the Northeast and mid-Atlantic, where we have built an unparalleled reputation for service. We continuously invest in technology, our employees thrive in our empowering environment, and our customers are among the most loyal in the industry. The Plymouth Rock group of companies employs more than 1,900 people and is headquartered in Boston, Massachusetts. Plymouth Rock Assurance Corporation holds an A.M. Best rating of “A-/Excellent”.
Apply now