At Butler/Till, we take immense pride in our independent, women-owned and led status, our unwavering commitment to a purpose-driven approach, and our unique structure as a 100% employee-owned company.
Our dedication to Diversity, Equity, Inclusion, and Belonging (DEIB) is a cornerstone of our culture. We believe that the
diversity and inclusivity of our workforce are sources of strength. As you become part of our community, you’ll discover
that we are dedicated to creating a positive impact, not only for our clients but also for the communities where we live and
work.
SUMMARY
The Data Engineer is a hybrid role, combining the core competencies of data engineering with business intelligence
techniques to support our client’s reporting needs with high agility. The role works closely with information analysts to plan
and support the creation of reports, analyses, extracts, and data marts that feed directly into Power BI. This position
concentrates on SQL, Python, and data processing within cloud environments (Snowflake, Azure), but also requires an
understanding of both Power BI and the marketing platforms from which the source data derives, for quality assurance
purposes. This role works to automate ad hoc processes, document ETL steps, and manage database schemas, and acts as a
mentor to junior-level staff, providing training, code reviews, and feedback to improve their deliverables.
IMPROVED ORGANIZATION & PRODUCTIVITY
Outcome: Effective data management fuels organization, productivity, and automation. It gives internal consumers
access to trusted data for running queries, developing strategic insights, and delivering standardized outputs,
adding value to our external customers and improving the bottom line.
• Collaborates with the Director, and with internal and external stakeholders, to capture requirements,
architect, scope, standardize, and document Butler/Till’s data frameworks.
• Writes SQL scripts for the creation, support, and automation of standardized, scalable, and client-specific data
marts.
• As opportunities arise, works with other departments to create solutions to improve agency efficiency.
• Works closely with B/T’s Analysts, Media, and Account teams to understand our clients’ data needs and
the standardization of campaign/ad naming conventions across clients.
• Maintains documentation on existing data policies, database schemas, metadata, and ETL processes, and
creates proof-of-concept (POC) designs for strategic initiatives.
INCREASED RELIABILITY
Outcome: Data Engineering minimizes potential errors by establishing processes and policies for usage and building trust
in the data being used to make decisions across the organization. With reliable, up-to-date data, our teams can respond
more efficiently to market changes and customer needs.
• Supports QA processes for data, acting as the first line of defense for data integrity.
• Partners with Software Engineering and Data Management to resolve tickets and process files.
• Maintains data flows in third-party applications that monitor data inconsistencies and deficits and alert the
team for corrective action.
• Acts as a subject matter expert on B/T’s data marts and the marketing platforms (Google, Facebook, etc.),
and on how that data is utilized in marketing analytics.
• Provides guidance to junior-level Data Engineers on QA best practices through live code reviews.
REDUCED RISK
Outcome: Effective data engineering protects our enterprise from data losses, thefts, and breaches with authentication
and encryption tools. Strong data security ensures that vital company information is backed up and retrievable should
the primary source become unavailable. Security becomes even more important when data contains personally identifiable
information, which must be carefully managed to comply with consumer protection laws.
• The Data Engineer is familiar with data collection regulations, both national and international (e.g., the
California Consumer Privacy Act (CCPA) and GDPR).
• Follows policies and standards for all supported data processes.
• Compiles and analyzes data, processes, and database code to troubleshoot problems and identify areas
for improved security.
SCALABILITY
Outcome: Data management allows organizations to effectively scale their usage of data with repeatable processes
that keep data and metadata up to date. When processes are easy to repeat, organizations save time and money and
gain efficiency.
• Prepares data for analytics and modeling, working closely with Information Analysts to create client-specific
data marts that are easily repeatable for new and existing clients.
• Works in no/low-code solutions (such as Microsoft Power Automate and Excel), SQL, and Python to support
scalable, full-lifecycle solutions that meet business needs.
• Researches opportunities to enhance the existing tech stack and creates well-documented requirements for
internal or external stakeholders.
• Tests new software functionality (internal or external) and works with data engineering colleagues toward
agency-wide recommendations on future-proofing B/T’s offering.
• Works with internal stakeholders to determine current and future needs and brings them to data
engineering colleagues for future projects.
REQUIRED
• Bachelor’s degree in Computer Science, Information Technology, Statistics, Mathematics,
Finance, or Economics.
• 3+ years of data engineering experience.
• Detail-oriented with strong organizational, communication, computer, and office procedure
skills.
• Excellent knowledge of SQL, database schemas, and analytics data marts.
• Experience working with Snowflake with admin-level permissions.
• Experience with data discovery, analytics, and BI software (e.g., Tableau, Qlik, Power BI).
• Proficiency with Git and version-control platforms (e.g., GitHub, GitLab, Bitbucket).
• Familiarity with object-oriented programming languages (e.g., Python) and willingness to
expand skills.
• Ability to understand and anticipate risks/dependencies across deliverables.
• Willingness to continuously learn and develop skill sets.
PREFERRED
• Experience mentoring junior-level resources.
• Familiarity with SaaS systems (e.g., Datorama) and cloud environments (Azure, AWS, etc.).
• Passion for problem-solving and providing workable solutions.
• Experience working with both normalized and unnormalized data structures.
• Strong analytical and reasoning skills with an ability to visualize processes and outcomes.
CORE COMPETENCIES
• CUSTOMER FOCUS/CLIENT INTIMACY: seeking to understand client business challenges/needs and delivering
continuous value to our clients
• COLLABORATIVE: working with teams and across the organization with ease
• OWNER AGILITY: able to continuously learn and quickly adapt to changing circumstances
• RESULTS DRIVEN: accepts accountability to deliver business outcomes, even under changing circumstances. Delivers
on commitments
• DISCIPLINED: thinks, plans, and prioritizes work on an ongoing basis and aligns with key team members before
acting