Job Description

Description:

Responsible for the development and integration of new or existing applications into the technical infrastructure and existing business processes.

Provides technical or functional guidance to project or work teams as needed within a specific discipline.

Collaborates on an ongoing basis with the Business Systems Analyst. Analyzes, designs, develops, tests, debugs, implements, maintains, and/or enhances existing or new systems/web pages that are reliable and efficient.

Responsible for writing and updating full-stack applications on existing hardware and within existing business processes. Performs Power BI dashboard development, SQL data structuring, API integrations, workflow automation, and cloud development.

Experience with Agile methodology, the software development lifecycle, and Scrum ceremonies.



Education:

Bachelor's degree in Computer Science, Information Systems, or a related field required.

Certifications in AWS, Microsoft Power BI, Salesforce, IT project management (PMP, Agile, SAFe), or DevOps methodologies preferred.



Relevant Experience:

6-10 years of experience in application development, cloud computing, and enterprise software architecture.



Responsibilities:

- Lead development and integration efforts for new or existing applications.

- Design, develop, and maintain Power BI dashboards, SQL databases, or full-stack applications.

- Troubleshoot complex system issues and provide solutions.

- Mentor junior developers and assist in Agile project planning.

- Ensure compliance with industry standards, security policies, and best practices.



Other:

- Proficiency in SQL, Python, Power BI, Java, .NET, AWS, and cloud-based technologies.

- Knowledge of DevOps practices, cloud automation, ERP integration, CI/CD pipelines, and microservices architecture.

- Experience with SCADA operations, Power BI, AWS Glue, Lambda, SNS, and SQS.

- Understanding of security, compliance, and regulatory requirements like SOX and cybersecurity best practices.



Enable Skills-Based Hiring
No

Furlough Notification

All NextEra Energy Contingent Workforce Program (CWP) assignments are eligible for worker furlough. Typical furlough schedules coincide with select national holidays, but may be subject to change. Suppliers will be notified by the CWP of those workers impacted and the applicable furlough dates prior to each furlough period.

Worker Building Location

JRA - James Robo A Building - (phone number removed)

Will driving be required as part of position duties/work?
No

Driving Record Validation

For all positions indicating driving requirements, supplier must hold validation of a non-restricted current driver's license and demonstrate the following: no alcohol/drug-related driving offenses within the previous five years and/or the license is not currently suspended or restricted related to hours of driving or reason for driving.

Additional Job Details

(No Value)

Will Per Diem and Mob/De-Mob expense types be available for this requisition?
No

If Per Diem is available, please indicate the maximum amount:
0

If Nuclear Business Unit: On-Boarding Note

Nuclear workers requiring unescorted badge access will follow onsite in-processing procedures. All others will be required to complete Non-Nuclear pre-assignment screenings through their staffing supplier. Please contact CWP with any questions: (url removed) or (phone number removed).

Will the selected worker require unescorted badge access into Nuclear protected areas?
No

Is NERC CIP unescorted physical or cyber access required for this assignment?
No

Which NERC access is needed?
N/A

Is Federal Energy Regulatory Commission access required?
No
Attachment:



Job Posting: Database Analyst / Data Quality & Requirements Specialist (SQL + Excel)

About the Role

We're looking for a database-focused analyst with strong SQL (PostgreSQL) skills and a sharp eye for detail. This person is naturally curious, asks "why?" before "how?", and isn't afraid to challenge assumptions to protect data integrity and improve processes. You'll manage, update, and audit databases, partner with stakeholders to gather requirements, and help ensure our data is accurate, reliable, and usable.

This is a great fit for someone who blends hands-on database skills with a business analyst mindset—someone who can independently clarify needs, translate them into data logic, and validate results end-to-end.



Key Responsibilities


  • Maintain, update, and audit relational databases (PostgreSQL), ensuring accuracy, consistency, and traceability.

  • Write and optimize SQL queries, including multi-table joins, aggregations, and validation checks.

  • Perform data quality checks, reconcile discrepancies, and document root causes and fixes.

  • Build and maintain Excel-based audit tools (pivots, lookups, Power Query as applicable) for reporting and verification.

  • Partner with internal users to gather requirements, challenge unclear requests, and translate business needs into data definitions and logic.

  • Create and maintain documentation: table definitions, field mappings, audit results, and change logs.

  • Support process improvements and controls around data updates, permissions, and governance.
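To make the validation-check responsibilities above concrete, here is a minimal sketch using Python's built-in sqlite3 module as a portable stand-in for PostgreSQL. The assets table, its columns, and the sample rows are hypothetical, chosen only to illustrate duplicate-key and missing-field checks.

```python
import sqlite3

# Hypothetical schema, for illustration only: an assets table audited
# for duplicate IDs and missing required fields.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE assets (asset_id INTEGER, name TEXT, site TEXT);
    INSERT INTO assets VALUES
        (1, 'Pump A', 'Site 1'),
        (2, 'Pump B', NULL),
        (2, 'Pump B', NULL);  -- duplicated row: same asset_id twice
""")

# Validation check 1: IDs that appear more than once.
dupes = conn.execute("""
    SELECT asset_id, COUNT(*) AS n
    FROM assets
    GROUP BY asset_id
    HAVING COUNT(*) > 1
""").fetchall()

# Validation check 2: rows missing a required field.
missing_site = conn.execute(
    "SELECT COUNT(*) FROM assets WHERE site IS NULL"
).fetchone()[0]

print(dupes)         # [(2, 2)]
print(missing_site)  # 2
```

Both queries are standard SQL and would run unchanged against PostgreSQL through a driver such as psycopg2.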





Required Qualifications


  • Strong SQL experience, including PostgreSQL (or equivalent with ability to ramp quickly).

  • Solid understanding of relational database fundamentals (keys, constraints, normalization concepts, data integrity).

  • Demonstrated experience managing, updating, and auditing datasets/databases.

  • Advanced Excel skills (filters, pivot tables, XLOOKUP/VLOOKUP, data validation; Power Query is a plus).

  • Demonstrated experience performing data audits and documenting findings, remediation steps, and outcomes (repeatable + traceable work).

  • Comfortable owning ambiguous requests and independently questioning stakeholders to gather requirements and confirm data definitions before building.

  • Can explain logic and assumptions clearly to both technical and non-technical stakeholders.

  • Shows critical thinking, asks "why," identifies risks and edge cases, and proposes better approaches rather than executing blindly.

  • Proven ability to investigate inconsistencies, ask probing questions, and validate assumptions.

  • Strong written and verbal communication; comfortable working directly with stakeholders.





Nice-to-Have Qualifications


  • Mechanical or engineering background/familiarity (or experience supporting engineering/asset-heavy environments).

  • Familiarity with AVEVA PI (PI System / PI Data Archive / PI Vision) or time-series data concepts.

  • Familiarity with IBM Maximo (asset management / work orders / equipment hierarchies).

  • Familiarity with AWS (RDS, S3, Athena/Glue, IAM concepts, basic cloud data patterns).

  • Experience acting as a BA: independently gathering requirements, defining acceptance criteria, mapping data sources to outputs.

  • Familiarity with tools like Quest (e.g., Toad) or data access/auditing tools.


Interview / Screening Questions (4)

1) SQL Join Question (cardinality + correctness)

Question:

You have two tables: A (a list of entities) and B (optional attributes about those entities). Some entities in A have no match in B.


  • Which join(s) would you use to return all entities from A and whatever exists in B?

  • How would you confirm you didn't accidentally drop records?


Follow-up: What result differences would you expect between INNER JOIN and LEFT JOIN in this scenario?
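A sketch of the answer being probed for, using Python's sqlite3 module for portability. The tables a and b come from the question; the sample rows are illustrative. A LEFT JOIN preserves every entity in A with NULLs where B has no match, an INNER JOIN silently drops the unmatched entities, and a row-count tie-out confirms no records were lost.

```python
import sqlite3

# Tables A and B from the question: every entity lives in a, optional
# attributes live in b. IDs 2 and 3 deliberately have no match in b.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE a (id INTEGER PRIMARY KEY);
    CREATE TABLE b (id INTEGER, attr TEXT);
    INSERT INTO a VALUES (1), (2), (3);
    INSERT INTO b VALUES (1, 'x');
""")

# LEFT JOIN keeps every entity from a, with NULL where b has no match.
left_rows = conn.execute("""
    SELECT a.id, b.attr FROM a LEFT JOIN b ON a.id = b.id ORDER BY a.id
""").fetchall()

# INNER JOIN silently drops entities with no match in b.
inner_rows = conn.execute("""
    SELECT a.id, b.attr FROM a INNER JOIN b ON a.id = b.id ORDER BY a.id
""").fetchall()

# Record-drop check: with unique keys in b, the LEFT JOIN result must
# have exactly as many rows as a itself.
count_a = conn.execute("SELECT COUNT(*) FROM a").fetchone()[0]

print(left_rows)                  # [(1, 'x'), (2, None), (3, None)]
print(inner_rows)                 # [(1, 'x')]
print(len(left_rows) == count_a)  # True
```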



2) Auditing Question (data integrity mindset)

Question:

You're auditing an equipment status history table: equipment_status(equipment_id, status, status_start_time, status_end_time).

What checks would you run (and how) to detect:

  • overlapping status periods

  • missing/invalid equipment IDs (referential integrity)


(Look for: systematic checks, edge cases, "prove it with queries" approach.)
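A sketch of the queries a strong answer would produce, again using Python's sqlite3 module; the equipment parent table and the sample rows are assumptions added for illustration. Two intervals overlap exactly when each starts before the other ends, and orphaned IDs fall out of a LEFT JOIN anti-join against the parent table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE equipment (equipment_id INTEGER PRIMARY KEY);
    INSERT INTO equipment VALUES (10), (11);
    CREATE TABLE equipment_status (
        equipment_id INTEGER, status TEXT,
        status_start_time TEXT, status_end_time TEXT);
    INSERT INTO equipment_status VALUES
        (10, 'RUN',  '2024-01-01 00:00', '2024-01-01 08:00'),
        (10, 'DOWN', '2024-01-01 07:00', '2024-01-01 09:00'),  -- overlap
        (99, 'RUN',  '2024-01-01 00:00', '2024-01-01 01:00');  -- bad id
""")

# Check 1: overlapping periods for the same equipment. Self-join each
# row against later rows; two intervals overlap when each one starts
# before the other ends.
overlaps = conn.execute("""
    SELECT s1.equipment_id
    FROM equipment_status s1
    JOIN equipment_status s2
      ON s1.equipment_id = s2.equipment_id
     AND s1.rowid < s2.rowid
     AND s1.status_start_time < s2.status_end_time
     AND s2.status_start_time < s1.status_end_time
""").fetchall()

# Check 2: status rows whose equipment_id has no parent record
# (referential integrity via LEFT JOIN anti-join).
orphans = conn.execute("""
    SELECT s.equipment_id
    FROM equipment_status s
    LEFT JOIN equipment e ON s.equipment_id = e.equipment_id
    WHERE e.equipment_id IS NULL
""").fetchall()

print(overlaps)  # [(10,)]
print(orphans)   # [(99,)]
```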



3) Critical Thinking Question (requirements + pushback)

Question:

A stakeholder asks: "Give me a list of active assets with bad data by tomorrow."

What clarifying questions do you ask before writing SQL, and what would your acceptance criteria be?

(Look for: defining "active," defining "bad data," identifying source of truth, scope, timeframe, intended use, and validation plan.)



4) Experience Deep Dive (real example + hardest issue)

Question:

Tell me about a time you built a database/table/reporting dataset or ran a formal audit.


  • What was the goal and data sources?

  • What did your validation approach look like (row counts, tie-outs, spot checks, etc.)?

  • What was one of the biggest issues you faced (e.g., duplicate amplification, unclear definitions, missing keys, out-of-order timestamps), and how did you resolve it?


(Look for: concrete story, ownership, rigor, and lessons learned, not vague "we fixed it.")

 
