
AECOM Behavioral Interview for Engineering — Product Designer
This behavioral interview reflects common AECOM practices and candidate reports: a structured, STAR-driven conversation emphasizing client impact, collaboration in a global matrix, safety/ethics/compliance, and digital excellence applied to infrastructure. It is tailored for an Engineering Product Designer building internal and client-facing digital tools (e.g., planning/design platforms, data/GIS/BIM-integrated solutions) that support transportation, buildings, water, energy, and environment programs. Format and flow (approximate): - 0–5 min: Introductions, role context, AECOM purpose (delivering a better world), and expectations for STAR answers. - 5–45 min: 6 core behavioral prompts with targeted follow-ups (see below). - 45–55 min: Short scenario about trade-offs in a regulated/public-sector setting (values, compliance, and stakeholder alignment). - 55–60 min: Candidate questions; alignment on EDI, sustainability/ESG commitments, and global teaming. Focus areas specific to AECOM: 1) Client and stakeholder management in regulated environments - How you navigate public-sector procurement constraints, evolving scope, and change control with PMs and discipline leads. - Sample: “Tell me about a time you aligned diverse stakeholders (engineers, client PMO, legal) around a product decision under tight deadlines. How did you document the decision trail?” 2) Collaboration in a global, multi-disciplinary matrix - Working across time zones with engineers, planners, UX researchers, data/GIS/BIM teams, and construction managers; clear handoffs and design ops. - Sample: “Describe a complex handoff where design decisions affected downstream engineering. What artifacts and rituals kept everyone aligned?” 3) Safety, ethics, inclusion, and compliance - Considering HSE principles, accessibility (e.g., WCAG/Section 508), privacy/security, and equity outcomes for communities. - Sample: “Give an example where you balanced usability with security or compliance requirements. How did you mitigate risk without degrading UX?” 4) Digital expertise for infrastructure contexts - Designing products that integrate datasets, GIS/BIM, or digital twins; working with legacy tools and stringent IT/security standards. - Sample: “Walk through a time you designed for highly technical users (e.g., engineers). How did you validate workflows and address domain-specific constraints?” 5) Outcome orientation and measurable impact - Adoption/change management, training, documentation, and KPIs (e.g., time saved in design review, error reduction, data quality, client satisfaction). - Sample: “Share a product you shipped that improved delivery outcomes. What metrics did you set pre-launch and how did they inform iteration?” 6) Ambiguity, judgment, and trade-offs - Reconciling policies, standards, and community impact; making principled trade-offs and communicating rationale. - Sample: “Describe a high-ambiguity discovery effort where policy/standards conflicted with user needs. How did you resolve it?” 7) Leadership without formal authority - Influencing SMEs and senior stakeholders; pushing back constructively; facilitating decisions. - Sample: “Tell me about a time you disagreed with a senior stakeholder on a release decision. What was your approach and outcome?” 8) Learning mindset and continuous improvement - Retrospectives, lessons learned, and scaling practices via design systems, playbooks, and templates. 
- Sample: “How have you institutionalized what you learned from a challenging engagement so future teams benefit?”

Scenario segment (10 min example):
- “A city client asks you to accelerate the release of a field-inspection product that integrates sensitive asset data. Security review is incomplete, and accessibility gaps remain. How do you proceed, who do you involve (PM, security, legal, client), what trade-offs do you make, and how do you communicate risk and mitigation?”

Evaluation rubric (anchored to AECOM expectations):
- 5 — Exemplary: Clear STAR narratives with cross-disciplinary influence; quantifies outcomes; anticipates compliance/safety/accessibility; shows inclusive design thinking; strong stakeholder alignment in complex, regulated contexts; communicates trade-offs and risk transparently.
- 4 — Strong: Solid stories with evidence of impact and collaboration; minor gaps in metrics or governance details; good grasp of compliance and change management.
- 3 — Competent: Relevant experiences and teamwork; limited evidence of measurable outcomes or depth in regulated/public-sector dynamics.
- 2 — Developing: Vague answers; limited stakeholder or matrix-collaboration examples; light on risk/compliance thinking.
- 1 — Weak: Unstructured, anecdotal responses; overlooks safety/ethics/compliance; minimal collaboration or measurable impact.

Interviewer probes aligned to AECOM’s environment:
- “How did you document decisions for auditability and handover to PMO/engineering?”
- “What change-management tactics drove adoption (training, champions, job aids)?”
- “How did you handle conflicts between client ‘must-haves’ and accessibility/security standards?”
- “Which metrics mattered to public-sector clients vs. internal engineering teams, and how did you report them?”
- “Describe how your design system supported consistency across global teams and disciplines.”
60 minutes
Practice with our AI-powered interview system to improve your skills.
About This Interview
Interview Type
BEHAVIORAL
Difficulty Level
3/5
Interview Tips
• Research the company thoroughly
• Practice common questions
• Prepare your STAR method responses
• Dress appropriately for the role