wcgos / people-culture-analytics
Module 11

People and Culture Analytics

Position in the System

Module 11 is the workforce intelligence layer of the operating system. It applies the same measurement discipline to people that Module 3 applies to operations, Module 5 applies to clients, and Module 8 applies to capital.

Module 11 receives its primary upstream signal from Module 1's Leadership DNA Radar. Leadership behavioral scores become the benchmark against which employee engagement data is interpreted. If leaders score red on psychological safety, Module 11 adds psychological safety questions to pulse surveys and increases DEI scorecard weight from 20% to 40%. The logic: if leaders are not creating safe environments, the system needs more data on the impact, not less.

Module 11 also receives data from Module 10 (Change Enablement Sprint). Adoption rates, time-to-competency, and process compliance metrics from every change initiative feed into the workforce health dashboard. Repeated adoption failures in a specific team surface a management signal, not a training signal.

Downstream, Module 11's turnover risk data feeds into Module 8 (Agile Capital Allocation) for workforce planning tied to project funding. Module 11's engagement data feeds back into Module 1 during quarterly recalibration, where it becomes a data point in the next heat-map assessment.

Module 11's turnover risk model is also an AI deployment governed by Module 7 (AI Deployment Canvas), with quarterly bias audits to ensure that risk scores reflect genuine turnover risk rather than demographic patterns.

A standalone people analytics program measures workforce health. Module 11 measures workforce health and routes the data into capital planning, leadership diagnostics, change management, and AI governance.

Why People Analytics Underperforms

Most companies manage their most expensive asset with less data rigor than they apply to inventory or marketing spend.

Pattern 1: Annual surveys arrive too late. The engagement survey runs in October. Results are compiled by December. Action plans form in January. The employee who was disengaged in October left in November. The survey detected the problem two months after the person was gone.

Pattern 2: DEI as checkbox. The company tracks representation numbers and publishes them annually. The numbers do not change because they are not connected to accountability. Nobody's performance evaluation includes DEI metrics. Nobody's budget is affected by DEI outcomes. The tracking is performative.

Pattern 3: Reactive retention. The company learns about turnover risk during the exit interview. By then, the decision is made. The counter-offer comes too late or addresses the wrong issue. The company spends months replacing someone it could have retained with a modest intervention three months earlier.

Module 11 addresses all three patterns through pulse surveys (real-time, not annual), DEI scorecards tied to leadership accountability (consequential, not performative), and turnover risk models that identify at-risk employees before they decide to leave (proactive, not reactive).

Engagement Pulse Surveys

What they do

Short, frequent surveys (weekly or bi-weekly) with 3 to 5 questions that measure employee engagement in real time. They replace the annual engagement survey that is outdated by the time results are compiled.

Five dimensions measured

Manager effectiveness. Workload sustainability. Growth and development satisfaction. Psychological safety. Alignment with company direction.

Response rate target

Above 80%. Below this threshold, results are unreliable and action based on them risks misdiagnosis. If response rates drop below 80%, Module 10 (Change Enablement Sprint) activates a survey adoption initiative: communicate why the survey matters, show what changed because of previous feedback, and reduce friction in the response process.
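The gate described above can be sketched as a simple check. This is an illustrative sketch, not the production logic; the function name and return values are assumptions.

```python
# Hypothetical sketch: gate pulse-survey analysis on the 80% response-rate
# threshold described above. Names and return values are illustrative.

RESPONSE_RATE_THRESHOLD = 0.80

def review_pulse_cycle(responses_received: int, employees_surveyed: int) -> str:
    """Return the next action for a pulse-survey cycle."""
    rate = responses_received / employees_surveyed
    if rate >= RESPONSE_RATE_THRESHOLD:
        return "analyze"                    # results are reliable enough to act on
    return "module_10_adoption_sprint"      # activate the survey adoption initiative

print(review_pulse_cycle(41, 50))   # 82% -> analyze
print(review_pulse_cycle(35, 50))   # 70% -> adoption sprint
```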

Action cadence

Results reviewed by team leads within 48 hours. Trends surfaced to leadership monthly. This speed is the point. A weekly pulse with a 48-hour review cycle means that a team experiencing a sudden engagement drop is identified and supported within days, not months.

How Module 1 calibrates the surveys

When Module 1's Leadership DNA Radar shows red on psychological safety, Module 11 adds psychological safety questions to every pulse survey cycle. The standard rotation might include psychological safety once per month. Under a red score, it appears every week until the leadership team demonstrates improvement.

When Leadership DNA shows red on inclusive decision-making, Module 11 increases DEI scorecard weight and adds inclusion-specific questions to the pulse. The system increases measurement intensity in the areas where the diagnostic identified risk.
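The rotation rule above can be sketched as follows. The weekly cycle numbering and the monthly fallback cadence are assumptions for illustration.

```python
# Hypothetical sketch of the calibration rule described above: under a red
# psychological-safety score, the question appears in every weekly pulse;
# otherwise it follows the standard roughly-monthly rotation. The cycle
# encoding (weekly cycles, 0-indexed) is an assumption.

def include_psych_safety_question(cycle: int, leadership_score: str) -> bool:
    """Decide whether this pulse cycle includes psychological safety questions."""
    if leadership_score == "red":
        return True              # every weekly pulse until leadership improves
    return cycle % 4 == 0        # standard rotation: about once per month

print(include_psych_safety_question(1, "red"))    # True
print(include_psych_safety_question(1, "green"))  # False
```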

Turnover Risk Models

What they do

Predictive models that forecast employee turnover by analyzing multiple input variables and producing a risk score per employee (Green, Amber, or Red), updated monthly.

Input variables

Tenure and time since last promotion. Engagement pulse scores (trending direction, not just current level). Manager change frequency (multiple manager changes in a short period are a risk signal). Compensation relative to market benchmarks. Performance review trajectory (improving, stable, or declining). PTO usage patterns (sudden changes can signal disengagement or burnout).

Intervention triggers

Amber: Manager 1:1 within two weeks. Development conversation focusing on career trajectory and satisfaction. The goal is to understand and address the concern before it becomes a decision to leave.

Red: Skip-level conversation with the manager's manager. Retention package review. Role adjustment exploration. The goal is to demonstrate that the organization values the individual and is willing to invest in retaining them.
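The tiered routing above can be sketched as a mapping from risk score to interventions. The score scale, tier thresholds, and intervention identifiers are illustrative assumptions, not the production model.

```python
# Hypothetical sketch of the tiered intervention routing described above.
# Thresholds and intervention names are illustrative assumptions.

def risk_tier(score: float) -> str:
    """Map a 0-1 turnover risk score to a traffic-light tier (assumed scale)."""
    if score >= 0.7:
        return "red"
    if score >= 0.4:
        return "amber"
    return "green"

INTERVENTIONS = {
    "green": [],
    "amber": ["manager_1on1_within_2_weeks", "career_development_conversation"],
    "red": ["skip_level_conversation", "retention_package_review",
            "role_adjustment_exploration"],
}

def route(score: float) -> list[str]:
    """Return the intervention list for an employee's current risk score."""
    return INTERVENTIONS[risk_tier(score)]

print(route(0.82))   # red-tier interventions
```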

Validation

Compare predicted risk against actual departures quarterly. This tunes the model over time. If the model flags 50 employees as red and only 5 actually leave, the model is too sensitive. If 30 of 50 leave, the model is well-calibrated. This validation cycle is part of Module 7's quarterly audit framework because the turnover risk model is an AI deployment.
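The calibration check above reduces to a precision calculation on red-flagged employees. The verdict thresholds below are assumptions chosen to match the two scenarios in the text.

```python
# Hypothetical sketch of the quarterly calibration check described above:
# what fraction of red-flagged employees actually departed? The verdict
# cutoffs are illustrative assumptions.

def red_flag_precision(flagged_red: int, departed_of_flagged: int) -> float:
    """Share of red-flagged employees who actually left in the quarter."""
    return departed_of_flagged / flagged_red

def calibration_verdict(precision: float) -> str:
    if precision < 0.2:
        return "too_sensitive"      # e.g. 5 of 50 flagged actually left
    if precision >= 0.5:
        return "well_calibrated"    # e.g. 30 of 50 flagged actually left
    return "acceptable"

print(calibration_verdict(red_flag_precision(50, 5)))    # too_sensitive
print(calibration_verdict(red_flag_precision(50, 30)))   # well_calibrated
```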

Connection to Module 7

The turnover risk model uses the same bias audit methodology as Module 7's quarterly AI audit. The bias review splits risk scores by demographic group, tenure band, and department to ensure the model is identifying actual risk factors and not proxying for demographics. If bias is detected, the model returns to Module 7's Stage 2 (MVP Pilot) for retraining.
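The demographic split above can be sketched as a comparison of red-flag rates across groups. The tolerance value and data shape are illustrative assumptions; a real audit would also test statistical significance.

```python
# Hypothetical sketch of the bias split described above: compare red-flag
# rates across groups and escalate to Module 7 Stage 2 retraining if the
# gap exceeds a tolerance. The tolerance is an assumption.

from collections import defaultdict

def red_rate_by_group(records):
    """records: iterable of (group, tier) pairs; returns red rate per group."""
    totals, reds = defaultdict(int), defaultdict(int)
    for group, tier in records:
        totals[group] += 1
        if tier == "red":
            reds[group] += 1
    return {g: reds[g] / totals[g] for g in totals}

def bias_detected(rates, tolerance=0.10):
    """Flag the model if red-flag rates diverge too far between groups."""
    return max(rates.values()) - min(rates.values()) > tolerance

rates = red_rate_by_group([("A", "red"), ("A", "green"), ("A", "red"),
                           ("B", "green"), ("B", "green"), ("B", "green")])
print(rates, bias_detected(rates))   # group A flagged far more often -> True
```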

This is a governance connection that standalone people analytics tools do not have. Most HR analytics platforms deploy a turnover model and update it annually at best. Module 11's model is subject to the same quarterly audit rigor as every other AI deployment in the operating system.

DEI Scorecards

What they track

Representation by level (entry, mid, senior, executive) across demographic dimensions. Hiring pipeline diversity at each funnel stage. Promotion velocity by demographic group. Pay equity ratios. Inclusion index from pulse surveys.

Reporting cadence

Monthly dashboard update. Quarterly board report. The cadence matters because DEI metrics that update annually create no urgency. Monthly updates create visibility. Quarterly board reports create accountability.

Accountability mechanism

Each department head owns their team's DEI scorecard metrics. When Module 1's Leadership DNA shows red on inclusive decision-making, Module 11 increases the DEI scorecard weight in the leadership team's performance evaluation from 20% to 40%. This connects the behavioral diagnostic to measurable consequences.
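The escalation rule above is a single conditional. The 20% and 40% figures come from the text; the radar-score encoding is an assumption.

```python
# Hypothetical sketch of the weight escalation described above. The
# traffic-light encoding of the Leadership DNA score is an assumption.

def dei_scorecard_weight(inclusive_decision_making_score: str) -> float:
    """Weight of DEI metrics in the leadership team's performance evaluation."""
    return 0.40 if inclusive_decision_making_score == "red" else 0.20

print(dei_scorecard_weight("red"))    # 0.4
print(dei_scorecard_weight("green"))  # 0.2
```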

Benchmarking

Compare against industry benchmarks and year-over-year internal progress. Both comparisons matter. Industry benchmarks show where the company stands relative to peers. Year-over-year trends show whether interventions are working.

Live People Health Dashboard

What it shows

The People Health Dashboard aggregates all Module 11 data into a single view, consistent with the traffic-light system used across every VWCG OS module.

Dashboard panels: Headcount and open positions by department. Engagement pulse trend lines. Turnover risk heat map by team. DEI scorecard summary. Time-to-fill for open roles. Training completion and competency scores (from Module 10). Absenteeism and PTO utilization trends.

Access levels

HR leadership sees the full dashboard. Managers see their team's data. Executives see the rollup view. Privacy is maintained by showing aggregate data at every level; only an employee's direct manager can see individual risk scores, and only for their own direct reports.

Alert automation

The dashboard uses Module 3's traffic-light thresholds and variance alert logic. When an engagement pulse drops below the amber threshold for a specific team, the alert routes to the manager and HR business partner. When turnover risk flags a red employee, the skip-level conversation protocol activates.
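The routing above can be sketched with traffic-light thresholds in the style of Module 3. The pulse-score scale, threshold values, and recipient identifiers are illustrative assumptions.

```python
# Hypothetical sketch of the alert routing described above. The 1-5 pulse
# scale, thresholds, and recipient naming scheme are assumptions.

AMBER_THRESHOLD = 3.5   # pulse score below this routes to manager + HRBP
RED_THRESHOLD = 2.5     # below this, HR leadership is also alerted

def route_pulse_alert(team: str, pulse_score: float) -> list[str]:
    """Return the alert recipients for a team's latest pulse score."""
    if pulse_score < RED_THRESHOLD:
        return [f"manager:{team}", f"hrbp:{team}", "hr_leadership"]
    if pulse_score < AMBER_THRESHOLD:
        return [f"manager:{team}", f"hrbp:{team}"]
    return []   # green: no alert

print(route_pulse_alert("support", 3.1))   # amber: manager + HRBP
```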

Connection to Module 8

Workforce planning data from the People Health Dashboard feeds directly into Module 8's Capital Governance Forum. If a funded project requires hiring and Module 11's time-to-fill data shows the hiring pipeline is slow, Module 8 adjusts the project timeline rather than pretending the headcount will materialize on schedule. This connection prevents the common failure of project plans that assume instant hiring in a competitive market.

What Makes This Different from Standard People Analytics

Standard people analytics platforms (Visier, Lattice, Culture Amp, Workday People Analytics) provide excellent data collection, visualization, and reporting. They are designed as standalone HR tools with their own dashboards, their own surveys, and their own analytics engines.

Module 11 is integrated into the operating system in four ways that standalone platforms are not.

First, the engagement and behavioral baselines come from Module 1's diagnostic, not from a separate HR assessment. The Leadership DNA Radar tells Module 11 where to focus measurement before a single pulse survey is sent.

Second, the turnover risk model is governed by Module 7's AI framework, including quarterly bias audits and a staged deployment pipeline. Most people analytics platforms deploy their own models with their own update schedules. Module 11's model follows the same governance rigor as every other AI deployment in the system.

Third, adoption and change data flows in from Module 10. Workforce health is not just engagement and retention. It includes the organization's ability to absorb change, measured through adoption rates and process compliance across every Module 10 sprint.

Fourth, the people data feeds into Module 8's capital planning and Module 1's quarterly recalibration. Workforce health is not an HR metric. It is an operating system input that affects funding decisions and diagnostic routing.

Who This Module Is For

Module 11 was designed for mid-market companies that acknowledge people are their most important asset but manage them with less rigor than they manage inventory, marketing spend, or sales pipeline.

These companies have HR teams. They may have engagement surveys. What they lack is the connection between people data and the rest of the business: capital allocation, change management, leadership development, and AI governance. Module 11 creates those connections.

See How the VWCG OS Connects Diagnostics to Execution
Request a Working Session