Python Pipeline Tools
A production-grade Python toolkit — Scene Validator & Asset Publisher — demonstrating the TD fundamentals studios care about: validation, versioning, USD authoring, and data-driven publish records.
- Core logic runs without Maya. PySide6 and usd-core are optional — both degrade gracefully.
// 01 — Overview
Two tools.
One cohesive pipeline.
Both tools are built to mirror real production workflows at VFX and games studios. They run fully standalone for demo purposes and connect seamlessly into Maya via a one-click shelf installer. A hiring manager can open the code and immediately recognise familiar patterns.
Scene Validator
scene_validator/scene_validator.py
Validates Maya scenes against configurable studio conventions before handoff. Runs checks on a background QThread — the UI never freezes on heavy scenes.
- Naming convention enforcement via configurable regex
- Texture file path integrity — all file nodes verified on disk
- Scale consistency — catches non-frozen transforms
- Missing texture warnings on shader materials
- Heavy geometry detection with configurable poly limit
- Auto-fix system — freeze scale, remap paths via file dialog
- JSON report export with SHA-256 checksums per file
- Live log panel + rotating file in ~/pipeline_logs/
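The naming check described above keys a regex per node type. A minimal sketch of how such a suffix-keyed map might drive the check — the rule names and patterns here are illustrative, not the shipped config:

```python
import re

# Illustrative regex map — the real rules live at the top of
# scene_validator.py and are meant to be edited per studio.
NAMING_RULES = {
    "mesh":      re.compile(r"^[A-Z][A-Za-z0-9]*_GEO$"),
    "joint":     re.compile(r"^[A-Z][A-Za-z0-9]*_JNT$"),
    "transform": re.compile(r"^[A-Z][A-Za-z0-9]*_(GRP|CTRL)$"),
}

def check_name(node_type: str, name: str) -> bool:
    """Return True when the node name matches its type's pattern."""
    rule = NAMING_RULES.get(node_type)
    return bool(rule.match(name)) if rule else True  # unlisted types pass

print(check_name("mesh", "HeroSword_GEO"))  # PascalCase + _GEO suffix → passes
print(check_name("mesh", "hero_sword"))     # lowercase, no suffix → flagged
```

Keeping the map at module top means a studio can swap conventions without touching the check logic.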
Asset Publisher
asset_publisher/asset_publisher.py
Handles the full publish pipeline: versioning, folder creation, checksumming, USD layer generation, and optional Git commits — all from a single UI.
- Auto-versioning — scans existing versions, increments v001 → v002
- Structured folder hierarchy — type / name / dept / version
- JSON manifest — timestamps, artist, notes, tags, file hashes
- USD payload layer — full Usd.Stage or valid .usda stub without usd-core
- Git integration — auto commit + semantic version tag per publish
- Drag-and-drop file list with Maya scene injection button
- Threaded publish — UI stays live during copy and hash operations
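The auto-versioning and semantic-tag steps above can be sketched as two small pure functions — a hypothetical rendering of the logic, not the exact code in asset_publisher.py:

```python
import re
from pathlib import Path

def next_version(asset_dir: Path) -> str:
    """Scan existing vNNN folders and return the next version string."""
    versions = [
        int(m.group(1))
        for d in (asset_dir.iterdir() if asset_dir.is_dir() else [])
        if d.is_dir() and (m := re.fullmatch(r"v(\d{3})", d.name))
    ]
    return f"v{max(versions, default=0) + 1:03d}"  # no versions yet → v001

def git_tag_name(asset: str, dept: str, version: str) -> str:
    """Semantic tag per publish, e.g. HeroCharacter-model-v003."""
    return f"{asset}-{dept}-{version}"
```

Scanning the folder names on disk (rather than trusting a counter) keeps versioning correct even when publishes are deleted or copied in by hand.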
// 02 — Checks
Scene Validator — Check Reference
All thresholds and patterns are configurable at the top of the file — no hardcoded studio-specific values.
| Check | What It Catches | Severity | Auto-Fix? |
|---|---|---|---|
| Naming Convention | Objects not matching PascalCase_GEO / _CTRL / _JNT / _GRP / _MAT — enforced per node type via regex map | Error | — human rename required |
| File Path | Missing or empty .fileTextureName attribute on every file node in the scene | Error | ✦ file dialog remap |
| Scale Consistency | Scale ≠ (1.0, 1.0, 1.0) within configurable tolerance — non-frozen transforms break downstream rigs and USD exports | Error | ✦ makeIdentity(scale=True) |
| Missing Textures | Shader materials with no upstream texture node connections — catches placeholder shaders before lighting handoff | Warning | — manual review |
| Heavy Geometry | Meshes exceeding configurable face-count limit (default 50,000) — surfaces over-dense crowd assets and accidental high-detail imports | Warning | — artist decision |
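The configurability claim above boils down to module-level constants feeding pure predicates. A minimal sketch under assumed names — the actual constants sit at the top of scene_validator.py:

```python
# Illustrative tunables — edited per studio, never hardcoded in check logic.
SCALE_TOLERANCE = 1e-4   # how far from (1, 1, 1) counts as non-frozen
MAX_FACE_COUNT = 50_000  # faces per mesh before Heavy Geometry warns

def is_scale_frozen(sx: float, sy: float, sz: float,
                    tol: float = SCALE_TOLERANCE) -> bool:
    """True when all three scale components sit within tolerance of 1.0."""
    return all(abs(v - 1.0) <= tol for v in (sx, sy, sz))

def is_heavy(face_count: int, limit: int = MAX_FACE_COUNT) -> bool:
    """True when a mesh exceeds the configurable face budget."""
    return face_count > limit

print(is_scale_frozen(1.0, 1.0, 1.0))   # frozen → passes
print(is_scale_frozen(1.0, 2.5, 1.0))   # scaled in Y → flagged
```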
// 03 — Code
Key implementation patterns
```python
# ValidationResult carries an optional fix_fn callable.
# When present, the UI enables per-row and bulk Auto-Fix buttons.
class ValidationResult:
    PASS = "PASS"
    WARNING = "WARNING"
    ERROR = "ERROR"

    def __init__(self, check, node, status, message, fix_fn=None):
        self.check = check
        self.node = node
        self.status = status
        self.message = message
        self.fix_fn = fix_fn  # callable → activates Auto-Fix in UI

# Scale check — flags non-frozen transforms, attaches one-click fix
def _check_scale_consistency(self):
    for mesh in cmds.ls(type="mesh", long=True) or []:
        transform = cmds.listRelatives(mesh, parent=True, fullPath=True)[0]
        sx, sy, sz = (cmds.getAttr(f"{transform}.scale{ax}") for ax in "XYZ")
        if any(abs(v - 1.0) > SCALE_TOLERANCE for v in (sx, sy, sz)):
            self.results.append(ValidationResult(
                check="Scale Consistency",
                node=transform,
                status=ValidationResult.ERROR,
                message=f"Non-frozen scale: ({sx:.3f}, {sy:.3f}, {sz:.3f})",
                # lambda captures transform at definition time
                fix_fn=lambda n=transform: cmds.makeIdentity(n, apply=True, scale=True),
            ))
```
```python
import hashlib
from datetime import datetime
from pathlib import Path

def _sha256(filepath: str) -> str:
    h = hashlib.sha256()
    with open(filepath, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Manifest.write() — produces a versioned JSON record for every publish
self.data = {
    "schema_version": "1.0",
    "asset_name": asset_name,
    "version": version,  # e.g. "v003"
    "published_by": artist,
    "published_at": datetime.utcnow().isoformat() + "Z",
    "tags": tags,
    "files": [{
        "filename": p.name,
        "size_bytes": p.stat().st_size,
        "sha256": _sha256(src),  # integrity check on every file
        "original_path": str(src),
    } for src in source_files if (p := Path(src)).exists()],
    "status": "published",
}
```
```python
def write_usd_stub(publish_path: Path, asset_name: str, geo_files: list[str]):
    if USD_AVAILABLE:  # full OpenUSD path
        stage = Usd.Stage.CreateNew(str(publish_path / "payload.usda"))
        root = UsdGeom.Xform.Define(stage, f"/{asset_name}")
        Usd.ModelAPI(root).SetAssetName(asset_name)
        for i, gf in enumerate(geo_files):
            stage.DefinePrim(f"/{asset_name}/geo_{i:02d}").GetReferences().AddReference(gf)
        stage.GetRootLayer().Save()
    else:  # plain-text stub — valid USD syntax
        stub = f"""#usda 1.0

def Xform "{asset_name}" (
    kind = "component"
)
{{
    def Scope "geo" {{}}
}}
"""
        (publish_path / "payload.usda").write_text(stub)
```
Published Folder Output
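A minimal sketch of the type / name / dept / version hierarchy the publisher lays down — the helper name and example paths here are illustrative, not the publisher's actual API:

```python
from pathlib import Path

def publish_dir(root: Path, asset_type: str, name: str,
                dept: str, version: str) -> Path:
    """Build and create the structured publish path: type / name / dept / version."""
    target = root / asset_type / name / dept / version
    target.mkdir(parents=True, exist_ok=True)
    return target

# e.g. <root>/character/HeroCharacter/model/v003/
#          ├── payload.usda    ← USD layer (full stage or text stub)
#          ├── manifest.json   ← versioned publish record with file hashes
#          └── <published files>
```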
// 04 — Install
Getting started
Clone the repository
Copy the pipeline_tools folder to your scripts directory or a shared location on your pipeline NFS.
```shell
git clone https://github.com/AlyBarr/pipeline-tools
cd pipeline-tools
```
Install optional dependencies
PySide6 powers the UI. usd-core enables full USD stage authoring. Both fall back gracefully if absent — Maya 2022+ already ships with PySide2.
```shell
pip install PySide6   # UI — skip if using Maya 2022+
pip install usd-core  # full USD authoring (optional)
```
Run standalone — no Maya required
Both tools launch in demo mode with realistic mock data when Maya is not present. Use this to evaluate the UI and report output.
```shell
python scene_validator/scene_validator.py
python asset_publisher/asset_publisher.py
```
Maya — install to shelf
Paste the shelf installer into Maya's Script Editor (Python tab) once. Update the TOOLS_DIR path at the top first.
```python
# Update TOOLS_DIR in maya_shelf_installer.py first
exec(open("/path/to/pipeline_tools/maya_shelf_installer.py").read())
```
// 05 — Skills
Built to signal production readiness
Every design decision mirrors conventions at major VFX and games studios. A senior TD or hiring manager should recognise familiar patterns immediately.
- Maya API — cmds, listRelatives, getAttr, polyEvaluate, makeIdentity, and material graph traversal via listConnections.
- PySide6 UI — QMainWindow, QTreeWidget, QFormLayout, QFileDialog, QPlainTextEdit log panel, custom QTreeWidgetItem subclassing, full dark stylesheet theming.
- Threading — background QThread workers with Signal / Slot communication back to the main UI thread — UI never freezes regardless of scene size.
- USD authoring — Usd.Stage creation, UsdGeom.Xform, Usd.ModelAPI asset metadata, prim-level reference authoring, and layer saving — plus a valid plain-text .usda stub fallback when usd-core is not installed.
- Git integration — git init / add / commit / tag with semantic version strings per publish — e.g. HeroCharacter-model-v003. Designed to bolt onto an existing studio Git workflow.
- Manifest schema — schema_version field so downstream tools can evolve the schema without breaking readers. Fields cover identity, provenance, file integrity, and publish status.
- Graceful degradation — try / except ImportError with boolean flags. The tools always run — demo mode with mock data if Maya is absent, text-stub USD if usd-core is missing.
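The try / except ImportError pattern above can be sketched as follows — a minimal example of guarded imports with availability flags, mirroring (not copying) how the tools gate optional dependencies:

```python
# Import each optional dependency if present, record availability,
# and branch on the flag later instead of crashing at import time.
try:
    from pxr import Usd, UsdGeom  # provided by usd-core
    USD_AVAILABLE = True
except ImportError:
    Usd = UsdGeom = None
    USD_AVAILABLE = False  # fall back to the plain-text .usda stub

try:
    from maya import cmds  # only present inside a Maya session
    MAYA_AVAILABLE = True
except ImportError:
    cmds = None
    MAYA_AVAILABLE = False  # demo mode: mock data instead of a live scene
```

Module-level flags keep every later call site to a simple `if USD_AVAILABLE:` branch, which is why the tools run identically inside and outside Maya.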