The Vendor Portal Trap — Why Equipment Data Still Flows Through Spreadsheets, Emails, and Disconnected Silos
The EPC industry has a $400M coordination problem hidden in plain sight. Every major construction project relies on dozens of equipment vendors to deliver engineered packages — and the data exchange between contractors and vendors hasn't fundamentally changed in twenty years.
The Invisible Cost of Equipment Data Exchange
On a typical $400M FPSO construction contract, equipment procurement accounts for 40–55% of total project cost. That's $160M–$220M flowing through purchase orders to equipment vendors — pressure vessels, heat exchangers, pumps, compressors, valve packages, instrument systems. Each vendor generates hundreds of deliverables: datasheets, GA drawings, stress calculations, 3D models, material certificates, test reports.
The data inside those deliverables is precise, machine-readable, and rich. A DEXPI file from a P&ID contains every equipment tag, nozzle, instrument connection, and process parameter. A STEP model contains exact geometry. A PCF file contains every weld joint, flange, and material specification.
Then someone opens Excel.
A vendor engineer manually transcribes their design data into the contractor's proprietary spreadsheet template. Field by field. Row by row. 40+ hours per month, per supplier, per project. Not because the data doesn't exist in a machine-readable format — but because nobody has built a system that reads it.
The contractor's IAM team then manually checks that spreadsheet against their engineering standards. Another 12–20 hours per week. They find errors, flag them by email, wait for corrections, re-check. The cycle repeats through 3–5 revision rounds per equipment tag.
Multiply this by 30–50 vendors on a single project. The hidden labour cost of equipment data exchange — across both sides — runs into millions of euros per year, on every project, across the entire industry.
Nobody tracks it. Nobody budgets for it. Everybody simply accepts it as "how things work."
Five Broken Patterns That Every Project Repeats
1. The Excel Template Maze
Every EPC contractor has their own equipment datasheet template. Shell's looks different from TechnipFMC's. SBM Offshore's format has cells in different locations from Saipem's. The same vendor — manufacturing the same separator — fills in a different spreadsheet for each contractor. Different column layouts. Different naming conventions. Different validation expectations.
The data is identical. The work is doubled. And the risk of transcription error is multiplied.
2. The Revision Mismatch
Engineering revisions are the hidden schedule killer in equipment data exchange. A vendor submits Rev A of their datasheet. The contractor reviews it, issues comments. The vendor updates the PDF document to Rev B — but forgets to update the Excel data to match. Or updates some fields but not others. Now the document says "Design Pressure: 125 barg" but the data cell still reads "115 barg."
VENDOR SUBMITS:
├── Datasheet PDF ──── Rev B ──── Design Pressure: 125 barg ✅
├── Excel Data ──── Rev A ──── Design Pressure: 115 barg ❌ STALE
├── GA Drawing ──── Rev B ──── Nozzle N3: 8" 600# ✅
└── STEP Model ──── Rev A ──── Nozzle N3: 6" 300# ❌ STALE
CONTRACTOR IAM TEAM:
"Which one is correct? The PDF or the Excel?"
"Let me email the vendor and ask."
"That was Tuesday. They haven't responded yet."
The root cause is always the same: the document and the data are maintained independently. When a vendor updates a drawing, the engineering change should flow automatically into the structured data. Today it doesn't. Manual re-entry creates manual mismatches.
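The stale-revision failure mode above is mechanically detectable: if every deliverable carried a machine-readable revision field, flagging mismatches would take a few lines of code rather than an email thread. A minimal sketch (deliverable names and revision letters are illustrative, not a real platform API):

```python
# Flag deliverables whose revision lags behind the latest revision
# submitted for the same tag -- the "stale Excel" failure mode.
def find_stale_deliverables(deliverables: dict[str, str]) -> list[str]:
    """deliverables maps deliverable name -> revision letter (e.g. 'A', 'B')."""
    latest = max(deliverables.values())  # revision letters sort alphabetically
    return [name for name, rev in deliverables.items() if rev < latest]

submission = {
    "Datasheet PDF": "B",
    "Excel Data": "A",   # stale: data never updated to match the PDF
    "GA Drawing": "B",
    "STEP Model": "A",   # stale: old geometry, wrong nozzle size
}
print(find_stale_deliverables(submission))  # -> ['Excel Data', 'STEP Model']
```

The check is trivial once the data is structured; the point is that today nobody can run it, because the revision letter lives inside a PDF title block.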
3. The Email Black Hole
Review comments, technical queries, discrepancy flags — they all travel by email. The IAM engineer sends 15 comments on a pressure vessel datasheet. The vendor responds to 12 of them. Three fall through. Six weeks later, during final data handover, those three unresolved comments surface again.
February: IAM → Vendor: "15 comments on Tag 21-V-1001 datasheet"
March: Vendor → IAM: "12 comments resolved, see attached Rev B"
April: (Silence — both sides think it's done)
July: IAM → Vendor: "Why is corrosion allowance still 3mm?
Your NACE MR0175 assessment says 6mm."
August: Vendor: "I never received that comment."
There is no shared issue register. No status tracking. No audit trail. Every unresolved comment is a potential quality non-conformance waiting to surface during commissioning.
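A shared issue register closes this gap: each comment becomes a tracked record with an owner and a status, so "12 of 15 resolved" is a query rather than a guess. A minimal sketch of the idea (field names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Comment:
    tag: str
    text: str
    owner: str
    resolved: bool = False

def open_comments(register: list[Comment], tag: str) -> list[Comment]:
    """The comments that 'fell through' are always one query away."""
    return [c for c in register if c.tag == tag and not c.resolved]

# 15 comments on one tag, of which the vendor resolved 12 -- as in the
# email exchange above.
register = [Comment("21-V-1001", f"Comment {i}", "vendor", resolved=(i <= 12))
            for i in range(1, 16)]
print(len(open_comments(register, "21-V-1001")))  # -> 3
```

With this record in place, the three unresolved comments surface in February, not in July.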
4. The Decoupled Data-Document Problem
Procurement teams track deliverables via a Vendor Document List (VDL) — a spreadsheet that records whether a document was received. Not whether the data inside is correct. The VDL says "Datasheet: Received ✅". But the data in that datasheet has three critical errors that nobody caught because nobody systematically validated it.
PROCUREMENT VIEW:
Tag 21-V-1001
├── Datasheet ✅ Received
├── GA Drawing ✅ Received
├── Stress Calc ❌ Pending
├── STEP Model ✅ Received
└── Test Report ❌ Pending
"We're 60% complete on this tag."

REALITY:
Tag 21-V-1001
├── Datasheet ✅ Received
│   └── 3 data errors (undetected)
├── GA Drawing ✅ Received
│   └── Nozzle mismatch vs datasheet
├── Stress Calc ❌ Pending
├── STEP Model ✅ Received
│   └── Wrong revision (Rev A, not B)
└── Test Report ❌ Pending
"We're 60% complete... but 40% of what we received is wrong."
Procurement closes purchase orders based on document count. Data quality is an afterthought — discovered months later, during system completion or (worse) during commissioning.
5. The Single-Direction Paradigm
Every existing "supplier portal" or data exchange tool follows the same architecture: it is owned by the contractor and configured for the contractor. The vendor is an invited guest, permitted to fill in forms and upload documents. The vendor gets no agency, no cross-project visibility, no tools to manage their own workload.
This is the fundamental flaw of the "supplier portal" model. It serves one direction only.
The Vendor Lock-In Problem Nobody Mentions
A typical equipment vendor — say, a pressure vessel manufacturer in Korea or a valve package supplier in Italy — works with 3–8 different EPC contractors simultaneously. Each contractor has their own:
- Data submission platform (or email, or SharePoint, or nothing)
- Excel template format (87 fields here, 112 fields there)
- Engineering standards (Shell DEPs vs. TechnipFMC GWPs vs. SBM internal)
- Naming conventions (tag format: 21-V-1001 vs. AAAA-NN-TT-NNNN)
- Review workflow (formal transmittal vs. email thread)
- Quality expectations (what's "critical" vs. "nice to have")
The vendor maintains five separate login portals, five different Excel formats, five sets of submission rules — for what is ultimately the same engineering data expressed in different packaging.
Every Monday morning, the vendor's document controller opens five different systems, downloads five different templates, populates them with data that already exists in the vendor's own design system, and uploads them back — often re-entering the same equipment data five times.
This is vendor lock-in by friction. Not because any single portal is technically superior — but because the switching cost of learning, configuring, and maintaining yet another contractor-specific system is too high. Vendors tolerate it because they have no alternative.
The question nobody asks: What if the vendor only had to manage their data once?
The Classification Chaos — Six Standards, Zero Alignment
Equipment classification should be the simplest part of the process. A centrifugal pump is a centrifugal pump. But in practice, the same piece of equipment is classified differently depending on which standard you reference, which contractor you're working for, and which part of the world the project sits in.
The Standards Landscape
| Standard | Classification | What It Requires |
|---|---|---|
| CFIHOS | Heat Exchanger → Shell & Tube | 47 mandatory data fields |
| UNSPSC | 40101500 (Heat exchangers) | Generic category code |
| ISO 14224 | Rotating/Static → HX → S&T | Reliability boundary definitions |
| EqHub Norm | Product class with NORSOK-aligned requirements | NCS-specific verification rules |
| Shell DEP | Custom engineering practice references | Shell-specific technical requirements |
| TEMA | Type designation (AES, BEM, etc.) | Design and fabrication standards |
Six different classification schemes. Six different field requirements. Six different validation expectations. For the same piece of equipment on the same project.
The result is that vendors don't know what "compliant" means anymore. Compliance is contractor-specific, project-specific, and sometimes even discipline-specific within the same project. There is no single source of truth for "what data do I need to provide for this equipment class?"
Disconnected Islands: Where Equipment Data Goes to Die
The equipment data lifecycle in a typical EPC project crosses at least six disconnected systems, none of which talk to each other.
Each system holds a fragment of the truth. The vendor's AVEVA model knows every nozzle size and location. The email thread contains the review comments. SharePoint stores the PDF. The data team maintains the Excel check. Procurement tracks whether something arrived. The 3D review happens in Navisworks.
Nobody has the complete picture.
When the commissioning team asks "give me the as-built data for Tag 21-V-1001," someone has to manually stitch together fragments from six systems, across three departments, covering 18 months of correspondence. That stitching exercise — for every tag, on every project — is the real cost of disconnection.
The Industry Registry Gap
Industry-funded platforms like EqHub (operated by Collabor8 / Offshore Norge) have made enormous progress in creating a centralized product registry — a single, verified source of vendor documentation with standard product IDs, certified verification workflows, and compliance with NORSOK, ISO 15926, and CFIHOS.
EqHub answers the question: "What equipment exists?" It provides standardized product data, documents, and drawings in a verified, traceable format. Over 100,000 products registered. Free for all users — funded by NCS operators.
But EqHub, by design, does not answer the project execution questions:
| What EqHub Solves | What EqHub Doesn't Solve |
|---|---|
| "Is this valve type registered?" | "Has the vendor submitted Rev B of the datasheet for Tag 21-V-1001 on Project Alpha?" |
| "What are the EqHub Norm requirements for this product class?" | "Does the submitted data comply with Contractor X's specific engineering standards?" |
| "What is the verified product data?" | "Is the equipment delivery scope 60% or 75% complete?" |
| "Is this product certified by a verified registrant?" | "What are the 12 open discrepancies on this equipment package?" |
EqHub is the library. What's missing is the project execution layer — the system that takes vendor deliverables, validates them against contractor-specific standards, tracks scope completion, and manages the issue resolution cycle. That layer sits between the vendor's design tools and the contractor's project controls. Konnect Equipment Hub is built to fill this gap — connecting to EqHub's product registry while providing the project-level automation that EqHub, by design, does not.
What a Vendor Actually Experiences Today
Let's follow a pressure vessel vendor — ABC Fabricators in Ulsan, Korea — through a single equipment data submission on a single project.
Week 1: Receive Requirements
ABC Fabricators receives a 200-page Scope of Work from SBM Offshore for three separation vessels. The SOW references:
- SBM Engineering Standards Rev 3
- NORSOK Z-CR-002 data requirements
- EqHub product registration requirement
- SBM's proprietary Excel datasheet template (v14.2)
- A vendor document list (VDL) specifying 14 deliverables per tag
Week 2–4: Design and Generate Data
ABC's design team works in AVEVA E3D and PV Elite. They produce:
- STEP 3D models (precise geometry, every nozzle, every support)
- Stress calculation reports (PDF + native files)
- GA drawings (AutoCAD DWG → PDF)
- Material data reports
- Test procedures
All of this data exists in machine-readable, structured formats inside the vendor's own tools.
Week 5: The Manual Transcription
Now comes the pain. ABC's document controller opens SBM's Excel template and begins manual data entry:
Cell B7: Tag Number → types "21-V-1001"
Cell B12: Design Pressure → types "125" (switches to Cell C12 for units "barg")
Cell B15: Design Temperature → types "120" (Cell C15: "°C")
Cell C22: Material Grade → types "SA516 Gr.70"
Cell D30: Corrosion Allowance → types "3.0" (Cell E30: "mm")
...
(86 fields, across 4 tabs, for EACH of 3 tags = 258 manual entries)
Every single data point already exists in the AVEVA model and PV Elite calculation. The document controller is re-typing information from one system into another because no integration exists.
Time spent: 3 days per tag × 3 tags = 9 working days.
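This is the transcription an ingest layer eliminates. A simplified sketch of the principle, using a stand-in for a DEXPI/Proteus XML export (real DEXPI files use a much richer schema than the snippet below; element names here are illustrative):

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a DEXPI/Proteus export. The real schema is far
# richer, but the principle holds: the data is already structured at source.
DEXPI_SNIPPET = """
<PlantModel>
  <Equipment TagName="21-V-1001">
    <GenericAttribute Name="DesignPressure" Value="125" Units="barg"/>
    <GenericAttribute Name="DesignTemperature" Value="120" Units="degC"/>
    <GenericAttribute Name="MaterialGrade" Value="SA516 Gr.70"/>
  </Equipment>
</PlantModel>
"""

def extract_tag_data(xml_text: str) -> dict[str, dict[str, str]]:
    """Pull every equipment tag and its attributes out of the XML."""
    root = ET.fromstring(xml_text)
    tags = {}
    for eq in root.iter("Equipment"):
        attrs = {a.get("Name"): a.get("Value")
                 for a in eq.iter("GenericAttribute")}
        tags[eq.get("TagName")] = attrs
    return tags

data = extract_tag_data(DEXPI_SNIPPET)
print(data["21-V-1001"]["DesignPressure"])  # -> 125
```

Nine working days of re-typing become a parse step that runs in milliseconds, and the extracted values are by construction identical to what the design tool produced.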
Week 6: Submit and Wait
The completed Excel files, PDFs, and STEP models are uploaded to SBM's platform (or emailed, or pushed to SharePoint). ABC opens a different browser tab and does the same for their TechnipFMC project. Different template. Different fields. Same equipment data.
Week 8: Comments Arrive
SBM's IAM team returns 22 comments across the three tags. Eleven are data quality issues that could have been caught automatically ("design temperature missing units," "nozzle orientation inconsistent with GA drawing," "material grade format non-standard"). Six are substantive engineering queries. Five are about document formatting.
ABC spends another 4 days resolving comments, updating the Excel, re-exporting PDFs, and re-submitting.
Week 12: Rev B
Engineering issues a design change. Nozzle N3 changes from 6" 300# to 8" 600#. ABC updates AVEVA, regenerates the STEP model, updates the GA drawing, recalculates stress — all within their native tools. Then manually updates 14 cells across 3 tabs in SBM's Excel template. Misses two cells. Those errors won't be caught until Week 20.
This cycle — design → transcribe → submit → review → comment → correct → re-submit — repeats 3 to 5 times per tag, on every project, for every vendor.
The Bi-Directional Model — A Hub, Not a Portal
The word "portal" reveals the problem. A portal has an owner and visitors. The contractor owns it, configures it, controls it. Vendors are guests. They enter through the contractor's door, fill in the contractor's forms, follow the contractor's rules, and leave.
A hub is different. A hub has participants. Both sides bring something. Both sides get value.
Use Case 1: Contractor → Vendors
The contractor sets their engineering standards, invites their equipment vendors, and receives validated data submissions. The platform automatically checks every submission against the contractor's standards library — field completeness, naming conventions, value ranges, cross-field dependencies. The IAM team reviews validation results, not raw data.
Use Case 2: Vendor → Multiple Contractors
The vendor uploads their native engineering files (DEXPI, STEP, PCF) once. The hub automatically extracts structured tag data, adapts it to each contractor's required format and standards, and flags compliance issues per contractor. The vendor manages all their projects — across SBM, Shell, TechnipFMC, McDermott — from a single workspace.
No existing platform serves Use Case 2. Every "supplier portal" in the market today is contractor-owned and contractor-configured. The vendor experience is an afterthought. Konnect Equipment Hub is designed from day one to serve both directions — giving vendors a single workspace while giving contractors automated validation against their own standards.
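The one-upload, many-formats flow of Use Case 2 reduces to a mapping layer between one canonical vendor record and each contractor's template. The template column names below are invented for illustration; real contractor templates differ:

```python
# One canonical record from the vendor, adapted per contractor.
# Column names below are hypothetical examples, not real template layouts.
VENDOR_DATA = {"tag": "21-V-1001", "design_pressure_barg": 125.0,
               "material": "SA516 Gr.70"}

CONTRACTOR_TEMPLATES = {
    "SBM":        {"Tag Number": "tag",
                   "Design Press. (barg)": "design_pressure_barg"},
    "TechnipFMC": {"TAG": "tag", "DES_PRESS": "design_pressure_barg",
                   "MATL_GRADE": "material"},
}

def adapt(data: dict, contractor: str) -> dict:
    """Render the same vendor data in one contractor's template layout."""
    mapping = CONTRACTOR_TEMPLATES[contractor]
    return {column: data[field] for column, field in mapping.items()}

print(adapt(VENDOR_DATA, "TechnipFMC"))
# -> {'TAG': '21-V-1001', 'DES_PRESS': 125.0, 'MATL_GRADE': 'SA516 Gr.70'}
```

The vendor maintains one record; the mapping tables, not the document controller, absorb each contractor's formatting quirks.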
From Digitization to Automation — The Real Shift
The current generation of equipment data platforms — ShareCat, Aconex, various SharePoint-based solutions — digitized the process. They moved the Excel template from a file share to a web form. The vendor now types into a browser instead of a desktop spreadsheet. The underlying work is unchanged: manual data entry from native files into contractor-defined templates.
Digitization is not automation.
Automation means:
Ingest — Read the vendor's native engineering file (DEXPI XML, STEP AP242, PCF) and extract structured tag data automatically. No manual transcription.
Validate — Run extracted data against the contractor's standards library — field completeness, value ranges, naming conventions, cross-field dependencies, unit consistency — in seconds, not days.
Track — Monitor scope completion at the deliverable level. Not "was the document received?" but "is the data inside correct, complete, and compliant?"
Resolve — Flag discrepancies as trackable issues with owner, deadline, and audit trail. No more email black holes.
Adapt — Apply different contractor standards to the same vendor data. The vendor uploads once; the platform validates against each contractor's rules independently.
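The Validate step can be pictured as a rule engine run over the extracted record. The rules below are a toy sketch; a real standards library would hold hundreds of rules per equipment class, each keyed to a clause in NORSOK, CFIHOS, or a contractor standard:

```python
import re

# Illustrative rule set -- not a real standards library.
RULES = [
    ("tag format matches NN-X-NNNN",
     lambda d: bool(re.fullmatch(r"\d{2}-[A-Z]-\d{4}", d.get("tag", "")))),
    ("design pressure present",
     lambda d: "design_pressure_barg" in d),
    ("corrosion allowance >= 3 mm",
     lambda d: d.get("corrosion_allowance_mm", 0) >= 3.0),
]

def validate(record: dict) -> list[str]:
    """Return the names of failed rules -- the IAM team reviews this list."""
    return [name for name, check in RULES if not check(record)]

record = {"tag": "21-V-1001", "design_pressure_barg": 125.0,
          "corrosion_allowance_mm": 1.5}
print(validate(record))  # -> ['corrosion allowance >= 3 mm']
```

The output is the deliverable: a short, referenced list of failures instead of a week of cell-by-cell checking.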
The ROI Is Measurable
| Metric | Manual (Today) | Automated (Alternative) |
|---|---|---|
| Vendor data entry per tag | 3 days | 15 minutes |
| IAM review per tag | 4–6 hours | 30 minutes (review results) |
| Transcription errors | 5–15 per tag | 0 (extracted from source) |
| Revision sync time | 2 days | Automatic |
| Comment resolution cycle | 2–4 weeks | 2–3 days (tracked issues) |
Per-supplier annual savings:
- Vendor side: 40 hrs/month × €75/hr = €36,000/year
- IAM side: 15 hrs/month × €100/hr = €18,000/year
- Combined: €54,000/year per supplier
- × 30 suppliers per project = €1.6M/year in labour savings alone (before rework reduction, NCR avoidance, and schedule gains)
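The arithmetic behind those figures is worth making explicit (rates and hours taken directly from the list above):

```python
# Reproduce the savings arithmetic from the text.
vendor_saving = 40 * 12 * 75       # 40 hrs/month x EUR 75/hr  -> EUR/year
iam_saving    = 15 * 12 * 100      # 15 hrs/month x EUR 100/hr -> EUR/year
per_supplier  = vendor_saving + iam_saving
project_total = per_supplier * 30  # 30 suppliers per project

print(vendor_saving, iam_saving, per_supplier, project_total)
# -> 36000 18000 54000 1620000
```

That is EUR 1.62M per project per year in direct labour alone, before counting rework, NCR avoidance, or schedule gains.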
What Changes for Your Project
For the IAM Engineer
Before: You spend 60% of your time checking Excel cells against engineering standards. You compare datasheet values against NORSOK Z-CR-002 requirements, cross-check tag naming against your contractor's conventions, verify that material grades match what the stress calculation specifies. You flag 15 issues per tag, write them in an email, send them to the vendor, and wait. Two weeks later, you do it again for Rev B.
After: The vendor uploads their DEXPI file. The platform extracts 86 data fields automatically, runs them against your standards library, and presents a validation report: 3 critical issues (missing MDMT value, corrosion allowance below minimum, nozzle schedule mismatch with GA). You review the report, confirm the flags, and the vendor sees them instantly with clear descriptions and references to the applicable standard clause. Resolution is tracked. You review results, not raw data.
For the Vendor Document Controller
Before: You maintain five different Excel templates for five different contractors. Every engineering change means manually updating cells in each template. You spend Monday mornings downloading templates, Tuesday through Thursday populating them, and Friday uploading and tracking submissions. If a revision slips through in one template but not another, the error surfaces weeks later as a formal NCR.
After: You upload your engineering files — the same DEXPI, STEP, and calculation files your design team already produces. The hub extracts data, formats it per each contractor's requirements, and shows you exactly what each contractor will see. When engineering issues a revision, you upload the updated file and the platform automatically detects what changed — "Nozzle N3: 6" 300# → 8" 600#, Design weight: 42,000 kg → 45,000 kg" — across all projects simultaneously.
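Automatic change detection of this kind is a field-level diff between the data extracted from consecutive revisions. A minimal sketch, using the Rev A to Rev B values from the example above:

```python
# Detect what changed between two revisions of extracted tag data.
def diff_revisions(old: dict, new: dict) -> dict:
    """Map each changed field to its (before, after) pair."""
    return {k: (old.get(k), new[k]) for k in new if old.get(k) != new[k]}

rev_a = {"Nozzle N3": '6" 300#', "Design weight (kg)": 42000}
rev_b = {"Nozzle N3": '8" 600#', "Design weight (kg)": 45000}

for field, (before, after) in diff_revisions(rev_a, rev_b).items():
    print(f"{field}: {before} -> {after}")
# -> Nozzle N3: 6" 300# -> 8" 600#
# -> Design weight (kg): 42000 -> 45000
```

Because the diff is computed, not remembered, the two cells the document controller missed in Week 12 cannot slip through.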
For the Procurement Lead
Before: Your VDL tells you that 43 of 72 deliverables are received. 60%. But it doesn't tell you that 8 of those 43 contain data errors, 3 have revision mismatches, and 2 reference obsolete engineering standards. The "60% complete" number is a document count, not a data quality measure. You close the PO milestone based on quantity. The quality problems become someone else's problem.
After: Scope completion is measured at three levels: documents received, data validated, issues resolved. Your dashboard shows: 43 documents received (60%), 35 data-validated (49%), 31 issues-resolved (43%). The gap between "received" and "validated" is your data quality debt — visible, quantified, and actionable before you sign off on the PO milestone.
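The three-level completion numbers in that dashboard follow directly from the counts (72 deliverables total; 43 received, 35 validated, 31 resolved):

```python
# Three-level completion from the dashboard example above.
def completion(received: int, validated: int, resolved: int, total: int) -> dict:
    pct = lambda n: round(100 * n / total)
    return {"received": pct(received), "validated": pct(validated),
            "resolved": pct(resolved),
            "quality_debt": received - validated}  # received but not yet validated

print(completion(43, 35, 31, 72))
# -> {'received': 60, 'validated': 49, 'resolved': 43, 'quality_debt': 8}
```

The `quality_debt` figure, here 8 deliverables, is exactly the gap that a document-count VDL hides.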
The Open Exchange Vision
The long-term future isn't just better portals. It's an ecosystem where equipment data flows as freely as financial data flows through banking APIs.
How Standards Align
Instead of each contractor reinventing their own classification scheme, the platform integrates with established industry standards:
| Layer | Source | Role |
|---|---|---|
| Product identity | EqHub product IDs (TEK/SPC) | "This is the same valve in every project" |
| Data requirements | EqHub Norm + CFIHOS + NORSOK Z-CR-002 | "These fields are required for this equipment class" |
| Classification | UNSPSC + ISO 14224 | "This is a centrifugal pump, category 40141602" |
| Contractor rules | Per-contractor standards library | "SBM also requires field X with format Y" |
Vendors register their products once in the industry registry. Project-specific data — tag numbers, process conditions, site-specific requirements — lives in the execution platform. The two connect via API, not via manual cross-referencing.
Network Effects
The most powerful aspect of a hub model is the network effect. When Contractor A invites 10 vendors, those vendors are now on the platform. When Vendor X also works with Contractor B, they bring the platform with them — "We already use this for Contractor A." Contractor B evaluates it, sees the automation value, and adopts.
Year 1: 3 contractors × 15 vendors each = 45 vendor accounts
Year 2: 30% vendor overlap → 6 contractors × 25 vendors = 150 accounts
Year 3: Network gravity → new contractors join because their vendors already use it
The switching cost flips. Today, vendors are locked into each contractor's proprietary portal. In a hub model, the ecosystem becomes the standard — and the cost of NOT being on the platform grows with every new participant.
Built for the Supply Chain That Builds the World
The industries that build the world's most complex physical assets — FPSOs, offshore platforms, LNG terminals, refineries, chemical plants — depend on hundreds of equipment vendors delivering precise engineering data under extraordinary schedule pressure. The current model of spreadsheets, emails, and disconnected portals has persisted for decades — not because it works, but because nobody built the alternative.
The alternative requires three things that no single existing solution provides:
- Native file automation — read engineering files, don't re-type them
- Bi-directional value — serve both contractors AND vendors
- Standards integration — connect to industry registries, not replace them
The tools exist. The standards exist. The engineering files exist. The only thing missing is the platform that connects them — an open, neutral hub that turns fragmented equipment data exchange into an automated, validated, trackable process.
The question isn't whether this transformation will happen. It's whether your next project will still be running on spreadsheets when it does.
Built by engineers who've filled in five different Excel templates on Monday morning, tracked vendor comments across six email threads, and spent weeks stitching together equipment data from disconnected systems. We built the platform we wished we'd had — explore Konnect Equipment Hub.