Document Purpose: Complete guide to XBRL conformance suites, testing strategies, and quality assurance
Last Updated: January 2026
Target Audience: XBRL processor developers implementing testing frameworks
Conformance suites are comprehensive collections of test cases that validate whether an XBRL processor correctly implements the XBRL specifications. They are normative: passing all applicable tests is required to claim conformance with a specification.
Key Facts:
Your Goal: Update your 2014-2018 processor to pass current conformance suites for all supported specifications.
Definition:
A conformance suite is a normative collection of test cases that defines correct behavior for an XBRL processor implementation.
Components:
Normative Status:
Conformance suites are normative, meaning:
1. Specification Clarification
2. Quality Assurance
3. Interoperability
4. Certification
1. Reliability
2. Interoperability
3. Regulatory Compliance
Minimal Conformance
Focus: Syntactic validity
Does NOT require:
Use case: Simple validators, syntax checkers
Full Conformance
Focus: Semantic correctness
Use case: Complete XBRL processors, filing systems
Your Processor: Should aim for Full Conformance
Specification: XBRL 2.1 (2003)
Latest Version: Updated 2024
Test Count: ~500+ tests
Location: Member download area
Coverage:
Categories:
100 - Basic validity tests
200 - Schema tests
300 - Instance tests
400 - Linkbase tests
500 - DTS discovery tests
600 - Context tests
700 - Unit tests
800 - Fact tests
Example Test:
<!-- Test 301-01: Valid instance structure -->
<testcase id="301-01">
<name>Valid Instance Structure</name>
<description>
Instance with proper xbrli:xbrl root element,
valid context, valid unit, and valid fact.
</description>
<variation id="V-01">
<name>Basic valid instance</name>
<instance>301-01-valid.xml</instance>
<result>valid</result>
</variation>
</testcase>
Specification: XBRL Dimensions 1.0 (2006)
Latest Version: Updated 2024
Test Count: ~300+ tests
Coverage:
Categories:
100 - Dimension definitions
200 - Hypercube definitions
300 - Explicit dimensions
400 - Typed dimensions
500 - Relationship validation
600 - Instance validation
Specification: Formula 1.0 (multiple modules)
Latest Version: 2013 (REC)
Test Count: ~1000+ tests
Size: Large - multiple modules
Coverage:
Modules Tested:
Categories:
10000 - Variables
20000 - Filters
30000 - Value Assertions
40000 - Existence Assertions
50000 - Consistency Assertions
60000 - Formula Linkbase
70000 - Generic Linkbase
80000 - Function Registry
Example Test:
<!-- Test 30010: Value assertion - revenue must be positive -->
<testcase id="30010">
<name>Revenue Must Be Positive</name>
<variation id="V-01">
<instance>revenue-negative.xml</instance>
<result>
<error>assertion:assertionUnsatisfied</error>
</result>
</variation>
<variation id="V-02">
<instance>revenue-positive.xml</instance>
<result>valid</result>
</variation>
</testcase>
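The pass/fail decision behind a value assertion like this can be sketched in plain Java. This is a deliberate simplification: a real Formula engine compiles and evaluates the assertion's XPath test expression against fact variables, while here the "revenue must be positive" rule and the method name `checkRevenuePositive` are hypothetical and hard-coded for illustration.

```java
import java.math.BigDecimal;
import java.util.Optional;

// Hypothetical sketch of the satisfied/unsatisfied decision for a value
// assertion. A real Formula processor evaluates the assertion's XPath test;
// the hard-coded positivity rule below is illustrative only.
public class ValueAssertionSketch {

    // Returns the error code if the assertion is unsatisfied, empty if satisfied.
    static Optional<String> checkRevenuePositive(BigDecimal revenue) {
        if (revenue.signum() > 0) {
            return Optional.empty();                          // satisfied
        }
        return Optional.of("assertion:assertionUnsatisfied"); // unsatisfied
    }

    public static void main(String[] args) {
        // Variation V-01: negative revenue must trigger the error code
        System.out.println(checkRevenuePositive(new BigDecimal("-100")));
        // Variation V-02: positive revenue is valid
        System.out.println(checkRevenuePositive(new BigDecimal("250.5")));
    }
}
```

A harness comparing this outcome against the variation's expected `<error>` element reproduces the V-01/V-02 split shown above.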
Specification: Inline XBRL 1.1 (2013)
Test Count: ~200+ tests
Coverage:
Example Test:
<!-- Test: Inline fact with transformation -->
<testcase id="inline-transform-01">
<name>Numeric Transformation</name>
<variation id="V-01">
<instance>transform-comma.html</instance>
<expected-value>1234567.89</expected-value>
</variation>
</testcase>
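The transformation this test expects (stripping comma thousands separators so "1,234,567.89" becomes "1234567.89") can be approximated as below. This is a sketch, not the registry-defined rule: a real processor must apply the exact transformation from the ixt Transformation Registry named by the inline fact's format attribute.

```java
// Illustrative approximation of an Inline XBRL numeric transformation that
// removes comma thousands separators. Real processors must apply the exact
// ixt registry rules named by the ix:nonFraction format attribute.
public class NumCommaTransformSketch {

    static String transform(String displayed) {
        String canonical = displayed.trim().replace(",", "");
        // Reject results that are not parseable numbers
        new java.math.BigDecimal(canonical);
        return canonical;
    }

    public static void main(String[] args) {
        System.out.println(transform("1,234,567.89")); // 1234567.89
    }
}
```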
Specification: xBRL-JSON 1.0 (2021)
Test Count: ~100+ tests
Status: Recent addition
Coverage:
Specification: xBRL-CSV 1.0 (2021)
Test Count: ~150+ tests
Status: Recent addition
Coverage:
Specification: Extensible Enumerations 2.0 (2020)
Test Count: ~80+ tests
Coverage:
Specification: Generic Links 1.0
Test Count: ~100+ tests
Coverage:
Specification: Versioning 1.0
Test Count: ~150+ tests
Coverage:
Specification: Table Linkbase 1.0
Test Count: ~200+ tests
Coverage:
Specification: Taxonomy Packages 1.0 (2016)
Test Count: ~50+ tests
Coverage:
| Specification | Suite Available | Test Count | Priority for 2026 |
|---|---|---|---|
| XBRL 2.1 | ✅ Yes | ~500 | ⭐⭐⭐ Essential |
| Dimensions 1.0 | ✅ Yes | ~300 | ⭐⭐⭐ Essential |
| Formula 1.0 | ✅ Yes | ~1000 | ⭐⭐ High |
| Inline XBRL 1.1 | ✅ Yes | ~200 | ⭐⭐ High |
| xBRL-JSON 1.0 | ✅ Yes | ~100 | ⭐ Medium |
| xBRL-CSV 1.0 | ✅ Yes | ~150 | ⭐ Medium |
| Extensible Enum 2.0 | ✅ Yes | ~80 | ⭐ Medium |
| Generic Links 1.0 | ✅ Yes | ~100 | ⚪ Low |
| Versioning 1.0 | ✅ Yes | ~150 | ⚪ Low |
| Table Linkbase 1.0 | ✅ Yes | ~200 | ⚪ Low |
| Taxonomy Package 1.0 | ✅ Yes | ~50 | ⚪ Low |
| Calculations 2.0 | ⚠️ Likely | TBD | ⭐ Medium |
| Report Package 1.0 | ⚠️ Future | TBD | ⚪ Low |
Total Test Cases: ~3,000+ across all suites
Typical conformance suite organization:
XBRL-CONF-2024/
├── index.xml (Root test index)
├── catalog.xml (URL catalog)
├── README.txt (Documentation)
│
├── Common/ (Shared resources)
│ ├── lib/ (Common schemas)
│ └── base/ (Base taxonomies)
│
├── 100-BasicTests/ (Test group)
│ ├── testcases.xml (Test index)
│ ├── 101-ValidInstance/ (Test case)
│ │ ├── 101-testcase.xml (Test definition)
│ │ ├── 101-V01.xml (Test variation 1)
│ │ ├── 101-V02.xml (Test variation 2)
│ │ └── schema.xsd (Supporting files)
│ └── 102-InvalidContext/
│ └── ...
│
├── 200-SchemaTests/
│ └── ...
│
├── 300-InstanceTests/
│ └── ...
│
└── Documentation/
└── conformance-suite.html
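Given a layout like the one above, test definitions can be located by a filename-convention scan, sketched below with Java NIO. Treat this as a shortcut under an assumed naming convention: a robust harness should instead follow the root index.xml, since file naming differs between suites.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;

// Sketch: find every test-case definition under a suite root by the
// assumed "*testcase.xml" naming convention. Prefer following index.xml
// in a real harness; conventions vary between suites.
public class SuiteScanner {

    static List<Path> findTestCaseFiles(Path suiteRoot) throws IOException {
        List<Path> found = new ArrayList<>();
        try (Stream<Path> walk = Files.walk(suiteRoot)) {
            walk.filter(p -> p.getFileName().toString().endsWith("testcase.xml"))
                .sorted()
                .forEach(found::add);
        }
        return found;
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny throwaway suite layout to demonstrate the scan
        Path root = Files.createTempDirectory("suite");
        Path dir = Files.createDirectories(
            root.resolve("100-BasicTests/101-ValidInstance"));
        Files.writeString(dir.resolve("101-testcase.xml"), "<testcase/>");
        System.out.println(findTestCaseFiles(root)); // prints a one-element list
    }
}
```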
<?xml version="1.0" encoding="UTF-8"?>
<testcase
xmlns="http://xbrl.org/2008/conformance"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xbrl.org/2008/conformance
../conformance-test.xsd"
minimal="false">
<!-- Test Case Metadata -->
<number>301</number>
<name>Context Validation</name>
<description>
Tests validation of context element structure,
including entity, period, and scenario.
</description>
<!-- Reference to specification -->
<reference
specification="XBRL-2.1"
id="http://www.xbrl.org/2003/role/recommendedDisclosure"
name="Context element (4.7)"
label="XBRL 2.1 Recommendation Section 4.7"/>
<!-- Test Variations -->
<variation id="V-01">
<name>Valid context with instant period</name>
<description>
Context with valid entity identifier and
instant period date.
</description>
<!-- Inputs -->
<data>
<schema>301-schema.xsd</schema>
<instance>301-V01-valid.xml</instance>
</data>
<!-- Expected Result -->
<result expected="valid"/>
</variation>
<variation id="V-02">
<name>Invalid context - missing entity</name>
<description>
Context without required entity element.
Must produce error.
</description>
<data>
<instance>301-V02-invalid.xml</instance>
</data>
<result expected="invalid">
<error>xbrl:entityMissing</error>
</result>
</variation>
<variation id="V-03">
<name>Invalid context - invalid period</name>
<description>
Context with malformed period element.
</description>
<data>
<instance>301-V03-invalid.xml</instance>
</data>
<result expected="invalid">
<error>xbrl:periodInvalid</error>
</result>
</variation>
</testcase>
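A minimal DOM pass can pull the variation ids and expected results out of such a document. The sketch below parses an inline string so it is self-contained; element names follow the example above, and a production parser should be fully namespace-aware and validate against the conformance schema rather than match local names loosely.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch: extract variation ids and expected results from a test-case
// document. A production parser should validate against the conformance
// schema instead of matching local names with a wildcard namespace.
public class TestCaseParserSketch {

    record Variation(String id, String expected) {}

    static List<Variation> parseVariations(String xml) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder()
            .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        List<Variation> out = new ArrayList<>();
        NodeList variations = doc.getElementsByTagNameNS("*", "variation");
        for (int i = 0; i < variations.getLength(); i++) {
            Element v = (Element) variations.item(i);
            Element result = (Element) v.getElementsByTagNameNS("*", "result").item(0);
            out.add(new Variation(v.getAttribute("id"), result.getAttribute("expected")));
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        String xml = """
            <testcase xmlns="http://xbrl.org/2008/conformance">
              <variation id="V-01"><result expected="valid"/></variation>
              <variation id="V-02"><result expected="invalid"/></variation>
            </testcase>""";
        System.out.println(parseVariations(xml));
    }
}
```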
<?xml version="1.0" encoding="UTF-8"?>
<testcases
xmlns="http://xbrl.org/2008/conformance"
name="XBRL 2.1 Conformance Suite"
date="2024-01-15">
<!-- Test Groups -->
<testcase
uri="100-BasicTests/101-ValidInstance/101-testcase.xml"
name="Valid Instance Structure"/>
<testcase
uri="100-BasicTests/102-InvalidRoot/102-testcase.xml"
name="Invalid Root Element"/>
<testcase
uri="200-SchemaTests/201-ConceptDeclaration/201-testcase.xml"
name="Concept Declaration"/>
<!-- ... hundreds more ... -->
</testcases>
Types of input files:
1. Valid/Invalid Classification
<result expected="valid"/>
or
<result expected="invalid">
<error>xbrl:contextDateInvalid</error>
</result>
2. Specific Error Codes
Error codes defined in specifications:
xbrl:contextMissing
xbrl:periodInvalid
xbrli:tupleContainingText
xbrldt:hypercubeMissing
3. Calculated Values
For full conformance tests:
<result expected="valid">
<assertionSatisfied id="assertion1"/>
<assertionUnsatisfied id="assertion2">
<message>Revenue must be positive</message>
</assertionUnsatisfied>
</result>
4. Generated Files
Some tests require output files (PTVLI, PTVI):
<result>
<instance>expected-output.xml</instance>
</result>
┌─────────────────────────────────────────────┐
│ Test Harness │
├─────────────────────────────────────────────┤
│ 1. Parse test index XML │
│ 2. For each test case: │
│ - Load test definition │
│ - For each variation: │
│ • Prepare input files │
│ • Run processor │
│ • Capture output │
│ • Compare with expected │
│ • Record result (pass/fail) │
│ 3. Generate test report │
└─────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────┐
│ Your XBRL Processor │
│ (Being tested for conformance) │
└─────────────────────────────────────────────┘
public class XBRLConformanceTestHarness {
private File conformanceSuiteRoot;
private XBRLProcessor processor;
private TestResultCollector results;
public XBRLConformanceTestHarness(
File suiteRoot,
XBRLProcessor processor) {
this.conformanceSuiteRoot = suiteRoot;
this.processor = processor;
this.results = new TestResultCollector();
    }

    public TestResultCollector getResults() {
        return results;
    }
public TestReport runAllTests() throws Exception {
// Parse test index
File indexFile = new File(conformanceSuiteRoot, "index.xml");
TestIndex index = parseTestIndex(indexFile);
System.out.println("XBRL Conformance Test Suite");
System.out.println("Suite: " + index.getName());
System.out.println("Total test cases: " + index.getTestCaseCount());
System.out.println();
// Run each test case
int testNum = 0;
for (TestCaseReference ref : index.getTestCases()) {
testNum++;
System.out.printf("[%d/%d] Running: %s%n",
testNum,
index.getTestCaseCount(),
ref.getName());
runTestCase(ref);
}
// Generate report
return results.generateReport();
}
private void runTestCase(TestCaseReference ref)
throws Exception {
// Load test case XML
File testCaseFile = new File(conformanceSuiteRoot, ref.getUri());
TestCase testCase = parseTestCase(testCaseFile);
// Run each variation
for (TestVariation variation : testCase.getVariations()) {
runVariation(testCase, variation);
}
}
private void runVariation(TestCase testCase,
TestVariation variation) {
try {
System.out.printf(" Variation %s: %s%n",
variation.getId(),
variation.getName());
// Prepare inputs
List<File> inputFiles = resolveInputFiles(
testCase, variation);
// Run processor
ProcessorResult result = processor.process(inputFiles);
// Compare with expected result
boolean passed = compareResults(
result,
variation.getExpectedResult());
// Record result
if (passed) {
System.out.println(" ✓ PASS");
results.recordPass(testCase, variation);
} else {
System.out.println(" ✗ FAIL");
results.recordFailure(
testCase,
variation,
result,
variation.getExpectedResult());
}
} catch (Exception e) {
System.out.println(" ✗ ERROR: " + e.getMessage());
results.recordError(testCase, variation, e);
}
}
private boolean compareResults(ProcessorResult actual,
ExpectedResult expected) {
// Compare validity
if (expected.isValid() != actual.isValid()) {
return false;
}
// If expected invalid, check error codes
if (!expected.isValid()) {
Set<String> expectedErrors = expected.getErrorCodes();
Set<String> actualErrors = actual.getErrorCodes();
// Some tests expect invalidity without naming a specific code
if (expectedErrors.isEmpty()) {
return true;
}
// Otherwise must produce at least one expected error
for (String expectedError : expectedErrors) {
if (actualErrors.contains(expectedError)) {
return true;
}
}
return false;
}
// For valid results, check any specific outputs
if (expected.hasExpectedOutput()) {
return compareOutputs(
actual.getOutput(),
expected.getOutput());
}
return true;
}
private List<File> resolveInputFiles(
TestCase testCase,
TestVariation variation) throws IOException {
List<File> files = new ArrayList<>();
File testCaseDir = testCase.getDirectory();
// Schema files
for (String schemaPath : variation.getSchemas()) {
files.add(new File(testCaseDir, schemaPath));
}
// Linkbase files
for (String linkbasePath : variation.getLinkbases()) {
files.add(new File(testCaseDir, linkbasePath));
}
// Instance files
for (String instancePath : variation.getInstances()) {
files.add(new File(testCaseDir, instancePath));
}
return files;
}
}
public class TestResultCollector {
private int totalTests = 0;
private int passedTests = 0;
private int failedTests = 0;
private int errorTests = 0;
private List<TestFailure> failures = new ArrayList<>();
private List<TestError> errors = new ArrayList<>();
public void recordPass(TestCase testCase,
TestVariation variation) {
totalTests++;
passedTests++;
}
public void recordFailure(TestCase testCase,
TestVariation variation,
ProcessorResult actual,
ExpectedResult expected) {
totalTests++;
failedTests++;
TestFailure failure = new TestFailure();
failure.setTestCase(testCase);
failure.setVariation(variation);
failure.setActualResult(actual);
failure.setExpectedResult(expected);
failures.add(failure);
}
public void recordError(TestCase testCase,
TestVariation variation,
Exception error) {
totalTests++;
errorTests++;
TestError testError = new TestError();
testError.setTestCase(testCase);
testError.setVariation(variation);
testError.setException(error);
errors.add(testError);
}
public TestReport generateReport() {
TestReport report = new TestReport();
report.setTotalTests(totalTests);
report.setPassedTests(passedTests);
report.setFailedTests(failedTests);
report.setErrorTests(errorTests);
report.setPassRate(totalTests == 0
? 0.0
: (double) passedTests / totalTests * 100.0);
report.setFailures(failures);
report.setErrors(errors);
return report;
}
public void printSummary() {
System.out.println();
System.out.println("=".repeat(60));
System.out.println("CONFORMANCE TEST SUMMARY");
System.out.println("=".repeat(60));
System.out.printf("Total Tests: %d%n", totalTests);
System.out.printf("Passed: %d (%.1f%%)%n",
passedTests,
(double) passedTests / totalTests * 100.0);
System.out.printf("Failed: %d (%.1f%%)%n",
failedTests,
(double) failedTests / totalTests * 100.0);
System.out.printf("Errors: %d (%.1f%%)%n",
errorTests,
(double) errorTests / totalTests * 100.0);
System.out.println("=".repeat(60));
if (failedTests > 0) {
System.out.println();
System.out.println("FAILURES:");
for (TestFailure failure : failures) {
System.out.printf(" - %s / %s%n",
failure.getTestCase().getName(),
failure.getVariation().getName());
System.out.printf(" Expected: %s%n",
failure.getExpectedResult());
System.out.printf(" Actual: %s%n",
failure.getActualResult());
}
}
if (errorTests > 0) {
System.out.println();
System.out.println("ERRORS:");
for (TestError error : errors) {
System.out.printf(" - %s / %s%n",
error.getTestCase().getName(),
error.getVariation().getName());
System.out.printf(" Error: %s%n",
error.getException().getMessage());
}
}
}
}
public class ConformanceTestRunner {
public static void main(String[] args) throws Exception {
// Path to conformance suite
File suiteRoot = new File(
"/path/to/XBRL-CONF-2024");
// Your processor
XBRLProcessor processor = new YourXBRLProcessor();
// Create test harness
XBRLConformanceTestHarness harness =
new XBRLConformanceTestHarness(suiteRoot, processor);
// Run all tests
System.out.println("Starting conformance test run...");
System.out.println();
TestReport report = harness.runAllTests();
// Print summary
harness.getResults().printSummary();
// Generate HTML report
HTMLReportGenerator.generateReport(
report,
new File("conformance-report.html"));
// Exit with proper code
if (report.getFailedTests() == 0 &&
report.getErrorTests() == 0) {
System.out.println();
System.out.println("✓ ALL TESTS PASSED!");
System.exit(0);
} else {
System.out.println();
System.out.println("✗ SOME TESTS FAILED");
System.exit(1);
}
}
}
For updating your 2014-2018 processor:
XBRL 2.1 Core Tests
Dimensions Tests
New Format Tests
Formula Tests
Optional Specifications
1. XML Base Resolution
2. Context ID References
3. Period Type Validation
4. Dimensional Validation
5. XPath 2.0 Evaluation
6. Namespace Handling
7. Tuple Validation
8. Calculation Relationships
9. Linkbase Resolution
10. Substitution Groups
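Challenge 1, XML Base resolution, reduces to RFC 3986 relative-reference resolution, which `java.net.URI` already implements. The sketch below shows the single resolution step a DTS discoverer performs for each schemaLocation or linkbase href; nested xml:base attributes are handled by resolving each one against its ancestor's base in turn.

```java
import java.net.URI;

// Sketch of relative-reference resolution as needed for xml:base and
// href handling during DTS discovery. java.net.URI follows the RFC 3986
// resolution algorithm that XBRL processors rely on.
public class XmlBaseResolutionSketch {

    static URI resolveHref(String base, String href) {
        return URI.create(base).resolve(href);
    }

    public static void main(String[] args) {
        System.out.println(resolveHref(
            "http://example.com/taxonomy/2024/schema.xsd",
            "../lib/common.xsd"));
        // prints http://example.com/taxonomy/lib/common.xsd
    }
}
```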
# .github/workflows/conformance-tests.yml
name: XBRL Conformance Tests
on:
push:
branches: [ main, develop ]
pull_request:
branches: [ main ]
jobs:
conformance:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up JDK 17
uses: actions/setup-java@v3
with:
java-version: '17'
distribution: 'temurin'
- name: Cache conformance suites
uses: actions/cache@v3
with:
path: ~/conformance-suites
key: xbrl-conformance-2024
- name: Download conformance suites
run: |
./scripts/download-conformance-suites.sh
- name: Build processor
run: mvn clean package
- name: Run XBRL 2.1 tests
run: |
java -jar target/conformance-harness.jar \
--suite ~/conformance-suites/XBRL-CONF-2024 \
--report xbrl-21-report.xml
- name: Run Dimensions tests
run: |
java -jar target/conformance-harness.jar \
--suite ~/conformance-suites/XDT-CONF-2024 \
--report dimensions-report.xml
- name: Run Formula tests (if enabled)
if: env.FORMULA_ENABLED == 'true'
run: |
java -jar target/conformance-harness.jar \
--suite ~/conformance-suites/FORMULA-CONF-2013 \
--report formula-report.xml
- name: Generate HTML report
run: |
java -jar target/report-generator.jar \
--input *.xml \
--output conformance-report.html
- name: Upload report
uses: actions/upload-artifact@v3
with:
name: conformance-reports
path: |
*-report.xml
conformance-report.html
- name: Check results
run: |
./scripts/check-conformance-results.sh
- name: Fail if tests failed
if: failure()
run: exit 1
<!-- pom.xml -->
<project>
<properties>
<conformance.suite.root>
${user.home}/conformance-suites
</conformance.suite.root>
</properties>
<build>
<plugins>
<!-- Conformance test plugin -->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>3.1.0</version>
<executions>
<execution>
<id>conformance-tests</id>
<phase>integration-test</phase>
<goals>
<goal>java</goal>
</goals>
<configuration>
<mainClass>
com.example.xbrl.ConformanceTestRunner
</mainClass>
<arguments>
<argument>
--suite-root=${conformance.suite.root}
</argument>
<argument>
--report=target/conformance-report.xml
</argument>
</arguments>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<id>conformance</id>
<build>
<plugins>
<!-- Run conformance tests -->
</plugins>
</build>
</profile>
</profiles>
</project>
Usage:
# Run conformance tests
mvn clean verify -Pconformance
# Run specific suite
mvn exec:java -Dexec.mainClass="..." \
-Dexec.args="--suite XBRL-21 --report report.xml"
public class HTMLReportGenerator {
public static void generateReport(TestReport report, File output)
throws IOException {
StringBuilder html = new StringBuilder();
html.append("<!DOCTYPE html>\n");
html.append("<html>\n<head>\n");
html.append("<title>XBRL Conformance Test Report</title>\n");
html.append("<style>\n");
html.append(getCSS());
html.append("</style>\n");
html.append("</head>\n<body>\n");
// Header
html.append("<h1>XBRL Conformance Test Report</h1>\n");
html.append("<p>Generated: " + new Date() + "</p>\n");
// Summary
html.append("<div class='summary'>\n");
html.append("<h2>Summary</h2>\n");
html.append("<table>\n");
html.append("<tr><th>Total Tests</th><td>")
.append(report.getTotalTests())
.append("</td></tr>\n");
html.append("<tr><th>Passed</th><td class='pass'>")
.append(report.getPassedTests())
.append(" (")
.append(String.format("%.1f%%", report.getPassRate()))
.append(")</td></tr>\n");
html.append("<tr><th>Failed</th><td class='fail'>")
.append(report.getFailedTests())
.append("</td></tr>\n");
html.append("<tr><th>Errors</th><td class='error'>")
.append(report.getErrorTests())
.append("</td></tr>\n");
html.append("</table>\n");
html.append("</div>\n");
// Progress bar
double passPercent = report.getPassRate();
html.append("<div class='progress'>\n");
html.append("<div class='progress-bar' style='width: ")
.append(passPercent)
.append("%'></div>\n");
html.append("</div>\n");
// Failures
if (report.getFailedTests() > 0) {
html.append("<div class='failures'>\n");
html.append("<h2>Failures</h2>\n");
html.append("<table>\n");
html.append("<tr><th>Test</th><th>Variation</th>")
.append("<th>Expected</th><th>Actual</th></tr>\n");
for (TestFailure failure : report.getFailures()) {
html.append("<tr>\n");
html.append("<td>")
.append(escapeHtml(failure.getTestCase().getName()))
.append("</td>\n");
html.append("<td>")
.append(escapeHtml(failure.getVariation().getName()))
.append("</td>\n");
html.append("<td>")
.append(escapeHtml(
failure.getExpectedResult().toString()))
.append("</td>\n");
html.append("<td>")
.append(escapeHtml(
failure.getActualResult().toString()))
.append("</td>\n");
html.append("</tr>\n");
}
html.append("</table>\n");
html.append("</div>\n");
}
// Errors
if (report.getErrorTests() > 0) {
html.append("<div class='errors'>\n");
html.append("<h2>Errors</h2>\n");
html.append("<table>\n");
html.append("<tr><th>Test</th><th>Variation</th>")
.append("<th>Error</th></tr>\n");
for (TestError error : report.getErrors()) {
html.append("<tr>\n");
html.append("<td>")
.append(escapeHtml(error.getTestCase().getName()))
.append("</td>\n");
html.append("<td>")
.append(escapeHtml(error.getVariation().getName()))
.append("</td>\n");
html.append("<td><pre>")
.append(escapeHtml(getStackTrace(error.getException())))
.append("</pre></td>\n");
html.append("</tr>\n");
}
html.append("</table>\n");
html.append("</div>\n");
}
html.append("</body>\n</html>");
// Write to file
Files.writeString(output.toPath(), html.toString());
}
private static String getCSS() {
return """
body { font-family: Arial, sans-serif; margin: 20px; }
h1 { color: #333; }
h2 { color: #666; margin-top: 30px; }
table { border-collapse: collapse; width: 100%; margin: 20px 0; }
th, td { border: 1px solid #ddd; padding: 12px; text-align: left; }
th { background-color: #f2f2f2; }
.summary table { width: 50%; }
.pass { color: green; font-weight: bold; }
.fail { color: red; font-weight: bold; }
.error { color: orange; font-weight: bold; }
.progress { width: 100%; height: 30px; background: #f0f0f0;
border-radius: 5px; margin: 20px 0; }
.progress-bar { height: 100%; background: #4CAF50;
border-radius: 5px; }
pre { background: #f5f5f5; padding: 10px; overflow-x: auto; }
""";
}
}
Conformance suites are available to XBRL International members.
Membership Options:
Benefits:
Cost: Varies by membership level (~$1,000-$25,000/year)
For Members:
https://specifications.xbrl.org/
→ Each specification page has "Conformance Suite" link
→ Member login required
→ Download ZIP files
Conformance Suite Files:
XBRL-CONF-2024-01-15.zip (~50 MB)
XDT-CONF-2024-01-15.zip (~25 MB)
FORMULA-CONF-REC-2013-09-12.zip (~100 MB)
INLINE-XBRL-CONF-2013-11-18.zip (~30 MB)
... etc
# Create conformance directory
mkdir -p ~/conformance-suites
cd ~/conformance-suites
# Extract suites
unzip XBRL-CONF-2024-01-15.zip
unzip XDT-CONF-2024-01-15.zip
unzip FORMULA-CONF-REC-2013-09-12.zip
# Verify structure
ls -la
# XBRL-CONF-2024-01-15/
# XDT-CONF-2024-01-15/
# FORMULA-CONF-REC-2013-09-12/
1. Test-Driven Development
Write test → Run conformance → Fix issue → Repeat
2. Incremental Implementation
3. Regression Prevention
4. Issue Tracking
Suite Execution Time:
Optimization Strategies:
1. Parallel Execution
// Run test cases on a fixed-size worker pool
ExecutorService executor = Executors.newFixedThreadPool(4);
List<Future<TestResult>> futures = new ArrayList<>();
for (TestCase testCase : testCases) {
    futures.add(executor.submit(() -> runTestCase(testCase)));
}
// Collect results; Future.get blocks until each task completes
for (Future<TestResult> future : futures) {
    results.add(future.get());
}
executor.shutdown();
2. Smart Caching
3. Selective Testing
Step-by-Step Process:
1. Isolate the test
java -jar harness.jar --test 301-03 --verbose
2. Read test documentation
<description>
Context with invalid period element.
Must produce xbrl:periodInvalid error.
</description>
3. Examine input files
cat 301-03-invalid.xml
4. Run processor manually
java -jar your-processor.jar 301-03-invalid.xml
5. Compare actual vs expected
Expected: xbrl:periodInvalid
Actual: xbrl:contextInvalid
6. Fix and re-test
# Fix code
# Re-run test
java -jar harness.jar --test 301-03
Week 1-2: Foundation
Week 3-4: Core Features
Week 5-8: Modern Formats
Week 9-12: Formula (if needed)
Week 13-16: Polish
Initial Target (3 months):
Production Ready (6 months):
Certification Target:
Official Resources:
Community Resources:
Example Implementations:
Conformance suites are essential for XBRL processor development. They provide:
For the 2014-2018 processor update:
Bottom line: Passing conformance suites is not optional; it is the definition of a conformant XBRL processor.
This document covers XBRL conformance testing as of January 2026, including all available conformance suites and modern testing practices.