An iterative exploration of AI-assisted PCB variance analysis using Claude
Overview
As part of my ongoing ‘mad scientist’ automotive reverse engineering/hacking work, I recently needed to compare two somewhat similar circuit boards to identify any component-level differences between them. Rather than doing this purely by eye, I decided to try leveraging Claude (Anthropic’s AI assistant) to assist with the analysis.
What followed was an iterative back-and-forth process that tested the boundaries of what AI can and can’t do well with image-based circuit board analysis. This post documents each attempt, what worked, what didn’t, and what ultimately proved most useful.
The two boards under comparison are:
- Board 1 — Reference board, used in various General Motors (GM) vehicles to control certain interior components
- Board 2 — Comparison board, also commonly used in GM vehicles for other interior components

Figure 1 — Reference Board

Figure 2 — Board 2: Comparison Module (top side)
Attempt 1: Initial Board Identification & Component Inventory
The Ask
The first step was simply to see how well Claude could identify what was on the board from a single high-resolution photo. I uploaded Board 1 and asked for a general analysis.
What Claude Got Right
Claude did a solid job of identifying the major, clearly visible components:
- Correctly identified all four TAIKO HTB2-160 12VDC relays and their 2×2 arrangement
- Identified the large QFP main MCU (noting the conformal coating made the markings unreadable)
- Correctly identified the two primary connectors — a white multi-pin harness connector and a smaller black blade connector
- Noted the board manufacturer and decoded the date code (week 34, 2005)
- Correctly assessed the board’s general condition as good with no visible damage
- Provided useful insight into the general function of several of the identified components
- Correctly inferred some of the larger parts that are not clearly marked — for instance, identifying the device in the upper-right quadrant as a Motorola HC08 microcontroller
- Finally, using the information visible on the board, it was able to infer the board’s likely application
What Needed Improvement
The initial component list significantly under-counted the number of distinct components on the board. Many smaller SMD discretes, resistor networks, and secondary ICs were either missed entirely or lumped together without individual callouts. The markup image created at this stage was too sparse — labeling only the most obvious components rather than cataloguing everything.

Figure 3 — First annotated image attempt. Captured major components but missed significant detail.
Attempt 2: Automated Pixel-Difference Comparison
The Ask
With both board images in hand, the next logical step was to ask Claude to perform a difference analysis — essentially treating the two images as raw pixel data and highlighting regions where they diverge. The goal was to quickly flag all areas of variance without manual review.
The Approach
Claude computed a per-pixel absolute difference across all RGB channels, applied Gaussian smoothing to reduce JPEG compression noise, thresholded the result, and then used connected-component labeling to identify distinct “blobs” of difference. These regions were then drawn as numbered bounding boxes on both boards in a side-by-side comparison image, and also shown as a red heat-map overlay.
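As a rough sketch, that pipeline looks something like the following. The function name and the sigma/threshold/area parameters here are illustrative assumptions, not the exact values Claude used:

```python
import numpy as np
from scipy import ndimage

def diff_regions(img_a, img_b, sigma=3.0, thresh=40, min_area=200):
    """Flag regions where two same-sized RGB photos diverge.

    img_a, img_b: uint8 arrays of shape (H, W, 3), assumed pre-cropped
    to identical dimensions. Parameter values are illustrative.
    """
    # Per-pixel absolute difference, summed across the RGB channels
    diff = np.abs(img_a.astype(np.int16) - img_b.astype(np.int16)).sum(axis=2)

    # Gaussian smoothing suppresses JPEG compression noise
    smoothed = ndimage.gaussian_filter(diff.astype(np.float32), sigma=sigma)

    # Threshold, then connected-component labeling to find distinct "blobs"
    mask = smoothed > thresh
    labels, _ = ndimage.label(mask)

    boxes = []
    for sl in ndimage.find_objects(labels):
        h = sl[0].stop - sl[0].start
        w = sl[1].stop - sl[1].start
        if h * w >= min_area:  # drop tiny speckle regions
            boxes.append((sl[1].start, sl[0].start, w, h))  # (x, y, w, h)
    return boxes
```

The returned bounding boxes are what get drawn as numbered rectangles on the side-by-side image; the smoothed difference array itself is what drives the red heat-map overlay.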

Figure 4 — Heatmap overlay showing pixel-level differences (red intensity = degree of difference).

Figure 5 — Side-by-side comparison with 39 flagged difference regions.
Why It Didn’t Work
The automated approach flagged 39 distinct regions — but after manual review, the vast majority were false positives caused by:
- Slight differences in camera angle and zoom between the two shots
- Lighting and shadow variation across the board surface
- The conformal coating creating different specular reflections
- Minor parallax shifts making the same components appear in slightly different pixel positions
Of the 39 flagged regions, only 2 were actually valid differences (regions 10 & 11). More critically, Claude’s automated approach missed some obvious real differences that a human eye catches immediately — such as a resistor populated at position 3 on Board 2 that is absent on Board 1.
NOTE: This is a known limitation of naive pixel-diff approaches for real-world photos. They work well for identical imaging conditions (same camera, tripod, lighting) but fall apart with hand-held photos under variable conditions.
Attempt 3: Zone-by-Zone Manual Walkthrough
The Ask
After the automated approach proved unreliable, I switched to a more methodical, human-guided approach. Rather than asking Claude to find differences algorithmically, the board would be divided into a systematic grid of zones, and tight side-by-side crops of each zone would be generated for manual human review.
The Grid
The board was divided into a 4-column × 3-row grid, yielding 12 zones labeled A1–A4 (top row), B1–B4 (middle row), and C1–C4 (bottom row). Each zone was exported as a side-by-side comparison image showing both boards at matched zoom levels, with a 40-pixel overlap between zones to avoid missing anything at the boundaries.
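The grid math is simple enough to sketch directly. The function name below is assumed; the labels and the 40-pixel overlap match the scheme described above:

```python
def zone_boxes(width, height, cols=4, rows=3, overlap=40):
    """Compute crop boxes for a cols x rows grid with overlapping borders.

    Zones are labeled A1..A4 (top row), B1..B4, C1..C4, as in the article.
    The overlap ensures a component straddling a boundary shows up whole
    in at least one adjacent crop instead of being split.
    """
    zones = {}
    zw, zh = width // cols, height // rows
    for r in range(rows):
        for c in range(cols):
            label = "ABC"[r] + str(c + 1)
            left = max(c * zw - overlap, 0)
            top = max(r * zh - overlap, 0)
            right = min((c + 1) * zw + overlap, width)
            bottom = min((r + 1) * zh + overlap, height)
            zones[label] = (left, top, right, bottom)
    return zones
```

Each `(left, top, right, bottom)` tuple can be fed straight into an image library’s crop call to produce the per-zone comparison images.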

Figure 6 — Zone grid overlay showing the 12 analysis areas on Board 1.
Narrowing the Focus
After reviewing all 12 zones, most showed no material differences. Attention was focused on four zones that warranted closer scrutiny: A3, A4, B3, and B4. Tighter crops were generated for each of these at higher resolution.
Zone B3 — Mid Center-Right (Resistor Ladder / Cap Area)
This zone contained the most significant confirmed differences. The board silkscreen labels a row of resistor positions as 0, 1, 2, 3.

Figure 7 — Zone B3 focused comparison. Left: Board 1 (Good Module). Right: Board 2 (Comparison).
Confirmed difference: Board 1 has resistors populated at positions 1 and 2 only. Board 2 has resistors at positions 1, 2, and 3. Position 0 is unpopulated on both boards. The additional resistor at position 3 on Board 2 represents a deliberate configuration difference between the two modules.
Zone B4 — Mid Right

Figure 8 — Zone B4 focused comparison.
A potential difference was noted in the stacked IC pair visible in this zone. The component marking reads 2432/1003 on Board 1 versus 2432/E001 on Board 2 — the lower IC marking differs. This may represent a different component variant or could be a conformal coating / angle artifact. Worth physical verification.
The resistor ladder (positions 0–3) continues into this zone from B3, showing the same population pattern — Board 2 has a resistor at position 3 where Board 1 does not.
Zone A3 — Upper Center-Right (MCU Area Top)

Figure 9 — Zone A3 focused comparison. Both boards appear identical in this zone.
No differences identified. Component count, placement, and types all match between both boards in this zone.
Zone A4 — Upper Right (IC / MCU Area)

Figure 10 — Zone A4 focused comparison. Cosmetic difference only.
No component-level differences identified. Board 2’s main MCU has a prominent scratch through its conformal coating (visible as a white diagonal line across the IC body), but this is a cosmetic/handling mark and not a component or configuration difference.
Conclusions & Takeaways
Confirmed Differences Between the Two Boards
- Resistor ladder positions 0–3 (Zone B3/B4): Board 1 populated at positions 1 & 2 only. Board 2 populated at positions 1, 2, & 3.
- Stacked IC marking (Zone B4): Board 1 shows 2432/1003; Board 2 shows 2432/E001 — requires physical verification.
- Main connector (Zone C4): Board 1 has a white connector housing; Board 2 has a red connector housing. (Functionally equivalent; cosmetic difference only.)
- Date codes on relays differ: Board 1 date code 053262 vs Board 2 date code 064362 — consistent with the different board manufacturing dates.
What Worked Well
- Claude was excellent at initial board identification — decoding silkscreen text, identifying major components, and providing context about the board’s likely function.
- The zone-by-zone grid approach was the most productive method — generating structured, human-reviewable comparison crops that made real differences obvious.
- Having Claude generate the comparison image infrastructure (cropping, labeling, side-by-side layout) saved significant manual work.
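The layout step in that infrastructure amounts to only a few lines of Pillow. This is a minimal sketch, with the function name, labels, and padding values assumed for illustration:

```python
from PIL import Image, ImageDraw

def side_by_side(img_a, img_b, label_a="Board 1", label_b="Board 2", pad=10):
    """Compose two zone crops into one labeled comparison image."""
    h = max(img_a.height, img_b.height)
    w = img_a.width + img_b.width + 3 * pad
    combo = Image.new("RGB", (w, h + 30), "white")  # 30 px header strip
    combo.paste(img_a, (pad, 30))
    combo.paste(img_b, (img_a.width + 2 * pad, 30))
    draw = ImageDraw.Draw(combo)
    draw.text((pad, 8), label_a, fill="black")
    draw.text((img_a.width + 2 * pad, 8), label_b, fill="black")
    return combo
```

Generating all twelve zone comparisons is then just a loop over the grid boxes, which is exactly the kind of repetitive scaffolding work that was worth delegating.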
What Didn’t Work
- Automated pixel-difference analysis was not reliable for hand-held photos taken under variable conditions. It produced far too many false positives while still missing real differences.
- The initial component inventory was too conservative — Claude under-counted the number of discrete SMD components on the board.
- Even with the zone approach, some items were missed on the first pass; for example, the resistor difference in the ladder to the lower left of the MCU in Zone B4.
AI as a Tool, Not a Solution
The key insight from this experiment is that AI assistance works best when paired with human domain knowledge directing the process. Claude was most valuable as an infrastructure tool — handling the image processing, crop generation, and layout work — while the human provided the judgment calls about what to look at and whether a flagged difference was real or a false positive.
A fully autonomous “just find all the differences” approach failed, but a collaborative human-AI workflow where the human guides the analysis zone-by-zone proved genuinely useful for this type of PCB comparison task.