My three most recent builds used ASRock Rack server boards. Two of them, the X470D4U and the ROMED8-2T, use the TPM2-S module. The odd one out is the X570D4I-2T, which uses the TPM2-SLI module. I looked around and couldn't find anywhere to buy the TPM2-SLI module, so I decided to make my own. Since I was already making one, I figured I would make the modules for the other boards too, even though those were available for purchase.
I run Windows 11 on the X470D4U and ESXi on the other two. ESXi does not support the fTPM implementation for host attestation, hence the need for a hardware TPM module instead.
This proved to be beneficial for me on the X470D4U for two reasons:
- The latest available BIOS ships an AMD AGESA version that still has the stuttering issue when the fTPM is used on machines running Windows 11
- The GPU is inserted in the last slot of the board. The TPM2-S module is a vertical module, which would prevent my GPU from being seated into the board itself
I started off by scouring the interwebs for someone who had done this before. TheJeffChen over at the LTT forum posted about one he made for his ASUS board, and was kind enough to post his completed board along with the components needed.
I dug around some more trying to understand the pinouts; those were detailed in the motherboard manuals for my boards.
'*' denotes pin 1
The 14-pin module uses the following pinouts:
That's fine, because when we refer to the Infineon datasheet for the SLB9665 we see that all of these pins are available. It's simply a matter of mapping the pins from the module to the header pins that plug into your board. Infineon even provides a sample schematic showing how the chip is typically wired up.
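That mapping exercise can be sanity-checked before ordering boards. A minimal sketch: the LPC-bus signal names below are the ones the SLB9665 datasheet exposes, but the header layout is a hypothetical placeholder, not the actual pinout from the ASRock manual, so substitute your own table.

```python
# LPC-bus signal set of the SLB9665 (per the Infineon datasheet).
SLB9665_SIGNALS = {
    "LCLK", "LFRAME#", "LRESET#", "LAD0", "LAD1", "LAD2", "LAD3",
    "SERIRQ", "PP", "VDD", "GND",
}

# HYPOTHETICAL 14-pin header layout - a placeholder only.
# Replace with the pinout table from your motherboard manual.
header_14pin = {
    1: "LCLK", 2: "GND", 3: "LFRAME#", 4: "KEY", 5: "LRESET#",
    6: "NC", 7: "LAD3", 8: "LAD2", 9: "VDD", 10: "LAD1",
    11: "LAD0", 12: "GND", 13: "NC", 14: "SERIRQ",
}

def unmapped_signals(header: dict[int, str]) -> list[str]:
    """Return header entries that don't correspond to any chip
    signal (ignoring the polarising key and no-connect pins)."""
    return [s for s in header.values()
            if s not in SLB9665_SIGNALS and s not in {"KEY", "NC"}]

print(unmapped_signals(header_14pin))  # [] -> every pin accounted for
```

If the function returns anything, you have a header pin that the chip can't drive, which is exactly the situation the 18-pin header puts you in below.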
The 18-pin module uses the following pinouts:
Now this pinout is confusing. The SLB9665 only has 10 pins that match the pins in the configuration above, which leaves 4 additional pins with nowhere to hook up. It took a little searching, but this article was the gold mine for me: the 18-pin configuration was also used for TPM 1.2, and the additional pins are simply left over from chips that support TPM 1.2. These are fairly modern boards, so I'm not sure why ASRock chose the 18-pin configuration over the newer 14-pin one.
TPM 2.0 pinouts:
So now that I had my answers, I began designing the schematics per the datasheet and hooking everything up for both the 14- and 18-pin configurations.
- PCB (Fabbed mine at JLCPCB)
- 1 x 2.0mm pin header: 2x7 for the 14-pin variant, or 2x9 for the 18-pin variant
- 5 x 100nF SMD capacitors
- 2 x 10K 0603 SMD resistors
I had trouble finding the 2x9 pin headers, but 2x10 headers were available, so I bought those, snipped off one column of pins, and sanded the end smooth.
If you live in Singapore and want to try this out for yourself, I can ship you some boards of either variation as long as you pay for shipping or self collect.
If you're outside of Singapore, fabricating it yourself may be cheaper? But hey if you want to pay for shipping from Singapore, I'll oblige too.
I've included the gerber files in the repo:
This post was delayed because the 18-pin module was not working when I tested it. I eventually realized that ASRock had the pinouts reversed in their manual; once I flipped the fabricated board over, it worked instantly. So now I can push this post up.
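One way to model that reversal as a quick sanity check before re-fabbing a board (a sketch; the back-to-front numbering convention is my assumption about the manual's error, and the signal names are placeholders):

```python
def reverse_pinout(pinout: dict[int, str]) -> dict[int, str]:
    """Renumber a header pinout back-to-front, e.g. to correct a
    manual that lists the pins in reverse order."""
    n = max(pinout)
    return {n + 1 - pin: signal for pin, signal in pinout.items()}

# Toy 4-pin example with placeholder signal names:
print(reverse_pinout({1: "VDD", 2: "GND", 3: "CLK", 4: "DATA"}))
# {4: 'VDD', 3: 'GND', 2: 'CLK', 1: 'DATA'}
```

Applying it twice gets you back to the original table, which is a handy property when you're no longer sure which orientation you're holding.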
The other issue I faced was that I needed to swap the pin headers for a right-angle variant because of the GPU sitting above the header. The reversed pinout actually worked in my favour here: with the board flipped over, the module now faces downwards (exactly how I wanted it).
Depending on where you buy your TPM chips from, you may need to update the firmware on it. Use the following guides to upgrade the TPM firmware:
- Windows: https://silvenga.com/upgrading-firmware-infineon-tpm/
- Linux: https://qzhou.dev/updating-a-vulnerable-tpm
Firmware versions ending with '.2' seem to indicate a FIPS 140-2 compliant chip.
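That heuristic is easy to script. A minimal sketch: the '.2' rule is just the observation above, not something Infineon documents, and the version strings are illustrative examples, so treat the result as a hint rather than a guarantee.

```python
def looks_fips(fw_version: str) -> bool:
    """Heuristic only: Infineon TPM firmware versions ending in
    '.2' appear to be the FIPS 140-2 certified variants."""
    return fw_version.strip().split(".")[-1] == "2"

print(looks_fips("5.63.3353.2"))  # True  (example version string)
print(looks_fips("5.62.3126.0"))  # False (example version string)
```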