Category: General

All articles I write

  • Mevo Accuracy Compared To Trackman (2022)

    Quick summary:

    • I compared the Mevo ($450) to the current gold standard of Doppler radar launch monitors, the Trackman 4 ($18,995)
    • The Mevo performs extremely well for ball speed (on any given shot, less than 1% off)
    • The Mevo performs very well for club speed (usually less than 3% off)
    • The Mevo performs pretty well for carry (usually less than 5% off)
    • Spin and launch angle are highly erratic and are almost unusable (20-60% off)
    • Below are the simplest and fastest graphs to summarize the accuracy:


    Overview, methods:

    Picture of Mevo next to a Trackman 4 unit
    Where the magic happened…

    The Mevo ($450) is a portable launch monitor that uses Doppler radar to measure various parameters of a golf shot. There is a wide variety of launch monitors at various price points, but at the top end for radar models is the Trackman 4 (starting at $18,995), which is the gold standard I used for these tests. Owning one is impractical for almost everyone outside of pros and coaches, but thankfully they can be rented for $30/hour where I live. I spent a couple of sessions hitting on one and comparing its read of each shot to the Mevo’s. Both devices were calibrated for the altitude I usually play at in Akron (1,000 ft. above sea level).

    Data was collected indoors with 8 feet of unobstructed ball flight. Reflective metallic stickers were used on each ball to improve accuracy, per FlightScope’s recommendations. I used TaylorMade TP5x’s for all shots. I include “2022” in the title because a disgusting amount of data interpretation happens behind the scenes with these companies and their proprietary calculations, and FlightScope’s firmware updates improve accuracy even as the hardware stays the same.

    I think it’s realistic to break this down into two parts – accuracy for individual shots, and accuracy for club gapping purposes (i.e., looking at averages over 5-10 shots with each club). For the latter, if someone hits 10 shots with each club, the individual variation doesn’t matter much so long as the averages are close to the true (Trackman) averages. For individual shots, though, I’m going to avoid a signed “average % error”: if the Mevo mixes over-reads with under-reads, those errors cancel and the average improperly returns a value very close to zero. Instead, I’ll use the absolute value of the % difference on each shot. I’ll show the per-shot % difference for a couple of data sets (there is too much data to do this for every shot and club). For session averages, these will be true averages.
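    To make the distinction concrete, here is a minimal sketch (in Python, with made-up numbers rather than my actual session data) of why a signed average % error can hide per-shot misses that the absolute-value average exposes:

    ```python
    # Toy illustration (made-up numbers, not the actual session data): a signed
    # average % error lets over-reads and under-reads cancel out, while the
    # mean of absolute % differences reports the honest per-shot miss.

    trackman = [150.0, 148.0, 152.0, 149.0]   # "true" ball speeds, mph
    mevo     = [153.0, 145.0, 155.0, 146.0]   # over- and under-reads mixed

    pct_diffs = [(m - t) / t * 100 for m, t in zip(mevo, trackman)]

    signed_mean = sum(pct_diffs) / len(pct_diffs)
    abs_mean = sum(abs(d) for d in pct_diffs) / len(pct_diffs)

    print(f"signed mean % error:   {signed_mean:+.2f}%")  # ~0%, misleadingly clean
    print(f"mean absolute % error: {abs_mean:.2f}%")      # ~2%, the real per-shot miss
    ```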


    Below is some of the raw data after I pasted everything from both Trackman’s and FlightScope’s websites into Excel:

    Example of some of the data used to generate these graphs

    Results: Comparing session averages (5-15 shots with each club)

    Results: Comparing the parameters for individual shots:

    For 16 swings of a driver, I compared the ball speed (mph) that Trackman measured and the ball speed that Mevo measured. Below is the percent difference between Mevo’s reading and Trackman’s on each of those 16 swings. The Mevo never missed a shot and never differed by more than 0.8%. On average, it slightly misread on the low side. It is important to point out that the differences here are extremely minute – the session average on Trackman was 149.2mph, whereas Mevo’s was 148.8mph. This difference is too small to have any actionable impact, even on a professional fitting.

    Below is the same data for the other parameters:

    Results: Average difference from Trackman (Abs):

    As discussed above, I think the absolute value of each shot’s difference from Trackman is a fair way to gauge “how far from the true value might this be?” when you are using your Mevo on the range and receive a carry distance or ball speed. So below is the average of this per-shot difference:

    Discussion:

    Mevo performed admirably given its price point and competitors. Its ball speed measurements are effectively indistinguishable from Trackman’s. At 160mph, a 0.5% difference means a reading of 159.2-160.8mph, which is not significant enough for anyone to care about outside of robotic equipment testing. Club speed is largely consistent, if slightly less accurate. Launch angle was consistently over-estimated. Spin was highly erratic. Carry is influenced by spin and launch, and poorly read spin numbers skewed carry distance at times. As an example, here is a tale of two reads on a 4-hybrid shot that I struck thin:

    • Trackman: 127.2mph, 14.8° launch, 3300 rpm spin → 199.1 yard carry
    • Mevo: 127.1mph, 20.8° launch, 7250 rpm spin → 175.6 yard carry

    The gruesomely over-estimated spin and launch led to a carry read more than 20 yards short, even though Mevo nailed the ball speed.

    I think the Mevo is most useful if you have some baseline familiarity with your launch monitor numbers, particularly ball speed. For people trying to build swing speed, ball speed is an important parameter to watch with the driver, and the Mevo’s ball speed is effectively indistinguishable from your true (Trackman) ball speed. If Mevo reads a spin of 6,000rpm on a well struck driver shot, it is useful to be able to say “I know that’s not true” and throw out the carry distance, which would be skewed, while still trusting the ball speed.
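    As a minimal sketch of that workflow (Python; the plausible-spin windows are my own rough guesses for illustration, not values from FlightScope or Trackman):

    ```python
    # Sketch of the sanity check described above: trust ball speed, but discard
    # the carry read when the spin number is implausible for the club.
    # The spin windows below are rough assumptions, purely for illustration.

    PLAUSIBLE_SPIN_RPM = {
        "driver": (1500, 4000),
        "4-hybrid": (3000, 6000),
    }

    def interpret_shot(club, ball_speed_mph, spin_rpm, carry_yd):
        lo, hi = PLAUSIBLE_SPIN_RPM[club]
        spin_ok = lo <= spin_rpm <= hi
        return {
            "ball_speed_mph": ball_speed_mph,           # effectively Trackman-grade
            "carry_yd": carry_yd if spin_ok else None,  # skewed when spin is off
            "note": None if spin_ok else f"spin {spin_rpm} rpm outside {lo}-{hi}, carry discarded",
        }

    # The thin 4-hybrid from the discussion: ball speed survives, carry does not.
    print(interpret_shot("4-hybrid", 127.1, 7250, 175.6))
    ```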

    Limitations:

    Trackman isn’t infallible; even though its readings were treated as the ‘true’ values here, they are estimates too. In indoor settings some professionals prefer photometric launch monitors (e.g., the GCQuad). Trackman’s true strength lies outdoors, where it can track the full flight of a shot, so tests performed indoors are inherently limited. However, Trackman likely represents the technical ceiling for Doppler radar launch monitors as of 2022, and my goal was to see how close a $450 device can come to that standard.

    I also wonder how the presence of more than one radar device affects readings. These units are effectively floodlights, except instead of visible light from a bulb, they emit longer radio waves from an antenna (or, in Trackman’s case, an array of multiple antennae at different frequencies). I wonder whether the additional radiation from the Trackman would actually improve the performance of the Mevo, which is normally limited by its small size and its meager power as a battery-operated device. If anyone has experience in signal processing and has insight… feel free to email me!
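    For a sense of scale, here is a back-of-envelope sketch of the Doppler arithmetic these devices rely on. The 24 GHz carrier is an assumption on my part (a common consumer radar band), not a confirmed spec for either unit:

    ```python
    # Back-of-envelope Doppler arithmetic for a radar launch monitor.
    # ASSUMPTION: a 24 GHz carrier (a common consumer radar band); this is
    # not a confirmed spec for the Mevo or the Trackman 4.

    C_M_PER_S = 3.0e8        # speed of light
    F0_HZ = 24.0e9           # assumed carrier frequency
    MPH_TO_MS = 0.44704

    def doppler_shift_hz(ball_speed_mph):
        """Echo frequency shift for a ball at speed v: delta_f = 2 * v * f0 / c."""
        v = ball_speed_mph * MPH_TO_MS
        return 2 * v * F0_HZ / C_M_PER_S

    # A 149 mph drive shifts the echo by roughly 10.7 kHz -- a large, easily
    # resolved signal, consistent with ball speed being the most accurate
    # number both devices produce.
    print(f"{doppler_shift_hz(149):.0f} Hz")
    ```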


    I will add more data with irons this fall as I have time.


  • 2021 NFL All-COVID Team (Offense)

    Aaron Rodgers headlines our 2021 All-COVID team. Photo by me, September 30, 2012.

    I am honored to present the 2021-22 NFL All-COVID team, presented by the PFFA*, with the motto “COVID got their guy”. Defensive selections will be announced in a follow-up post.

    *PFFA, the Pro Football Fans Association, is not a real association and its only member is me. For legal reasons I’d like to emphasize that I have no relationship to the NFL. 

    Brief selection notes:

    This is my selection of an All-Pro team consisting only of players who contracted COVID this season. Players who contracted COVID only during the 2020 season were not eligible (e.g., Trent Williams). Players who contracted it during the camps and OTAs that preceded the 2021 season were eligible (e.g., Penei Sewell). Players needed to test positive for the virus – merely landing on the COVID list doesn’t count, as that list also includes players in the protocol as close contacts, etc.

    The offensive selections:

    Offense
    Position       First team                      Second team
    Quarterback    Aaron Rodgers, Green Bay        Lamar Jackson, Baltimore
    Running back   Dalvin Cook, Minnesota          Austin Ekeler, LA Chargers
    Wide receiver  Davante Adams, Green Bay        Mike Evans, Tampa Bay
    Wide receiver  Tyreek Hill, Kansas City        Keenan Allen, LA Chargers
    Wide receiver  Amari Cooper, Dallas            Mike Williams, LA Chargers
    Tight end      Travis Kelce, Kansas City       Darren Waller, Las Vegas
    Left tackle    Rashawn Slater, LA Chargers     Taylor Lewan, Tennessee
    Left guard     Quenton Nelson, Indianapolis    Jon Feliciano, Buffalo
    Center         Corey Linsley, LA Chargers      Matt Paradis, Carolina
    Right guard    Zack Martin, Dallas             Mark Glowinski, Indianapolis
    Right tackle   Penei Sewell, Detroit           D.J. Humphries, Arizona


  • Liquid Cooler for the Surface Pro 7

    Liquid cooler on the back of my Surface Pro

    I used a liquid cooler originally made for mobile phones to externally cool my Surface Pro 7.

    The Pro 7 with an i3 or i5 processor doesn’t come with a fan; it relies on heat conduction through a heatsink and its metal frame to cool itself (‘passive cooling’). This is problematic when the device is under sustained heavy load. In most computers, a fan overlying the CPU would speed up to counteract the higher temperatures. The Surface Pro does not have this luxury and therefore has no safety net against unsafely high temperatures. Because of this, its only option when things get too hot is to throttle the power to its own CPU. As the CPU is the engine that runs the laptop, this slows the machine down noticeably. To keep it from throttling, I wanted to do something to improve its ability to stay cool.
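    As a rough illustration (not part of my original testing), here is a minimal Python sketch, assuming the psutil package is installed, that logs the CPU clock under load so you can watch the throttling happen:

    ```python
    # Minimal sketch: log the CPU clock while the machine is under load to see
    # passive-cooling throttling in action. Requires psutil (pip install psutil);
    # note psutil.sensors_temperatures() is unavailable on Windows, so clock
    # speed is the practical thing to watch on a Surface Pro.

    import time
    import psutil

    def log_cpu_clock(minutes=6, interval_s=10):
        for _ in range(int(minutes * 60 / interval_s)):
            freq = psutil.cpu_freq()      # current/min/max clock in MHz
            load = psutil.cpu_percent()   # utilization since the last call
            print(f"{time.strftime('%H:%M:%S')}  {freq.current:7.0f} MHz  load {load:5.1f}%")
            time.sleep(interval_s)

    if __name__ == "__main__":
        # Run this alongside a benchmark; the clock sagging under sustained
        # load is the throttling described above.
        log_cpu_clock()
    ```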

    Methods

    I used a liquid cooler from AliExpress (linked above) that was intended for mobile phone gamers. After ripping off the clasp that holds one end of a phone down, it is perfectly suited to sit on the back of a laptop.

    The heatsink is most effective placed over the hottest point on the device. On a Surface Pro, this is the part of the case overlying the CPU (upper center-right if you are looking at the screen). The thermal image below supports my experience of this being true.

    Thermal image from https://www.windowscentral.com/surface-pro-7

    Ideally, some sort of thermally conductive paste would couple the metal case to the liquid cooler’s heatsink, but as anyone who has installed a CPU cooler can tell you, the stuff is gross and I don’t want it on my laptop. I found the heatsink did a satisfactory job simply resting against the case.

    Effectiveness

    The graph attached at the bottom uses PassMark’s PerformanceTest 10.1, specifically the CPU benchmark. The grey line shows what happens under typical conditions – after only 6 minutes of heavy use, the CPU benchmark score has dropped by ~35%.

    With the liquid cooler running during these serial benchmarks, the CPU was able to maintain close to full speed after several minutes of work. The raw data showed a drop from 9944.6 to 9330.6, only a ~6% decrease in benchmark score.
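    A crude stand-in for that kind of serial benchmarking (a Python sketch, not PassMark’s actual methodology) is to time a fixed CPU-bound workload repeatedly and report each run relative to the first, cool run:

    ```python
    # Crude stand-in for serial benchmark runs: time a fixed CPU-bound workload
    # repeatedly and report each run's score as a percentage of the first run.
    # The workload is arbitrary; only the relative drop over time matters.

    import time

    def workload():
        # A fixed chunk of integer math; takes a few seconds on a laptop CPU.
        return sum(i * i for i in range(20_000_000))

    def serial_benchmark(runs=12):
        baseline = None
        for n in range(1, runs + 1):
            start = time.perf_counter()
            workload()
            score = 1.0 / (time.perf_counter() - start)  # higher = faster
            baseline = baseline or score
            print(f"run {n:2d}: {100 * score / baseline:5.1f}% of first-run speed")

    if __name__ == "__main__":
        serial_benchmark()
    ```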

    If you are seeing this post, you’ve stumbled onto something I’m working on that is incomplete. I’ve published it anyway as I work on it.

    Graph comparing performance under normal conditions and liquid cooled conditions