Sunken Ships at Ironbottom Sound

 

Quote of the Day

Why the US Navy performed so poorly in sea battles during the early part of World War II:

  1. The United States failed to grasp that the killing weapon was the torpedo.
  2. The United States had no tactics suitable for night battle at close quarters.
  3. The United States was slow to learn. Because of the rapid turnover of tactical leaders, the pace of the battles overwhelmed the Americans.
  4. Above all, the United States did not exploit its potentially decisive radar advantage.

— My paraphrase of Capt. Wayne Hughes from his book Fleet Tactics.


Introduction

Figure 1: WW2 Ships Sunk in Ironbottom Sound. (Wikipedia)

I just finished watching a series of videos on the Guadalcanal Campaign by Drachinifel, whose work is superb (Figure 2). The marines derisively referred to this campaign as Operation Shoestring because of the resource limitations. Things were no better for the sailors. Unlike many WW2 island campaigns, more sailors died in the battles than ground troops (link). The Allies, and in particular the US Navy (USN), had to learn the hard way that the Imperial Japanese Navy (IJN) was a force that deserved respect. Many Allied ships were sunk while learning this lesson.

Ironbottom Sound was called Savo Sound before WW2, but the number of ships sunk there was so staggering that the sound is now commonly called Ironbottom Sound. This post will extract the names of the sunken ships from the image in Figure 1 using an Optical Character Recognition (OCR) tool. I will then create some tables that look at the losses by ship type and country of origin.

For those who like to follow along, my spreadsheet is here.

Background

Scope

I am looking at the ships listed in Figure 1. Some ships officially listed as lost at Guadalcanal are not shown in Figure 1. For example, the USS Benham is not shown but was seriously damaged on 15-Nov-1942 and ended up being sunk by Allied gunfire. The USS Jarvis and USS Juneau are also not shown. I do not know why, but their locations may not be known.

Battle of Savo Island

The video shown in Figure 2 provides a good overview of the Battle of Savo Island, which may be the worst loss in USN history and contributed four wrecks to Ironbottom Sound. The tale of this battle highlights many of the issues that the Allies would need to resolve quickly.

Figure 2: Good Video Briefing on Early Guadalcanal Campaign.

This battle also shows the importance of leadership. If you want to see an example of how not to lead a naval force, read about Captain Howard Bode.

Analysis

OCR

I used the online and free Convertio tool to generate a spreadsheet of all the text in Figure 1. I needed to delete the lightly colored geographic names from the conversion.

Pivot Tables

Once the OCR output was cleaned up, I augmented the image data with naval service and ship type information, and then made pivot tables of the result.
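For readers who prefer a scriptable route, here is a minimal Python sketch of the same kind of tabulation. The column names and the four sample rows are stand-ins for my spreadsheet, not the actual OCR output.

```python
import pandas as pd

# Stand-in sample of the cleaned wreck data; the real workbook has one row
# per wreck with the navy and ship type columns added by hand.
wrecks = pd.DataFrame(
    {
        "Ship": ["USS Quincy", "USS Vincennes", "HMAS Canberra", "Kirishima"],
        "Navy": ["USN", "USN", "RAN", "IJN"],
        "Type": ["Cruiser", "Cruiser", "Cruiser", "Battleship"],
    }
)

# Equivalents of the pivot tables: wreck counts by navy and by ship type.
by_navy = wrecks.pivot_table(index="Navy", values="Ship", aggfunc="count")
by_type = wrecks.pivot_table(index="Type", values="Ship", aggfunc="count")
print(by_navy, by_type, sep="\n\n")
```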

Wrecks By Nation/Service

Figure 3 shows the number of wrecks by the naval service. Notice how the USN took a real beating.

Figure 3: Ironbottom Sound Wrecks By Navy.

Wrecks By Ship Type

Figure 4 shows the wrecks by the type of ship. As far as the military ships go, destroyers and cruisers dominate the losses. I should mention the loss of the oiler Kanawha. The oilers were critical to the US Navy maintaining its tempo of operations. The US Navy early in WW2 was seriously short of oilers, particularly fleet oilers capable of Underway Replenishment (UNREP). For example, the loss of the fleet oiler USS Neosho at the Battle of Coral Sea was considered almost as serious as the loss of the carrier USS Lexington.

Figure 4: List of Ironbottom Sound Wrecks By Ship Type.

US Navy Wrecks By Type

Figure 5 shows the US Navy wrecks by type. Of the seven cruiser wrecks at Ironbottom Sound, six are from the US Navy and the seventh is Australian. The US cruisers really took a beating, particularly from torpedoes like the IJN Long Lance.

Figure 5: US Navy Wrecks By Type.

US Navy Cruiser Wrecks

Figure 6 shows the US Navy cruiser wrecks. One of the wrecks is the bow of the USS Minneapolis, which was separated from the rest of the ship by a Long Lance torpedo hit. The USS Minneapolis was repaired and served out the rest of the war.

Figure 6: US Navy Cruiser Losses.

Conclusion

While the Guadalcanal Campaign was a brutal battle, there were some lessons that probably could not be learned any other way than through enemy engagement. Here are some of the key things that the Allies had to learn:

  • Night tactics
    The IJN ruled the night in the early battles. The Allies had better radar but did not know how to properly use it.
  • Radar tactics
    Early radar was tricky to use for fire control, but properly used it was decisive in night battles. The Allies had a pioneer in their midst, Admiral Willis Lee, and needed to learn from him. He demonstrated his prowess during the Second Naval Battle of Guadalcanal.
  • Identify wartime leaders
    Wartime leaders are different from peacetime leaders and must be identified as soon as possible. This problem existed with both the naval and ground forces.
  • Develop tactics relevant to the time
    Night fighting, long-range torpedoes, and the need for multi-national naval forces required the Allies to develop new tactics.

I would argue that the USN, in particular, did not learn quickly. However, some would disagree (for example, Trent Hone in his excellent book Learning War).

Appendix

Excerpt from Battles That Changed American History: 100 of the Greatest Victories and Defeats by Spencer Tucker (link).

Figure 7: Tucker Excerpt.

Posted in Excel, History Through Spreadsheets, Naval History | Leave a comment

US Navy Ship Numbers Versus Time

 

Quote of the Day

The world always seems brighter when you've just made something that wasn't there before.

Neil Gaiman. Yup – this is what I love about engineering. I get to make new things all the time.


Introduction

Figure 1: Landing Craft, Vehicle, Personnel (LCVP). The most common type of USN vessel during WW2.

I have been researching some of the work of Alfred Thayer Mahan, a US Navy officer and strategist who was an influential voice in 19th-century military circles. He was a strong advocate for the role of sea power in securing economic opportunity (link). This research motivated me to determine how US Navy ship numbers have varied with time.

Mahan wrote when the US had a very weak navy and advocated for increasing the navy's number of capital ships (link) and foreign bases. Interestingly, Mahan was a rather poor ship captain and was involved in a number of ship collisions (link). I always find it interesting to encounter strategists who are not particularly good at low-level functions – yet we usually promote people based on their low-level performance. Similarly, Nimitz was a good strategist but, like Mahan, I would not want him conning my ship (link).

My reading and watching lectures on Mahan have motivated me to look at how the US Navy grew and shrank over the years. Fortunately, the Naval History and Heritage Command (NHHC) has an excellent page on the size of the US Navy over time. Unfortunately, the data is scattered throughout the page and must be scraped so that I can consolidate and graph it. This post is about using Power Query to scrape the data from the page and generate a graph in Excel of the number of active ships in the US Navy over time.

For those who like to follow along, my Excel workbook is here.

Background

Mahan Background

My analysis was motivated by a statement in this Youtube lecture on Mahan where John Kuehn mentioned that the US Navy in the late 1800s had fewer than 40 ships. I was surprised by that number and wanted to investigate further.

Figure 2: Good Mahan Video Briefing.

Analysis

Approach

The NHHC website is a bit odd in that it has broken up the US Navy ship number data into multiple tables that contain varying numbers of years' worth of ship type and count data. I could use R or Python to scrape the page, but I am in a hurry and Power Query grabs web tables very cleanly and quickly. Here is the process (a rough Python equivalent is sketched after the list):

  • Grab all the tables on the page.
  • Filter for the tables with ship number data.
  • Use a PQ function to grab the total active ship count for each year in each table.
  • Consolidate all the tables.
  • Generate the graph.
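Power Query is what I actually used, but the same steps can be roughly mirrored in Python with pandas. This is a minimal sketch, not my workbook: the NHHC page URL is a placeholder, and the filter that finds the ship-count tables assumes a "Total Active" row label, which would need to match the page's real table layout.

```python
import pandas as pd

# Placeholder for the NHHC "US Ship Force Levels" page URL.
url = "https://www.history.navy.mil/..."  # hypothetical placeholder

# Step 1: grab every HTML table on the page.
tables = pd.read_html(url)

# Step 2: keep only tables that look like ship-count tables
# (assumed here to have a "Total Active" label in the first column).
ship_tables = [
    t for t in tables
    if t.iloc[:, 0].astype(str).str.contains("Total Active", case=False).any()
]

# Step 3: pull the "Total Active" row from each table; the remaining
# columns are assumed to be years.
records = []
for t in ship_tables:
    row = t[t.iloc[:, 0].astype(str).str.contains("Total Active", case=False)].iloc[0]
    for year, count in row.iloc[1:].items():
        records.append({"Year": year, "TotalActive": count})

# Step 4: consolidate and plot.
df = pd.DataFrame(records)
df.plot(x="Year", y="TotalActive", title="US Navy Total Active Ships vs Time")
```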

Graph

Figure 3 shows my plot of the scraped ship number data.

Figure 3: US Navy Total Active Ships Versus Time.

Observations

Here is what the graph tells me:

  • The US Navy ship count at the end of WW2 is amazing.
  • The US Navy ship count two years after the end of WW2 was equally amazing.
  • The most common WW2 ship type was "amphibious" and the most common amphibious ship was a Landing Craft, Vehicle, Personnel (LCVP), which makes sense. The LCVP is shown in Figure 1.
  • The US Navy had a small increase in ship numbers during WW1, but US participation was so short that many of those ships were not finished by war's end.
  • The current US Navy ship count is actually relatively low.

Conclusion

I am going to use this as an exercise for Excel students because I was able to solve a relatively tough problem very quickly (I had data in about 10 minutes). The problem provides students with experience in:

  • Power Query web scraping
  • Cleaning up web tables
  • Writing a Power Query M-function
  • Generating a graph from a table
Posted in Excel, Military History | Leave a comment

Weed and Feed Math

 

Quote of the Day

Fabius used to say that the basest excuse for a commanding officer is 'I didn't think it would happen,' but I say it's the basest for anyone.

Seneca, from his book 'Of Anger.' My management failures were all from lack of imagination – I didn't think it would happen.


Introduction

Figure 1: Weed and Feed Fertilizer Bag I Found.

Two weeks ago, the grass around my garage looked pretty scraggly and weed-infested. I am not very knowledgeable about lawns and lawn care, so I did some online research, which is summarized in this post. Yes, it was time for some fertilizer and weed-killer.

Since the nearest lawn care store is 40 miles away and the COVID delta variant is active, I decided to dig around my garage for my old yard supplies and found a bag of a common weed and feed product that I had purchased years ago but never applied (Figure 1). There are pros and cons to this product, but it is here and easy to use.

This post will use US customary units because the product and my lawn are all specified in customary units. The bag is rated to cover 15,000 square feet of lawn. I have about 10,000 square feet of garage lawn, so I have enough for one application.

All my analysis was performed using the Maple Flow computer algebra system, which I am trying out for my engineering work.

Background

Reference

I decided to use the following sources for my information on fertilizers:

  • North Carolina Department of Agriculture and Consumer Services (link)
  • University of Georgia Extension Service (link)
  • Texas A&M Extension (link)

Their information is well-laid out and easy to use.

Definitions

NPK Numbers
Every fertilizer bag has a string of three numbers on it (e.g. "28-0-3"), which refers to the nitrogen (N)-phosphate (P)-potash (K) percentages by weight. The nitrogen is usually supplied by the chemical urea (CH4N2O). Phosphate (P2O5) and potash (K2O) supply phosphorus (chemical symbol P) and potassium (chemical symbol K), respectively.

This is actually an inconsistent way to express these values because the numbers mix an elemental percentage (nitrogen) with compound percentages (phosphate and potash). This means you have a bit of stoichiometry to do if you want to determine the amount of elemental phosphorus and potassium spread (see the sketch below). However, folks usually worry about nitrogen, so it makes sense to express that percentage directly.
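If you do want the elemental numbers, the stoichiometry reduces to two constant factors: phosphorus is about 43.6% of P2O5 by weight and potassium is about 83% of K2O (both follow from the molar masses). Here is a small Python sketch that converts an N-P-K label; the 28-0-3 label is the product from Figure 1.

```python
# Elemental fractions from molar masses: P2O5 is ~43.6% P, K2O is ~83% K.
P_IN_P2O5 = 0.436
K_IN_K2O = 0.830

def elemental_npk(label: str):
    """Convert an N-P-K bag label (e.g. '28-0-3') to elemental N, P, K weight percents."""
    n, p2o5, k2o = (float(x) for x in label.split("-"))
    return n, p2o5 * P_IN_P2O5, k2o * K_IN_K2O

print(elemental_npk("28-0-3"))  # (28.0, 0.0, 2.49): 3% potash is ~2.5% elemental potassium
```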

Parameters

There are three parameters that I want to compute.

Fertilizer Rate [lb/1000 sq. ft.]
The weight of fertilizer (lbs) applied per 1000 square feet of lawn.
Nitrogen Rate [lb/1000 sq. ft.]
The weight of nitrogen (lbs) applied per 1000 square feet of lawn.
Fertilizer Spreader Setting [unitless]
The setting on my fertilizer spreader corresponding to my desired fertilizer rate.

Analysis

Fertilizer Rate

The fertilizer rate can be computed as shown in Figure 2.

Figure 2: Fertilizer Rate Calculation.

Nitrogen Content

Texas A&M says that you should never apply more than 1 lb of nitrogen per 1000 square feet of lawn. Figure 3 shows my analysis of the Scotts Weed and Feed product to determine the amount of nitrogen that will be spread according to their instructions. My numbers show that 0.8 lbs of nitrogen will be spread per 1000 square feet of lawn, which is reasonable.

Figure 3: Nitrogen Density.
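As a quick cross-check of Figure 3, the nitrogen rate is just the fertilizer rate multiplied by the nitrogen fraction from the 28-0-3 label. The 2.858 lb per 1000 square feet figure is the fertilizer rate I use later when setting the spreader.

```python
fertilizer_rate = 2.858   # lb of product per 1000 sq ft (from the rate calculation)
nitrogen_fraction = 0.28  # the "28" in the 28-0-3 label

nitrogen_rate = fertilizer_rate * nitrogen_fraction
print(f"{nitrogen_rate:.2f} lb N per 1000 sq ft")  # ~0.80, under the Texas A&M 1 lb limit
```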

Ideally, I would have performed a nitrogen test on my lawn to determine exactly how much nitrogen I need to apply. As usual, I am in a hurry, and the testing probably isn't worth it on a lawn that is clearly low on nitrogen (my grass is thin and has many yellow spots).

Dispenser Calibration

I use the fertilizer spreader shown in Figure 4 (Scotts EdgeGuard DLX broadcast spreader). The rate of fertilizer spread is controlled through a dial that is numbered from 2 to 15 around its circumference.

Figure 4: My Fertilizer Spreader.

I needed to calibrate the dial for my fertilizer to determine the correct setting for the fertilizer I am applying. I decided to apply fertilizer on three measured patches of lawn at three different settings to determine the rate versus dial setting. My test process is simple; for each test I:

  • Measured the patch of grass to get its area
  • Put fertilizer in the spreader and weighed the whole thing
  • Spread the fertilizer at the specified test setting over the test area
  • Weighed the spreader and fertilizer afterward to determine the amount of fertilizer spread
  • Recorded the weight of fertilizer spread and the test area, and calculated the fertilizer rate.

My test data is shown in Figure 5. I started my testing with the setting S=6 because of an online recommendation. I found that setting gave me too high a fertilizer rate.

Figure 5: Measured Pounds Per 1000 Square Foot.

I performed the linear regression shown in Figure 6. It looks like the relationship between the fertilizer rate R [lbs/1000 sq. ft.] and the spreader dial setting S is shown in Equation 1.

Eq. 1 $latex \displaystyle R=0.665\cdot S$

where

  • R is the rate of fertilizer spread [lbs/1000 ft2]
  • S is the spreader dial setting [unitless]
  • I am ignoring the intercept because it is small

Figure 6 shows the graph of my data.

Figure 6: Fertilizer Calibration Test Regression.
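For anyone who wants to reproduce the fit outside of a spreadsheet, here is a minimal Python sketch of a through-the-origin least-squares fit. The three (setting, rate) pairs are made-up stand-ins for my measurements in Figure 5; with the real data, the slope comes out near 0.665.

```python
import numpy as np

# Hypothetical (dial setting, measured lb per 1000 sq ft) pairs;
# substitute the actual calibration measurements from Figure 5.
S = np.array([4.0, 5.0, 6.0])
R = np.array([2.7, 3.3, 4.0])

# Least-squares slope for the no-intercept model R = k * S.
k = float(np.linalg.lstsq(S.reshape(-1, 1), R, rcond=None)[0][0])

target_rate = 2.858  # lb per 1000 sq ft, from the fertilizer rate calculation
print(f"slope = {k:.3f}, spreader setting = {target_rate / k:.1f}")
```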

Application Pattern

I spread the fertilizer two weeks ago and I can see stripes in my lawn, which indicate inconsistent spreading. Next time, I will cut the application rate in half and spread the fertilizer in a criss-cross pattern (2 passes).

Conclusion

Based on my test results, I needed to set my spreader to S=2.858/0.665=4.3. I applied the fertilizer 2 weeks ago. My grass has really greened up and the weed situation is better, but there is still a weed problem. I will need to focus on this for a while.

PS

My wife watched a Youtube yard care channel (I don't know which one), and their calibration setting for the same spreader and fertilizer was 4.4. So I think my results are at least partially confirmed.

Posted in Cabin | 3 Comments

Lightning Damage at My Garage

 

Quote of the Day

When you counsel someone, you should appear to be reminding him of something he had forgotten, not of the light he was unable to see.

Baltasar Gracian, Spanish philosopher, on how to give advice.


Figure 1: Burn Residue.

During a major storm two weeks ago with 80 mile per hour winds (129 kph) and heavy rain, a lightning strike near my garage destroyed two TVs, a power adapter for my robot lawnmowers, a wireless router, two data switches, and my furnace's propane tank regulator and copper feedline. As an illustration of the damage, Figure 1 shows the residue left from my mower power adapter exploding. The light region inside the smudge is from my fingers wiping across the smudge to see how thick the film was. The film was removed using mineral spirits. A small burn mark is all that remains.

For those who wonder, I have all sorts of lightning protection on everything, including a dielectric separator between the propane tank and the furnace. This saved quite a bit of my electrical equipment, but not everything.

I am in the process of fixing everything. Because the power adapter is so damaged and a number of my circuit breakers tripped, I am guessing the surge came in through the garage power system. My cabin also suffered some damage because its electrical system is connected to the garage over a 120 foot (37 meter) cable.

Figure 2 shows the blown-up power adapter. I found the cover 13 feet (4 meters) away in the woods. The copper feed line to my propane tank turned black and developed a major leak, which allowed all the propane to escape into the soil. The smell of the odorant in the propane (ethyl mercaptan) is noticeable from the soil. I included a 1-2-3 block in Figures 1 and 2 for a size reference.

Figure 2: Blown Power Adapter.

Posted in Cabin | 2 Comments

Excel Spillable Ranges are Great!

 

Quote of the Day

I begin to speak only when I'm certain what I'll say isn't better left unsaid.

Cato the Younger. I try to keep this quote in mind during meetings.


Figure 1: Toy Data Set With Same Format as My Work Example.

I use Python or R for my large-scale data work, but I do find Excel a very powerful ad hoc data analysis tool, particularly with some of the new functions that use spillable ranges. Today, I was given a large table of Engineering Change Orders (ECOs) and a comma-separated list of the documents each ECO affected (a very abbreviated form is shown in Figure 1). I needed to generate a count of the number of times each document was modified by an ECO, and I needed that count for a meeting coming up in a few minutes. My approach can be described as follows (a rough Python equivalent is sketched after the list):

    • Use TEXTJOIN to combine all the comma-separated lists into one long comma-separated list.
    • Use FILTERXML to break up the combined comma-separated list into a column with one document per row.
    • Use UNIQUE to generate a list of the unique documents that were changed.
    • Use COUNTIF to count the number of times each individual document was changed.
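For comparison, here is what the same count looks like outside of Excel. This is just an illustrative Python sketch with made-up ECO data, not the workbook shown in Figure 2.

```python
from collections import Counter

# Made-up ECO data: each entry is the comma-separated list of documents
# affected by one ECO (the same format as the toy data in Figure 1).
eco_documents = [
    "DOC-100, DOC-101",
    "DOC-101, DOC-205",
    "DOC-100",
]

# Split each list, strip whitespace, and count occurrences per document.
counts = Counter(
    doc.strip()
    for doc_list in eco_documents
    for doc in doc_list.split(",")
)
for doc, n in counts.most_common():
    print(doc, n)
```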

Figure 2 shows a screenshot of my Excel workbook. For those who like to follow along, I include my workbook here.

Figure 2: Functions Applied to Generate Document Change Counts.

Posted in Excel | Leave a comment

Effect of Lumber Price Rise on House Cost

 

Quote of the Day

Great things are not done by impulse, but by a series of small things brought together.

Vincent van Gogh


Introduction

Lumber Prices

Figure 1: Lumber Prices Versus Time (NASDAQ data, My Graph)

I recently went into a home center and was shocked to see that a 2-in x 4-in stud that used to cost $2.50 now costs $11, and a 4-foot x 8-foot sheet of plywood that used to cost $25 now costs $100. These prices were so high that I decided to delay building a shed until the prices had dropped to more sane levels.

Unfortunately, a perfect storm of COVID panic and bizarre trade policies has caused a massive rise in US lumber prices (Figure 1). Lumber prices are typically around $300 per 1000 board-feet (units discussed below). These prices recently peaked at over $1700 per 1000 board-feet.

What impact did these high lumber prices have on the price of a home? Time to do a little math …

For this post, I will be performing my calculations using a new tool called Maple Flow. I want to give it a test drive.

Background

Yet Another Apology for Units

While my raw US Forest Service Wood Product data is metric, the commodity price data is in US Customary units. Because this post is focused on the US housing market, my final results will be in a form that US readers can use directly.

Definitions

Board-Feet
Wood volume in the US is usually expressed in board-feet, which has units of ft2·in. There are 423.8 board-feet in a cubic meter of wood.
House Area
House area in the US is usually expressed in square feet, which has units of ft2.

Wood Usage Per Home Data

All of my calculations will be based on 2019 data from this US Forest Service document, where Table 3 contains this information.

Figure 2: US Forest Service Wood for Housing Data.

I am only interested in the single-family house data. Here is the information critical to my analysis (Figure 2):

  • Floor Area Per Unit: ASFH := 234.5 m2 (2524 ft2)
  • Wood Volume Per Unit: VWood := 44.0 m3 (18.65E3 board-feet/home)

Analysis

Wood Cost Increase

Figure 1 provides the following information:

  • Maximum wood cost: $1,733.50 per 1000 board-feet
  • Average wood cost (pre-COVID): $392.81 per 1000 board-feet (data mean before 1-Jan-2021)

The price rise in wood products can be computed as shown in Equation 1.

Eq. 1 $latex \displaystyle \Delta W=\$1733.50-\$392.81=\$1,340.69 \text{ per 1000 board-feet}$

For those who like to follow along, my Excel workbook of historical lumber data is available here.

Mean Board-Feet Per Square Foot of House

Equation 2 shows how we can calculate the mean board-feet per square foot of home (kBdFtPerSqFt). This number includes both lumber and sheet goods. I should mention that a quick Google search will give you a rough number of 6.3 board-feet per square foot of house area (example). I find this number to be a bit low.

Eq. 2 $latex \displaystyle k_{BdFtPerSqFt}=\frac{{{{V}_{{Wood}}}}}{{{{A}_{{SFH}}}}}=\frac{{18.65E3}}{{2524}}=7.39\frac{{\text{bdft}}}{{\text{f}{{\text{t}}^{2}}}}$

Estimate Home Cost Increase at the Peak of the Wood Cost Boom

Assuming that houses built in 2019 and 2021 are similar, we can now compute the impact of the recent wood cost increase on the price of a typical home using Equation 3.

Eq. 3 $latex \displaystyle \Delta C=\Delta W\cdot k_{BdFtPerSqFt}\cdot {{A}_{{SFH}}}=\frac{1,340.69}{1000}\cdot 7.39\cdot 2524=24,999\ \text{USD}$
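Here is a short Python version of the whole chain of calculations using the values quoted above; the small difference from the rounded figures in the equations is just rounding, and my actual calculations were done in Maple Flow.

```python
# Inputs quoted earlier in the post.
peak_price = 1733.50          # USD per 1000 board-feet
pre_covid_price = 392.81      # USD per 1000 board-feet
wood_per_home_bdft = 18.65e3  # board-feet of wood per single-family home
home_area_sqft = 2524         # square feet per single-family home

delta_w = peak_price - pre_covid_price                 # Eq. 1: price rise per 1000 bd-ft
k_bdft_per_sqft = wood_per_home_bdft / home_area_sqft  # Eq. 2: bd-ft per sq ft of house
delta_c = (delta_w / 1000) * wood_per_home_bdft        # Eq. 3: cost increase per home

print(f"dW = {delta_w:.2f} USD per 1000 bd-ft")
print(f"k  = {k_bdft_per_sqft:.2f} bd-ft per sq ft")
print(f"dC = {delta_c:,.0f} USD per home")
```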

Here is a screenshot of all my calculations in Maple Flow.

Figure 3: Maple Flow Version of Calculations.

Conclusion

At the peak of the recent wood cost increase, the cost of a typical home increased by about $25K for wood alone. I should point out that other materials, such as drywall, have also increased in price. Fortunately, we are seeing wood prices decrease rapidly as more supply comes online.

I feel bad for the young people trying to buy a house in recent years. Between the Great Recession and COVID, young people entering both the housing and job markets have been subjected to terrible uncertainty.

Posted in Construction | 3 Comments

Our House Design Process

 

Quote of the Day

Find out who you are and do it on purpose.

— Dolly Parton


Introduction

Cabin Today

Figure 1: Cabin Today. (Source: Dawn Biegert)

A number of people have asked questions about the process we used to design and build our cabin in northern Minnesota (Figure 1), a build that I have discussed in other posts (link, link). The questions center on the tools we used, what the build process was like, and things to watch for. The following post examines our design and build process from the standpoint of what went well and not so well.

Overall, I was very happy with how things turned out. We have been in the house now for 3 years and have had no major issues. The minor issues were mainly simple screw pops in the drywall and some drywall seams that separated. We also had some outlets that were in the wrong places and I had to move them. Normally, we would have caught the outlet position issues during construction, but we were not on-site often enough.

While things went well, I don't think we would build another custom home. It was too much work – one was enough.

Design

High-Level Design

We had some basic requirements on the structure:

  • Comply with Local Building Codes
    There were a large number of building codes related to the structure being close to the lake. One of the main constraints was that we could only grow the footprint of the old structure by 50%. To get enough area for our children and grandchildren, we needed to add a second floor. The footprint of the house was also limited by building restrictions on how close the house could be to the lake and a nearby road. The house placement was highly constrained.
  • Handicapped-Accessible First Floor
    We have handicapped friends whom we want to be able to use our home comfortably, and we also wanted to plan for our own needs when we are old and want to continue living here. We decided on a home with:

    • no basement
    • laundry, kitchen, and a curbless shower on the first floor
    • stepless entrances/exits
    • provisions for adding a home elevator for second-floor access if that becomes necessary

    Because of the house size restrictions, we also decided to add a large garage on another lot that would provide a shop and an entertainment area. Little did we know that handicapped accessibility would be important right away: my wife suffered a serious injury and was laid up for the better part of a year (she is fine now).

Concept Development

My wife spent years collecting pictures of homes that she thought would fit our north woods lifestyle. She put together a collection of ~100 home designs with features that she liked: lighting ideas like wall lights or pendant lights, carpet colors, furniture woods, kitchen layouts – everything under the sun. Eventually, she found a plan for a boathouse that became her inspiration for the overall look of the house.

Detailed Design

Designing this house was a lifetime dream for my wife, and she started with some simple hand drawings (she is a skilled hand draftsman). However, she had a large number of alternatives she wanted to evaluate, and hand drawing proved to be just too slow.

Figure 2: Chief Architect Home Designer Package.

To speed her ability to try different options, we decided to move her from hand drawing to a 3D modeling program. This worked really well. For house design, we ended up choosing Home Designer by Chief Architect (Figure 2). It was easy to use, had excellent Youtube training videos, and had an excellent library of house components directly from the manufacturers (e.g. windows) that provided realistic renderings of her house concepts. I think we spent ~$300. I captured the initial draft of the house based on the original boathouse concept. My wife then did everything else.

One thing that we did miss during the design phase was allocating enough space for storing snow while snowplowing. While this issue is not directly about the house, it has become a big issue because the snow plower has to push snow up onto the lawn. The plowed snow contains rock and other debris from our driveway, which is then deposited on our lawn every spring when the piles of snow melt. He also damages the lawn during the plowing operation. We are getting ready to buy a large snowblower so that we can blow the snow over the whole yard and spread out the problem, but it would have been simpler to have left clear areas for the snowplow to push the snow into.

Reality Phase

Builder Selection

Our location deep in the woods limited our builder options. We were fortunate to find Mark Utecht, a builder from the Twin Cities (Minneapolis/St. Paul) who had decided to move to the north woods. He had extensive experience building custom homes similar to ours and we liked his commitment to quality.

Figure 3: Pettibone (Source: Pettibone).

I worked on construction sites as a drywall installer during high school and I saw quite a few carpenters. Utecht may be the best hammer and nail man I have ever seen. While he contracted out the electrical, HVAC/plumbing, concrete, and roofing tasks, he did everything else himself. He was interesting to watch because he was a master of working alone and was a magician with a Pettibone (Figure 3). Every motion was well-practiced and efficient. I love to watch a master in action.

To simplify getting cost estimates, Utecht requested that we take our plans to a draftsman who would check what Chief Architect generated for code compliance, generate drawings in a format that Utecht was used to, and create a parts database for cost estimation. This cost $1500 and I think it was worth it as we began to iterate on costs.

Like most custom home builds, the initial cost estimates were far over our budget. Utecht was experienced at getting the most value for the dollar and he worked with my wife and the draftsman to hit our cost targets. This activity took a couple of months.

Construction Phase

Building the structure took about six months. Overall, there was relatively little drama during the construction. However, there were some miscommunications that would normally have been easy to deal with if my wife could have been on-site every day. We live 200 miles from the construction site, so daily inspections were out of the question. Also, there is no cellular service or Internet at the site.

Fortunately, I was able to get a DSL line to the site and I set up cameras for remote monitoring. This helped, and we were able to correct some errors from long distance, but remote monitoring was not adequate. Some painful rework ended up being needed.

Overall Assessment

Table 1 shows the scores that represent my feelings about the build and how it went.

Table 1: Cabin Build Scorecard.

  • General Contractor (Quality: A, Conformance to Plan: A-) – Very easy to work with. My only regret is that he did not put in a sump basket after the HVAC guy advised him to. This has been annoying because the lift pump (for combustion condensate) has been failure-prone.
  • HVAC/Plumbing (Quality: A, Conformance to Plan: A) – No issues of any sort.
  • Sheetrocking (Quality: B, Conformance to Plan: A) – Lots of screw pops started right away. A number of edges have experienced paper separation. I am able to repair these.
  • Painting (Quality: B, Conformance to Plan: A) – Some areas were not smooth, with brush strokes visible.
  • Electrical (Quality: A, Conformance to Plan: B) – Did not follow the instructions for placing a number of outlets. This has been annoying and I had to move a number of outlets.
  • Concrete (Quality: A, Conformance to Plan: A) – The house is built on a heated slab. It has provided us very comfortable heating. We also have some outdoor stamped concrete work that we really like.
  • 3D Design Software (Quality: B+, Conformance to Plan: N/A) – The software was critical to getting the home we wanted. The software made no calculation mistakes but did have some annoying bugs. Nothing earth-shattering, but annoying.
Posted in Cabin, Construction | Leave a comment

Ben Franklin and the Gulf Stream

 

Quote of the Day

One trains one's imagination to go visiting.

Hannah Arendt


Figure 1: Ben Franklin. (Wikipedia)

One of my favorite Youtube channels is Wind Hippie Sailing. The Wind Hippie is a free spirit who does not worry about little details like currents and wind – this is the charm of her channel.  While sailing from North Carolina to Puerto Rico, she soon learned the power of the Gulf Stream when she discovered that she was making very slow (maybe negative) progress. Fortunately, she was able to get out of that situation. I was worried she might end up somewhere dangerous, like a reef.

Her predicament reminded me of a story that Sister Mary Agnes told our fifth-grade class about Ben Franklin (Figure 1). She told us that Ben was the first person to publish a chart of the Gulf Stream. Ben's Gulf Stream story began during a visit to England where he heard tales of ships taking much longer to cross from the UK to the US than from the US to the UK. He became curious and upon his return to the US contacted a US whaling captain, Timothy Folger, who was also his distant cousin. Whalers were famous for collecting all sorts of ocean data. Captain Folger confirmed what Ben had heard in England and told Ben what the whalers knew about the Atlantic currents. He showed Ben the path of the current and even how to avoid it. Ben and Captain Folger put together a chart that Ben sent to a contact in England, who then printed it.

I remember as a young boy being impressed by Ben's curiosity and approach – I still am. For those who want to know more about the Franklin-Folger chart of the Gulf Stream, check out this document.

Posted in History of Science and Technology, Naval History, Navigation | 4 Comments

US Daylight Saving Time Date Calculation in Excel

 

Quote of the Day

There is a wonderful, almost mystical, law of nature that says three of the things we want most—happiness, freedom, and peace of mind—are always attained when we give them to others. Give it away to get it back.

— Basketball coach John Wooden


Introduction

Figure 1: US Daylight Saving Time Rule: Spring Ahead, Fall Back. The clocks change at 2 AM Sunday morning. (Source: Me)

I recently had a situation where I needed to correct a number of date/time values because they did not take into account Daylight Saving Time (DST). To be specific, some transactions from China were recorded assuming a fixed time offset with respect to US Central Standard Time. Because of DST, this is not always the case. My customer only works in Excel, so the work was done in Excel.

To perform this correction, I need to know if the date/time values occurred during the DST date ranges for those years. As discussed below, the US adopted its current DST assignments in 2005.  I only need to calculate the DST correction for dates after 2005, so I will not worry about earlier years.

Background

Date Definitions

The 2005 Energy Policy Act defined the dates of DST to be:

  • Start DST: Second Sunday of March.
  • End DST: First Sunday in November.

Excel Dates

Solving this problem requires two Excel date functions.

DATE(year, month, date)
This function computes an Excel date object, which is an integer serial number where 1 corresponds to January 1st, 1900.
WEEKDAY(date object)
This function takes a date object and computes an integer that corresponds to the day of the week, where the default numbering is Sunday = 1, Monday = 2, … , Saturday = 7.

Analysis

Date Calculation Approach

I will illustrate the calculation using the 2021 Spring DST date calculation.

  • Create a date object for the latest possible spring DST date (14-March-2021)
    • If March 1st is a Monday, the second Sunday in March is on the 14th
    • Excel Calculation: latest = DATE(2021, 3, 14)
  • Compute the offset of 14-March-2021 from the previous Sunday
    • This is simply the weekday number of 14-March-2021  minus 1
    • offset = WEEKDAY(latest) - 1  (Sunday = 0, Monday = 1, …, Saturday = 6)
  • SpringDST = latest - offset = DATE(2021, 3, 14) - (WEEKDAY(DATE(2021,3,14))-1)
    =14-Mar-2021

The same logic can be used to give us the Fall DST date by substituting in the latest possible first Sunday in November: DATE(2021, 11, 7) - (WEEKDAY(DATE(2021, 11, 7)) - 1) = 7-Nov-2021.
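For an independent check of the Excel results, here is a small Python sketch of the same rule (second Sunday of March, first Sunday of November). It mirrors the latest-possible-date logic above rather than the workbook formulas themselves.

```python
from datetime import date, timedelta

def us_dst_dates(year: int) -> tuple[date, date]:
    """Return (DST start, DST end) for a year under the post-2005 US rule."""
    # Latest possible second Sunday of March is the 14th;
    # latest possible first Sunday of November is the 7th.
    spring_latest = date(year, 3, 14)
    fall_latest = date(year, 11, 7)
    # date.weekday(): Monday = 0 ... Sunday = 6, so the step back to the
    # previous (or same) Sunday is (weekday + 1) % 7 days.
    spring = spring_latest - timedelta(days=(spring_latest.weekday() + 1) % 7)
    fall = fall_latest - timedelta(days=(fall_latest.weekday() + 1) % 7)
    return spring, fall

print(us_dst_dates(2021))  # (datetime.date(2021, 3, 14), datetime.date(2021, 11, 7))
```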

Excel Workbook Example

My Excel workbook can be found here. I used it to create the following tables (Figures 2 and 3) of Spring and Fall DST dates, which agree with a similar online table.

Figure 2: Spring DST Dates.

Figure 3: Fall DST Dates.

Posted in Excel | 3 Comments

Taffy 3 Total Displacement vs Yamato Using R

 

Quote of the Day

Rome wasn’t built in a day, but they were laying bricks every hour. You don’t have to do it all today. Just lay a brick.

James Clear.


Figure 1: IJN Yamato, arguably the most powerful battleship of WW2 (along with its sister Musashi).

I was watching a documentary about the Battle off Samar on my favorite Youtube naval channel, Drachinifel. During this show, Drachinifel stated that the battleship Yamato displaced more tonnage than the entire Task Group 77.4.3 (call sign Taffy 3) that it was attacking. I found this a remarkable statement and one that I could verify using a little bit of web scraping. Because one of the students I tutor uses R, I thought this would be a good exercise to implement using R and Rmarkdown (a great report generation tool).

Background

Battle Details

On 25-Oct-1944, a large Japanese surface fleet consisting of four battleships, six heavy cruisers, two light cruisers, and eleven destroyers attacked a greatly overmatched US force that was protecting a Philippine beachhead. Remarkably, the US force was able to turn back the attack with a defense no less remarkable than David versus Goliath. For example, you cannot read about the USS Johnston and fail to be in awe of the actions of that ship.

The best way to learn about the Battle off Samar is to watch the Drachinifel video shown below. His research coupled with a dry sense of humor makes the viewing enjoyable.

Figure 2: Drachinifel Video on the Battle off Samar.

Web Scraping

Here is the process I used to sum the displacements of the Taffy 3 ships (a rough Python sketch of the scraping step follows the list):

  • Gather the list of Taffy 3 ships from the US Naval History and Heritage Command.
  • Create Wikipedia links from the ship names.
    Wikipedia has a standard format for converting a ship name to a hyperlink, so this involved just concatenating the correct prefix.
  • Using the rvest package in R, grab the ship class and displacement data from Wikipedia.
  • Correct some obvious Wikipedia errors.
    As a person who has authored a number of Wikipedia pages, I know how hard it is to get all the details correct, especially for WW2 ships because sources often differ. In this case, the problems had to do with different authors using different ship displacement definitions. For those interested, Wikipedia normally uses standard displacement. Unfortunately, the editors do not always enforce that convention. The errors were obvious because different ships of the same class had radically different displacements, and I corrected them manually using data from the US Navy's Naval History and Heritage Command.
  • Using the kable function, create a table of Taffy 3 ships, their individual displacements, and their total displacement.
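My actual implementation is the R/rvest file linked below, but the scraping step looks roughly like this in Python. The two ship names are stand-ins for the full NHHC list, and the infobox parsing is deliberately loose; Wikipedia infobox layouts vary, which is why the manual cleanup described above is still needed.

```python
import requests
from bs4 import BeautifulSoup

# Two Taffy 3 ships as stand-ins; the full list comes from the NHHC order of battle.
ships = ["USS Fanshaw Bay", "USS Johnston (DD-557)"]

def displacement_text(ship: str) -> str:
    """Fetch a ship's Wikipedia page and return the infobox row that mentions displacement."""
    url = "https://en.wikipedia.org/wiki/" + ship.replace(" ", "_")
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    infobox = soup.find("table", class_="infobox")
    for row in infobox.find_all("tr"):
        if "Displacement" in row.get_text():
            return row.get_text(" ", strip=True)
    return "Displacement row not found"

for ship in ships:
    print(ship, "->", displacement_text(ship))
```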

I am including my R file and a figure in this zip file for those who like to follow along. I include a PDF of my Rmarkdown file here.

Analysis

Figure 3 shows a screenshot of the key result from this web scraping exercise. The total displacement of Taffy 3 is just under 61K long tons, which is well under Yamato's 64K long ton displacement.

Figure 3: Taffy 3 Ships, Classes, and Displacements.

Taffy 3 at Samar took very heavy casualties, with five ships being sunk:

  • USS Gambier Bay (escort carrier)
  • USS St. Lo (escort carrier, lost to a kamikaze attack later that day)
  • USS Hoel (destroyer)
  • USS Johnston (destroyer)
  • USS Samuel B. Roberts (destroyer escort)

The Japanese Center Force also took heavy casualties with three heavy cruisers sunk.

Posted in History Through Spreadsheets, Naval History | 2 Comments