May is the month of flowers. Blooms burst in every color, painting landscapes in bright contrast to April’s gray skies. But here in Minnesota, we’re just as proud of another kind of flour — the kind that helped build an industry and shape a city.

In celebration of the season, we’re looking back at the history of flour milling — from its ancient roots to its peak in the Twin Cities, and where the industry stands today.

First Tools, First Grains

Humans and our ancestors have been making tools for more than two million years, but those early creations were mostly for hunting and survival. It wasn't until around 10,000 to 15,000 years ago that we turned our focus to agriculture.

Grain, unlike meat or produce, was easier to store and transport. That made it perfect for trade — and perfect for early cities.

The trick was in the milling.

To make grain digestible, early societies learned to grind it using stones. Even 6,700 years ago, people were milling wheat between stones to remove the bran and germ, leaving the endosperm to become flour.

Early Innovations in Milling

  • Ancient Egyptians used saddle stones
  • Greeks developed hopper-fed “hourglass mills”
  • Romans introduced water power around 100 B.C.

Through the centuries, mills improved by harnessing new sources of energy — from humans and animals to windmills and waterwheels. Sifting systems became more advanced. By the 19th century, mills were adopting gears, belts, and roller systems to move grain faster and produce purer flour.

One key figure in this shift was American inventor Oliver Evans, who designed the first continuous milling system in the 1780s. His work introduced bucket elevators, screw conveyors, and sifters into a single seamless process — the first real automation of its kind.

Milling Moves to the Midwest

As the U.S. expanded westward, so did its agricultural and industrial base. With new rail lines, barge access, and cheap land for growing wheat, the center of U.S. flour production migrated west.

By the late 1800s, Minneapolis had all the ingredients to become the new flour capital:

  • Proximity to wheat-growing regions
  • Reliable river power
  • Rapid rail expansion
  • A workforce hungry for opportunity

At the same time, a “New Process” of milling was changing the game. It used Canadian hard wheat, milled slowly between wider-spaced stones, to produce better flour more efficiently.

Edmund La Croix and the Minnesota Advantage

One of the biggest breakthroughs in modern milling came from Minnesotan Edmund La Croix, who invented the middlings purifier in 1865.

His innovation separated the finest parts of the wheat more effectively, dramatically improving flour quality. It helped Minneapolis mills produce flour that could compete with — and beat — European brands in quality.

By 1870, the average mill could extract about 72% flour from its grain, with the remaining 28% going to millfeed. Milling had become one of the first fully automated industries.

The Rise of the “Mill City”

By 1880, Minneapolis had overtaken St. Louis as the nation’s top flour producer. In that year alone, the city produced 2 million barrels. By 1910, that number had climbed to 15.4 million barrels, earning Minneapolis the title “Flour-Milling Capital of the World.”

World War I drove even more demand. In 1916, Minneapolis mills produced 18.5 million barrels, more than 20% of all U.S. flour.

Three companies dominated:

  • Washburn-Crosby (Gold Medal Flour)
  • Pillsbury
  • Northwestern Consolidated Milling

Pillsbury’s “A” Mill was the largest in the world, producing 12,000 barrels per day.

By 1928, Washburn-Crosby had become General Mills, and in 2001, it acquired Pillsbury — uniting Minnesota’s two biggest flour producers under one roof.

Flour Today: Global Competition, Local Legacy

While Minneapolis is no longer the flour capital, its influence remains. The Washburn A Mill was destroyed in an explosion in 1878, rebuilt, and later gutted by fire; its ruins still stand today as part of the Mill City Museum, complete with the iconic Gold Medal Flour sign.

Globally, countries like China, India, and Russia now lead wheat production. The U.S. ranks fourth in milled flour exports, behind Turkey, Kazakhstan, and Germany.

Want More?

If this article gave you something to chew on, check out our post on how fireworks are made. Or watch this video to see modern flour production in action.

Got a question about how something is made? Send it to the FlexTrades Writing Team and we’ll cover it in a future blog. 

April is the month of showers — we all know they bring May flowers. But have you ever thought about the showers that keep us smelling fresh all year round?

Roughly two out of three Americans shower every day. But it hasn’t always been that way.

The history of the modern shower is long, winding, and surprisingly global. From waterfalls to water heaters, here’s how we got here.

From Rivers to Rome: The Origins of Showering

Early humans cleaned themselves in streams, waterfalls, rain, and any natural water source they could find. As communities formed, the systems evolved.

  • The ancient Egyptians created ceramic jugs to mimic the feel of cascading water
  • The Greeks developed piping systems to move water where it was needed
  • The Romans brought the concept of hygiene to the masses, building public bathhouses across their empire

When Rome fell, the infrastructure crumbled with it. Medieval Europe lost access to Roman engineering, and the public bathhouse culture disappeared in many places.

Despite popular belief, hygiene didn’t vanish during the Dark Ages — but the systems that supported it did.

The Invention of the Shower

Fast forward to the 18th century, when interest in personal hygiene came back into focus. In 1767, William Feetham, a London stove maker, patented what is recognized as the first modern shower.

It wasn’t perfect.

  • It pumped cold water to a basin overhead
  • It dumped reused water on the user’s head
  • It wasn’t exactly refreshing

But it was a start.

By 1810, inventors added heated water. By 1850, modern plumbing was back in action — solving the whole “recycled water” issue and setting the stage for what we now recognize as a real shower.

Showers Gain Popularity

Throughout the 19th and early 20th centuries, showers grew in popularity, especially in England and the U.S. But the bathtub still reigned supreme until the 1980s, when showers took over as the go-to option in most households.

That’s when the customization boom began. Shower heads, body jets, built-in lighting — all became part of a new era in home design. The growth hasn’t stopped since.

The Shower Industry Today

The global market for bath and shower products is now worth nearly $50 billion a year.

It’s driven by more than just hygiene. Today’s consumers care about:

  • Efficiency – modern showerheads use significantly less water than bathtubs
  • Sustainability – water-saving technologies and eco-conscious materials
  • Experience – from rainfall heads to digital temperature control

In fact, a 10-minute shower today can use as little as a quarter of the water needed for a typical bath. That means getting clean doesn't have to mean wasting water.
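
For the curious, here is the back-of-the-envelope math behind that claim, written as a short Python sketch. The flow rate and tub volume are illustrative assumptions, not figures from this article:

    # Rough shower-vs-bath water comparison (illustrative numbers only).
    SHOWERHEAD_GPM = 2.0   # assumed low-flow showerhead, gallons per minute
    SHOWER_MINUTES = 10    # shower length used in the comparison above
    BATH_GALLONS = 80      # assumed full standard bathtub

    shower_gallons = SHOWERHEAD_GPM * SHOWER_MINUTES   # 20 gallons
    ratio = BATH_GALLONS / shower_gallons              # 4.0
    print(f"Shower: {shower_gallons:.0f} gal, bath: {BATH_GALLONS} gal ({ratio:.0f}x)")

With those assumptions, a 10-minute shower uses about 20 gallons against roughly 80 for a full tub, which is the four-to-one ratio quoted above.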

Curious for More?

If this kind of thing interests you, check out our post on the history of foundries to see how another everyday process evolved. Or, for something a little more modern, watch this video on how showerheads and hoses are mass-produced today.

And remember, the next time a question hits you in the shower, we’d love to help answer it. Send your ideas to writingteam@flextrades.com and we just might feature it in a future blog. 

FlexTrades works with manufacturers of all kinds — from aerospace and automotive to food production. Some of our clients make frozen pizza. Others make snack cakes, breakfast foods, plant-based proteins, or prepared meals. The point is, we’re all pretty spoiled by the convenience of walking into a grocery store and grabbing whatever we want — frozen, fresh, or refrigerated.

But it wasn’t always like this.

Before the modern freezer, cold food storage meant digging holes in the ground, building underground cellars, or relying on blocks of lake ice stored in ice houses. The result? Slow freezing. That process formed large ice crystals, which caused food to become watery and tasteless once thawed.

Enter: Clarence Birdseye, the man who changed the game.

Clarence Birdseye: The Father of Frozen Foods

Clarence Birdseye got his start not in food, but in fur trading. While working in Canada, he noticed that fish caught by local Inuit froze instantly in the subzero air. Even months later, once thawed, the fish tasted just as fresh.

That moment of observation sparked a theory — fast freezing retains food’s texture and flavor better than slow freezing. Clarence tested his theory and proved it right, not once but twice.

Birdseye’s First Method: Calcium Chloride Brine

In his first innovation, Clarence developed a process using calcium chloride. Here’s how it worked:

  • Packaged food was placed between two metal belts
  • The belts were cooled to between -40°F and -45°F using a calcium chloride solution
  • The food froze almost instantly

This led to his first business — Birdseye Seafood — where he patented his process for freezing and storing fish.

His system included:

  • A refrigerating tank with calcium chloride brine
  • Containers to freeze fish fillets into solid blocks
  • Wax paper packaging for preservation
  • An insulated shipping container, later used in refrigerated railcars and grocery store display cases

Fun fact: Clarence also patented his refrigerated boxcar, laying the groundwork for modern cold-chain logistics.

From Bankruptcy to Breakthrough

Birdseye’s first venture went bankrupt. But he didn’t quit. He sold his and his wife’s life insurance policies and secured investment funding to launch again — this time with General Seafood Corporation in Gloucester, Massachusetts.

There, he developed a second freezing method, and this one stuck.

Birdseye’s Second Method: Ammonia and Innovation

This method used ammonia evaporation instead of calcium chloride. The process:

  • Packaged food was placed between hollow metal plates
  • Ammonia chilled the plates to between -25°F and -40°F
  • Fruits and vegetables froze to 0°F in 30 minutes, meats in 75 to 90 minutes

In 1929, Birdseye sold General Seafood Corporation — along with his fast-freezing patents — to Postum Cereal Company for $22 million (over $358 million today). Postum changed its name to General Foods Corporation and made Clarence president of its new Birds Eye Frosted Foods division.

Soon after, Birds Eye began rolling out frozen spinach, cherries, meats, and peas. That was just the beginning. Today, Birds Eye makes everything from frozen vegetables and sauced sides to full skillet meals and cauliflower wings.

An Inventor, a Naturalist, and a Relentless Innovator

Birdseye’s story began in Brooklyn in 1886. At age 10, he started his first business by trapping muskrats and selling them to a British lord. At Amherst College, he sold frogs to the Bronx Zoo to pay tuition. When that didn’t work out, he became a fur trader in Labrador and later worked as a naturalist for the U.S. government in the Arctic.

That’s where he got the idea that changed food manufacturing forever.

Through it all, Birdseye remained humble. His words say it best:

“I do not consider myself a remarkable person. I am just a guy with a very large bump of curiosity and a gambling instinct.”

Want to Learn More About Food Manufacturing?

Check out the FlexTrades blog for more How It's Made stories — including articles on mystery-flavored suckers, cheese, plant-based burgers, and even Spam.

The art of metal casting — melting metal, pouring it into molds, and shaping it into usable forms — is as old as civilization itself. Archeologists have uncovered metal casting relics from as early as 3000 BC, possibly even older, depending on who you ask.

Most of the oldest artifacts come from Mesopotamia, where early craftspeople used clay molds and fire pits to cast copper, gold, and silver. It was here that the first alloy — bronze, a mix of copper and tin — was born. That single innovation sparked a new era of metal tools, weapons, and technology.

But like every great invention, metal casting has evolved. And the reasons are as much about human progress as they are about science.

Why Metal Casting Changed Over Time

Two major shifts drove the evolution of foundries:

  • Humans stopped migrating and started settling, giving rise to cities, economies, and steady production
  • Mining technology improved, giving us access to more raw material in less time

The result? Foundries got bigger, smarter, and more influential — shaping everything from warfare to water systems.

Foundry Highlights from the 19th Century

By the 1800s, metal casting was more than a craft. It was an industry. The 19th century brought several major advancements:

  • Open-hearth furnaces for higher-quality steel
  • Sandblasting to clean castings faster and more effectively
  • Gear-tilted ladles to pour molten metal more safely

This era helped drive the industrialization of the United States, with foundries fueling the construction of railroad tracks, ironclad warships, and even America’s first submarine, launched in 1881.

Breakthroughs in the 20th Century

The 1900s ushered in a wave of innovation:

  • The coreless electric induction furnace changed how we melt metal
  • Low-carbon stainless steel opened up new use cases
  • Foundries began serving defense, aerospace, HVAC, and automotive sectors

This century saw foundries expand rapidly across North America. They became central to U.S. manufacturing.

Fun Fact:
The American Foundry Society (AFS) first met in 1896, but its first student chapter wasn’t launched until 1907 — in Minnesota. That same year, a patent was issued for high-pressure die casting machinery, a technology still used today.

The Foundry Industry Today

Metal casting remains a cornerstone of manufacturing — just more advanced than ever.

Today, the U.S. foundry industry is worth over $33 billion, with close to 1,900 active foundries and nearly 200,000 workers. Metal castings are found in 90% of durable goods, from clean water systems and farm equipment to energy infrastructure and transportation components.

And the modern foundry? It’s high-tech.

Many now use:

  • CAD software for design
  • 3D printing for mold creation
  • Robotics and automation for efficiency
  • Casting analysis to improve quality and reduce waste

Foundries have never been more precise — or more important.

See It for Yourself

Want a closer look? Revisit our article on how steel is made and check out this factory tour of the St. Paul Foundry. You’ll see molten metal in action and the incredible technology that brings modern castings to life.

After that, look around. From the water pipes beneath your feet to the machine parts running your factory — metal castings are everywhere.

 If you’ve felt like the phrase “supply chain” is everywhere lately, you’re not wrong. It’s become part of our daily language — in business meetings, in news reports, in casual conversations. And for good reason.

The supply chain affects everything. What we buy. What we can’t find. What costs more than it used to. But where did it come from? And how did it become one of the most essential forces behind modern manufacturing?

Let’s take a step back.

What It Was

Before the first industrial revolution, supply chains were simple. Life was local. People relied on what was grown, built, or traded nearby. Long-distance transportation wasn’t yet a part of life, and production was limited by geography.

That changed quickly with the arrival of industry. Each industrial revolution brought new tools, new technologies, and a dramatic increase in production — which meant we needed better ways to move and manage all those goods.

Transportation was the turning point.

Where It Went

Without transportation, there is no modern supply chain. The railroad changed everything. But it was the internal combustion engine that transformed it.

In the late 19th century, diesel engines and the invention of the semi-truck gave businesses new ways to move product. Around the same time, new tools for handling goods — including hand trucks and early forklift concepts — started to take shape.

As goods began moving more freely across long distances, we needed places to store them. Warehouses evolved. Storage buildings expanded. Pallets made vertical storage more efficient. And the forklift? It became the workhorse of the warehouse.

Simple as it sounds, these were major innovations that made modern logistics possible.

What Took It Further

World War II marked a shift.

Military supply needs drove innovation. We weren’t just managing goods anymore — we were engineering full-scale systems to track, deliver, and replenish materials across the globe.

From the 1930s through the 1970s, some of the most important supply chain advancements emerged:

  • New pallet systems and storage innovations
  • The invention of the shipping container in the 1950s
  • A growing shift from rail to trucks in the 1960s
  • IBM’s creation of a computerized inventory system in 1967
  • Real-time warehouse management systems (WMS), barcodes, and scanners in the 1970s

By the 1980s and 1990s, supply chain systems became more connected, more digital, and more global. In 1982, the term Supply Chain Management was coined.

Computers, spreadsheets, networked distribution models — all of it came together to shape the supply chain into something far bigger than anyone expected. Suddenly, the world was within reach.

What’s Next

Today, the global supply chain is a living, breathing system. Goods are sourced from everywhere. Operations are monitored in real time. And artificial intelligence is used to forecast demand, manage orders, and analyze performance with a level of speed and precision that would’ve seemed impossible just a few decades ago.

This is the Internet of Things (IoT) era — and supply chains are more interconnected than ever.

What comes next? More innovation. More complexity. And more opportunity to solve hard problems with smart systems.

And that’s exactly the kind of work we do every day at FlexTrades. 

Manufacturing has evolved through four industrial revolutions, and with each shift, machines have played a central role in shaping how things get made. Among them, the metalworking mill has quietly remained a constant.

It doesn’t get as much historical attention as the lathe, but the mill has earned its place — not just in factories, but in the foundation of modern production itself.

Eli Whitney and the Birth of the Mill

The story of the mill begins in the late 1700s, when clockmakers used crude versions to cut balance wheels. But it wasn’t until 1818 that the United States could truly call the milling machine its own. That credit goes to Eli Whitney.

You probably know Whitney as the inventor of the cotton gin. What you might not know is what came next.

Facing the threat of war with France, the U.S. government began offering contracts for mass musket production. At the time, muskets were handmade, and that meant each one was slightly different. No interchangeable parts. No inventory system. No consistency. Whitney saw the flaw. And he saw the fix.

He began designing machine tools that could create musket components with identical size, shape, and function. These tools would allow parts to be mass-produced, stored, and swapped. That vision led to the milling machine — and, more importantly, to the production system we still rely on today.

In 1801, he presented this system to President-elect Thomas Jefferson. Jefferson was impressed. And with that vote of confidence, Whitney began manufacturing arms with his new technology, eventually passing the business to his son in Hamden, Connecticut.

Evolution Through the Revolutions

Milling machines didn’t stop evolving with Eli Whitney.

In 1867, American engineer Joseph R. Brown debuted a universal milling machine at the Paris Exhibition. It was a leap forward in precision and capability. Then in 1936, Rudolph Bannow improved the design even further. He believed mills should offer more movement and access — allowing tools to approach a part from multiple angles with less manual repositioning.

Bannow’s invention became the Bridgeport milling machine. It was revolutionary. And even now, many U.S. manufacturers still use Bridgeports in their shops.

But as game-changing as the Bridgeport was, it’s still manual. And with the rise of automation, software, and digital control systems, milling machines have taken another leap — this time into CNC.

But that’s a story for another day. 

FlexTrades exists to solve problems for American manufacturers. That’s our purpose. But solving real problems means asking hard questions. And sometimes, answering them.

One of the questions we hear most often from job seekers and clients alike is this:

What is ITAR, and why should I care?

Let’s break it down.

Understanding ITAR

ITAR stands for International Traffic in Arms Regulations, a set of federal regulations administered and enforced by the U.S. Department of State. They exist to protect national security by controlling who has access to defense-related products, data, and services.

If you build, supply, or support anything tied to military use — directly or indirectly — you’re in ITAR territory. That includes primary manufacturers, vendors, and even subcontractors in the supply chain.

Being ITAR compliant isn’t just a title. It means implementing safeguards to restrict access to sensitive materials and systems. It means following strict rules to prevent foreign nationals from accessing defense-related information. It means recertifying every year.

And if your company operates across borders? It means navigating some serious complexity — across documentation, digital access, hiring practices, and more.

What ITAR Means for Job Seekers

If you’re applying for a position at an ITAR-compliant company, there’s one thing you’ll need: proof of ITAR eligibility.

That usually includes two forms of government-issued ID, with at least one containing your photo. Requirements vary, but the company will tell you exactly what to provide. If you’re curious, here’s an example list of acceptable documents used for defense conference access.

This isn’t red tape. It’s law. And it’s important.

Still Curious About ITAR?

Want a deeper dive? The U.S. government offers detailed guidance on ITAR — who it applies to, what it regulates, and how compliance is maintained.

And if you’re new to the FlexTrades blog, stick around. We’ve got more answers where this came from.

Want to Work With Us?

If you’re a recent tech school graduate, consider the FlexTrades ReTool Program. It’s designed to help people like you get real-world experience that leads to a real career.

Already have experience? Join our Talent Network to see current opportunities and get matched with the right projects.

Got a Question of Your Own?

We want to hear it. Send your questions to writingteam@flextrades.com. Who knows — maybe your question will inspire our next blog post. 

The history of manufacturing is an interesting one. While many people understand it through the lens of the four industrial revolutions, there is so much more to the story. One of the most enduring pieces of that story is the lathe—a machine tool that predates every industrial revolution and continues to evolve to this day.

The Ancient Origins of the Lathe

Archaeological digs show that lathes were in use as early as the 13th century BCE. Ancient Greek, Assyrian, and Egyptian woodworkers used early versions of the lathe, though it required two operators. One person would spin the piece of wood using a rope while the other shaped it with a cutting tool.

Even thousands of years ago, craftsmen were building the foundation of modern manufacturing.

Lathe Innovations in the Roman Era

The Romans, along with other early cultures in Northern Italy, China, and what is now Turkey, made some key upgrades to the original lathe design. The biggest innovation? A foot pedal. When pressed, the pedal spun the workpiece, allowing a single operator to do the job.

Efficiency increased. Output grew. And so began a long journey of continuous improvement.

The First Industrial Revolution: Powered Turning

Fast forward to the early 19th century and the arrival of steam power. During the First Industrial Revolution, inventors found a way to attach steam engines and water wheels to lathes. This allowed the workpiece to spin much faster than before. With higher speeds came greater precision and the ability to produce more uniform parts.

The Second Industrial Revolution: Metal Takes the Stage

By the late 1800s, powered lathes featured electric motors and forged tooling. These upgrades allowed lathes to cut metal, not just wood. That development turned the lathe into one of the most versatile machine tools in history.

What was once a tool for craftsmen now became a cornerstone of industrial-scale production.

Industry 3.0 and 4.0: The Rise of the CNC Lathe

Every industrial revolution changed the lathe—and the third and fourth were no exception. As computers and automation became central to manufacturing, the lathe evolved once again into the CNC lathe.

CNC stands for computer numerical control. These machines are programmed to operate automatically with minimal intervention. They can execute precise cuts on complex parts at high speeds and with incredible consistency.
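
To make "programmed" a little more concrete, here is a minimal Python sketch of one calculation every lathe program depends on: converting a desired cutting speed into spindle RPM. The formula is the standard machining relationship; the cutting speed and diameter are illustrative assumptions, not values from this article:

    import math

    def spindle_rpm(surface_feet_per_minute: float, diameter_inches: float) -> float:
        """Spindle speed needed to hold a given cutting speed at a given diameter.

        Standard machining relationship: RPM = (SFM * 12) / (pi * diameter).
        """
        return (surface_feet_per_minute * 12) / (math.pi * diameter_inches)

    # Example: turning a 2-inch bar of mild steel at ~100 SFM
    # (a common handbook value for high-speed steel tooling).
    print(f"{spindle_rpm(100, 2.0):.0f} RPM")  # about 191 RPM

A CNC control runs this kind of arithmetic continuously, for example holding a constant surface speed as the tool moves toward a smaller diameter. That is exactly the kind of minimal intervention described above.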

Want to dive deeper into how CNC machines work? Check out our article on CNC machinery here.

From Rope to Code

The story of the lathe is also the story of manufacturing. With each revolution came a new level of innovation. What started as a two-person wooden tool has become a computer-controlled system shaping the future of production. At FlexTrades, we believe in honoring that history while helping our technicians and clients stay prepared for what comes next.