Used a couple of US recipes recently and most of the ingredients are in cups, or spoons, not by weight. This is a nightmare to convert. Do Americans not own scales or something? What’s the reason for measuring everything by volume?

          • Dagwood222@lemm.ee
            link
            fedilink
            arrow-up
            14
            arrow-down
            10
            ·
            7 months ago

            You do know that metric measures both volume and weight, right? A cubic centimeter of water weighs one gram.

            • ccunning@lemmy.world
              link
              fedilink
              arrow-up
              17
              arrow-down
              3
              ·
              7 months ago

              You do know that only water weighs one gram per ml, right?

              This is a great fact for if you’re trying to make hot water soup from a recipe written in metric volume measures and you only have a scale.

              You might get away with it if you’re just trying to measure apple juice or something else that’s mostly water, but good luck making Rice Krispie treats

            • morphballganon@lemmy.world
              link
              fedilink
              arrow-up
              4
              arrow-down
              1
              ·
              7 months ago

              You can still list an ingredient using one or the other on a recipe. It may be a simple conversion, but 1:1 is still a conversion.

            • tate@lemmy.sdf.org
              link
              fedilink
              arrow-up
              4
              arrow-down
              3
              ·
              7 months ago

              And one pint of water is one pound.

              You’ve completely missed the point, which is that most of the world measures ingredients (like flour for instance, where one pint is not one pound) by weight and not by volume.

                • morphballganon@lemmy.world
                  link
                  fedilink
                  arrow-up
                  5
                  arrow-down
                  1
                  ·
                  7 months ago

                  In what widely-used context is a .04318 difference significant?

                  Not soup. Not bread.

                  I don’t think even concrete would suffer noticeably from that difference.

            • SchmidtGenetics@lemmy.world
              link
              fedilink
              arrow-up
              1
              arrow-down
              3
              ·
              7 months ago

              Canada uses a mixture of imperial and metric, but not weights, so that’s an entirely false conclusion you’ve come to.

              And that doesn’t help much: that only holds at sea level and at a certain temperature. Go do some baking with those exact conversions on a mountain and your cake won’t turn out at all.

    • inspxtr@lemmy.world
      link
      fedilink
      arrow-up
      7
      ·
      7 months ago

      someone should make an alternate history tv show where the ship made it. bonus if it’s of a parody kind.

  • reddig33@lemmy.world
    link
    fedilink
    arrow-up
    65
    arrow-down
    2
    ·
    edit-2
    7 months ago

    Watch some cooking shows on YouTube where they cook from two hundred year old cookbooks. Weighing stuff is a modern thing. All the “ye olde recipes” from Europe and the colonies were done in cups, spoons, and some other volume measurements we don’t use anymore like “gills”. (If they even bother to specify measurements.)

    • IndiBrony@lemmy.world
      link
      fedilink
      English
      arrow-up
      31
      arrow-down
      1
      ·
      7 months ago

      “the” and “ye” are amusingly redundant next to each other.

      I was positively intrigued the day I learned “ye olde shoppe” is pronounced exactly the same way as “the old shop”.

    • MrsDoyle@sh.itjust.works
      link
      fedilink
      arrow-up
      4
      ·
      7 months ago

      Back in my childhood (60+ years ago) we had recipes that called for a “breakfast cup” of this and a “teacup” of that. And yes, we did have actual breakfast cups and teacups, which had significantly different volumes. What kind of cup do they use in the US I wonder?

    • frezik@midwest.social
      link
      fedilink
      arrow-up
      2
      ·
      7 months ago

      Bartenders routinely measure mixed drink additives in “barspoons”.

      My grandmother-in-law has a biscuit recipe that starts with “fill the bowl with flour”. What bowl? The bowl she’s been making biscuits with for 50 years.

      Point is, people left to their own devices will use whatever measurement is handy.

  • pete_the_cat@lemmy.world
    link
    fedilink
    English
    arrow-up
    58
    arrow-down
    3
    ·
    edit-2
    7 months ago

    The imperial system is a nightmare. A lot of us hate it and agree that metric is far easier. I grew up with the imperial system and still don’t know the conversions between quarts, pints, ounces, and cups. Blame the French and British, we got it from them!

    I’m currently calorie counting in order to lose weight and I weigh everything in grams because it’s easier.

    • IMALlama@lemmy.world
      link
      fedilink
      arrow-up
      30
      arrow-down
      1
      ·
      7 months ago

      This isn’t about imperial vs metric, it’s about measuring by mass vs volume. A good example here is flour. Weighing out 30 grams (or about 1 ounce) of flour will always result in the same amount. On the other hand, you can densely pack flour into a 1/4 cup measuring cup, you can gingerly spoon it in little by little, or you can scoop and level. When you do this you’ll get three different amounts of flour, even though they all fill that 1/4 cup. Good luck consistently measuring from scoop to scoop even if you use the same method for each scoop.
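
To put rough numbers on the point above (a minimal sketch — the grams-per-cup figures are illustrative assumptions for all-purpose flour, not measured values):

```python
# Illustrative only: the same 1/4-cup scoop of flour holds a noticeably
# different mass depending on how it was filled.
FLOUR_GRAMS_PER_CUP = {
    "spooned and leveled": 120,
    "scooped and leveled": 145,
    "packed": 165,
}

for method, grams_per_cup in FLOUR_GRAMS_PER_CUP.items():
    print(f"{method}: ~{grams_per_cup / 4:.0f} g per 1/4 cup")

# A scale sidesteps the problem: 30 g of flour is 30 g however it got into the bowl.
```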

      • Corkyskog@sh.itjust.works
        link
        fedilink
        arrow-up
        15
        arrow-down
        1
        ·
        7 months ago

        Joke’s on you. When we measure flour on the moon, it’s the same as on earth. You just don’t understand our advanced measurement technique with your primitive weighing.

        • okamiueru@lemmy.world
          link
          fedilink
          arrow-up
          16
          ·
          edit-2
          7 months ago

          Joke aside, scales on earth measure force and show mass on the assumption of the gravitational pull on earth. On a moon colony, you’d use measuring scales with a different value for the gravitational pull, and get the same values for mass as on earth.

          Edit: Also, if anyone finds this stuff interesting to think about: you can measure mass without any force of gravity by having the measuring device accelerate (e.g. shake) the stuff you want to measure. From “F=ma”, knowing F and a, you get m.
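
A minimal sketch of that shake-to-measure idea, with made-up numbers just to show the arithmetic (apply a known force, measure the acceleration, solve F = ma for m):

```python
# Measuring mass without gravity: shake the sample with a known force,
# measure its acceleration, and solve F = m * a for m.
def mass_from_shaking(force_newtons: float, acceleration_m_per_s2: float) -> float:
    """Return mass in kilograms from a known applied force and measured acceleration."""
    return force_newtons / acceleration_m_per_s2

print(mass_from_shaking(2.5, 5.0))  # -> 0.5 (kg), no gravitational pull involved
```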

      • SendMePhotos@lemmy.world
        link
        fedilink
        arrow-up
        4
        ·
        7 months ago

        Something something elevation and atmospheric pressure resulting in a proper measurement across altitude… Or something.

    • Treczoks@lemmy.world
      link
      fedilink
      arrow-up
      17
      arrow-down
      2
      ·
      7 months ago

      Blame the French and British, we got it from them!

      Like 99% of the world, the French and British long ago managed to overcome the imperial system. Actually, the French spearheaded the metric world.

      America just failed, time and again, to follow the times.

      • FleetingTit@lemmy.world
        link
        fedilink
        arrow-up
        5
        ·
        7 months ago

        The British didn’t quite overcome the old ways of measuring. They still use miles, pints, stone, and so on.

        Companies just need to print the metric amount on the box as well.

        • letsgo@lemm.ee
          link
          fedilink
          arrow-up
          3
          ·
          7 months ago

          We like mixing them up too. Tyres are measured using three numbers, two of which are in mm, the other (wheel diameter) is in inches.

        • Treczoks@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          7 months ago

          The British didn’t quite overcome the old ways of measuring.

          Not completely, agreed, but they are miles ahead of the Americans. :-)

    • remotelove@lemmy.ca
      link
      fedilink
      arrow-up
      15
      arrow-down
      1
      ·
      7 months ago

      I am converting my life to metric, actually. All of my CAD work is in metric and all of my chemistry glass is thankfully in metric. Thinking in longer distances is something I need to get used to though.

      The imperial system is just a waste of time, TBH. I am sure there are a ton of people that can work fractions in their head but I just gotta ask: Why, and what is the point?

      Measuring and planning with metric is just so damn easy and no extra steps are generally needed. When I need to convert 1000 mm I just move the decimal over a bit and get 1 m. EZ.

      • Captain Aggravated@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        2
        ·
        7 months ago

        I keep using this example: In the wood shop, I’m going to cut a bridle joint. Requires cutting boards into thirds of their thickness. Metric lumber is often milled to 19mm thick. What’s a third of 19mm? You want to show me which line means 6.3333mm on a metric tape measure? US Customary lumber is milled to 3/4" thick. What’s a third of 3/4"? You want to show me which line means 1/4" on an inch tape measure? Now let’s cut a half-lap joint in that same lumber. In metric that works out to 9.5mm, there’s also no line on a typical metric tape measure for that. But there is a line for 3/8".
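
The divisibility point is easy to check with exact rational arithmetic; here is a small sketch (Python’s fractions module, used purely for illustration) comparing thirds and halves of 3/4" and 19 mm stock:

```python
# Thirds and halves of typical stock thicknesses: 3/4" divides into tape-line
# fractions, while a third of 19 mm leaves a repeating decimal.
from fractions import Fraction

inch_stock = Fraction(3, 4)   # 3/4" thick board
metric_stock = 19             # 19 mm thick board

print(inch_stock / 3, inch_stock / 2)      # 1/4 3/8  -> both are lines on an inch tape
print(metric_stock / 3, metric_stock / 2)  # 6.333333333333333 9.5
```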

        I’d much rather build furniture in inches than millimeters because in the wood shop I have to divide or multiply by powers of 2, 3 and 4 way more often than powers of 10. It is in this context that the inch standard which is subdivided by powers of two rather than ten arose, and it still works very well.

        Metric users often correctly accuse Imperial or US Customary users (the two systems share the names of units but are not identical) of making excuses or relying on workarounds; in the context of woodworking joinery I find it’s the reverse. “Of course we don’t use 6.3333mm, you just know to cut the cheeks 6mm and the tongue 7mm. 6+6+7 is 19.”

        I’ll grant you, doing stoichiometry in ounces and pounds would be a fucking nightmare. But woodworking joinery? Nah I’m doing that in fractional inches.

        • remotelove@lemmy.ca
          link
          fedilink
          arrow-up
          4
          ·
          7 months ago

          While neat, I believe your lumber example is misleading in the context of metric vs imperial. Woodworking is extremely imprecise compared to many other types of engineering and using that system for those problems may be ideal.

          Deliberately using 1/3rd of 19mm to get 6.33333mm is not as complex a problem as it may look at first glance. 6.333mm IS 1/3rd of 19mm, just with more precision. The nature of woodworking requires fairly large tolerances and .3333mm is likely within any tolerance range you would work with. Hell, even +/-3.333mm (10x) is probably within spec in many cases.

          Your example conversion from 1/4in to 9.5mm is irrelevant unless you are working a project that is deliberately converting imperial to metric. If a project is designed in metric the measurements and reference points are going to be rounded to metric. The same goes for designs that are in imperial. While it’s possible to design identical pieces in each measuring system, it’s not ideal. Tolerance can compensate for most small differences and you will get two extremely similar pieces.

          From your standpoint, everything has been imperial and you make design choices around how imperial works. It just makes sense to you. Design conversions from imperial to metric won’t make any sense and the “natural math” of each system is lost. If you were raised on metric, the same situation would apply I suppose.

          You explained the biggest complaint of imperial as a positive: fractions. Pure math is just easier than fractions when working up and down ranges of precision. Divide 10cm by 2? 5cm. 5mm by 4? 1.25mm, etc… Problems like 19mm/3 are irrelevant because of allowable tolerance. Exact measurements aren’t abstracted behind a 16th or 8th or 32nd or 64th…

          Admittedly, I am no woodworker. However, I am curious if someone from the EU could chime in on this problem from their perspective.

          • Captain Aggravated@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            2
            ·
            7 months ago

            I didn’t convert a quarter inch to 9.5mm. First of all a quarter inch is about 6.35mm. I divided 19mm by 2. Just like I divided 3/4" by two to get 3/8".

            I’m talking woodworking here, not carpentry. .3333mm is ~1/64", which is the maximum error I would allow in making a joint. I usually work well within that. Missing by 3.3333mm on a feature specified to be 6.3333mm is an abject failure. That’s an error of over 50%.

            PVA wood glue contracts as it dries, so it doesn’t fill gaps well. The looser the fit, the weaker the joint. I aim for a friction fit. Most of the time with glue on the mating surfaces the joint must be tapped together with a mallet.

            Your reading comprehension is what I’ve come to expect from Lemmy. I repeat myself for the slow kids at the back: because I often have to divide my work into halves, thirds or quarters and rarely into tenths, fractional inches in powers of two are more convenient in this application.

            Remember folks, in metric, 5/2=2.5, 5/3=doesn’t matter because “tolerances.” That’s one of those excuses I was talking about.

            Do keep trying to lecture me about something you don’t even slightly understand. It’s adorable.

            • remotelove@lemmy.ca
              link
              fedilink
              arrow-up
              1
              ·
              edit-2
              7 months ago

              Your reading comprehension is what I’ve come to expect from Lemmy.

              Chill. We both wrote walls of text and there are going to be misunderstood details. If we want to talk about details, I called out my ignorance of woodworking and why imperial is likely good for what you are talking about.

              My overall points, and I’ll summarize this time, is that:

              1. Wood working (carpentry? Whatever.) is not exact.

              2. Dividing 19mm by 3 is a weird example. Your example did a better job of highlighting a math peculiarity, TBH. (My first thought is that the cut was going to account for any minor errors.)

              3. Fractions suck. You are comfortable with them, but I see them as a useless layer of an outdated measuring system. We made our points, for and against. Cool.

              4. A key point that I didn’t call out specifically is that imperial doesn’t handle high degrees of precision easily without dropping the fractions. It’s possible, and spoken aloud, but not generally written that way; 1/1000" is a good example.

              While I was awaiting your reply, I also thought of the abuse the imperial system has suffered over the years. A 2x4 is not a 2x4. In reloading (another hobby of mine), .300 actually means .308. .223 could mean .222, .223 or even .224. However, .222 always means .222. I am forced into imperial for safety and consistency reasons. (Don’t even get me started on ‘grains’, wherever the fuck that came from.) For some reason, the metric system is now mixed up in that field as well and it’s a mess.

              The word “misleading” was chosen with purpose and doesn’t mean that you were writing with malice. It seemed, true or not, that conversions got mixed up in this, which would confuse even an MIT graduate.

              • Captain Aggravated@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                1
                ·
                7 months ago

                Wood working (carpentry? Whatever.) is not exact.

                Woodworking is the trade of building furniture and cabinetry, carpentry is the trade of building structures. A woodworker built your dining room table, a carpenter built your dining room. The two trades have some overlap in skills and knowledge but they are different skill sets. As for exactness…let me put it this way. A lot of woodworkers balk at using pencils to mark their cuts because the width of a 0.5mm mechanical pencil lead is considered too great a margin for error. A single bevel marking knife is used to apply a score mark with a perfectly sharp edge to the wood.

                Dividing 19mm by 3 is a weird example.

                It is typical for metric woodworkers to mill boards to a finished thickness of 19mm. It’s also a common thickness for plywood in metric land. I think it was chosen because it is close to 3/4". Many common woodworking joints such as mortise and tenons, bridle joints, etc. require dividing the board into thirds; a typical mortise for example is a rectangular hole or slot in a board 1/3rd the board’s thickness. So working in metric with 19mm stock, you either have to cope with measuring, marking, and cutting 6.3333mm, or having to “just know” to cut a 7mm mortise with 6mm of wood on either side.

                Meanwhile working in US Customary (which is not the same as Imperial) using wood milled to 3/4" stock, a third of 3/4" is 1/4".

                19mm/3 isn’t a weird example, I didn’t pull that number out of thin air.

                DIFFERENT SCENARIO: Now I’m going to make a half-lap joint which requires cutting the stock in half. Cutting a metric 19mm board in half gets you 9.5mm. Cutting a US Customary 3/4" thick board in half gets you 3/8".

                For this application, fractions genuinely don’t suck. There are advantages to using fractions in work like this, namely that you can do integer math on the numerator or denominator rather than floating point arithmetic. Plus, measuring and marking tools being marked in powers of two rather than ten is more convenient in a field where most of what you’re doing is halving and quartering dimensions.

                Sure, precision metalwork is done in thousandths of an inch or even ten-thousandths of an inch, and I personally prefer machining in metric.

                A 2x4 is not a 2x4.

                Yes it is, for a while at least. The board is rough sawn out of the log at 2" by 4" and dried at that dimension. Rough sawing doesn’t produce a perfect board, and the board will shrink and warp a little during drying, so the dried board is then further flattened, straightened and squared via a milling process which takes about a quarter inch from each face, resulting in a finished dimension of 1 1/2" by 3 1/2". Lumber is priced by its rough-cut dimensions because that’s how much of the tree the sawyer had to use to make that board.

                Back in the day it was common for lumber yards to sell construction lumber in a rough cut state at a true 2" by 4", and the carpenter would mill it himself. Then the railroads happened, and lumber was being shipped from the forests of the West coast back east. Railroads charged for cargo by the ton, and lumber mills could save a mint on shipping by milling the boards to finished size before shipping. This saved carpenters the work of milling the boards themselves. They still called the boards “2x4s” because they were still used for the same purpose. And thus the modern commodity retail 2x4 was born.

                Similarly, that 3/4" lumber I keep saying I use: I buy that from my local sawyer rough sawn to 1" thick. I then plane it flat and straight, which takes about an eighth of an inch from each face. So I wind up with a finished board 3/4" thick, which as previously discussed is a convenient size for woodworking.

        • mryessir@lemmy.sdf.org
          link
          fedilink
          arrow-up
          2
          ·
          7 months ago

          I have to admit: That sounds pretty nice. Next time I build something with wood I will try to use inches. Thanks!

          • Captain Aggravated@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            1
            ·
            7 months ago

            There’s a method to the madness. I’m pretty sure woodworking joinery is why inch fractions are the way they are. Lots of stuff I’d rather do in metric, machining for example, but specifically woodworking works very well in fractional inches.

        • Turun@feddit.de
          link
          fedilink
          arrow-up
          1
          ·
          7 months ago

          I mean, you can make the exact same argument the other way round.

          My bed is made with boards of 27mm thickness. One third of that would be 9mm. Easy.

          Also if you need precision, calipers go down to 50 µm (micrometres), 1/20th of a mm.

          • Captain Aggravated@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            2
            ·
            7 months ago

            What’s half of 27mm?

            We can keep doing that dance, it’s possible to find similar inconveniences in the fractional inch system, like “what’s a third of a whole inch?” but I find that within each system’s conventions (like using 19 and 27mm stock versus 3/4" or 1 1/2" stock) you’re less likely to run into them working in fractional inches. I think because the wood shop is just a fractional kind of place, I divide by two and three out there a lot.

            The machine shop isn’t so much, which is why we tend to work in either thousandths of an inch or increasingly in metric. Most CNC machines will gladly accept both.

            As for calipers: In the wood shop, I frequently use a set of dial calipers calibrated in 64ths of an inch. Especially with my thickness planer on which one full turn of the handwheel moves the cutter head 1/16", so the major, medium and minor marks on the caliper dial work out to a full, half and quarter turn on the handwheel. The analog display makes the relationship between the calipers and the tool very intuitive in a way that improves accuracy and repeatability largely by decreasing error.

            I don’t really need precision beyond 1/64", but I do need to be able to tell if it’s a thin 64th or a fat 64th or a dead nuts 64th.

      • frezik@midwest.social
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        7 months ago

        I find specific situations where customary units are handy. Fahrenheit has a nicer range for precise cooking temperature, such as for sous vide. 1 degree centigrade is a wider step than 1 degree Fahrenheit. Dropping down to fractions of a degree is too precise. Fahrenheit is just right.

        Metric is lashed to orders of magnitude precision, and it gets in the way here. Being able to convert things, like knowing how much energy it takes to heat 1 cubic centimeter of water by 1 degree centigrade, also isn’t useful in the kitchen unless you’re doing some deep molecular gastronomy shit.

        It’s OK to use different measurement systems in different contexts. Purity is not a virtue.
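
The granularity claim checks out: a 1 °F step is 5/9 of a 1 °C step, so whole-degree Fahrenheit setpoints are spaced more finely. A small sketch (the example temperatures are arbitrary):

```python
# Adjacent whole-degree Fahrenheit setpoints are about 0.56 °C apart.
def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

for f in (129, 130, 131):  # typical sous vide territory
    print(f"{f} °F = {fahrenheit_to_celsius(f):.2f} °C")
```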

        • remotelove@lemmy.ca
          link
          fedilink
          arrow-up
          3
          ·
          edit-2
          7 months ago

          Not really. Most every hardware shop has them these days. Amazon is about my only other source, but quality/usability is a gamble in the M1-M2 range for some reason. The number of small bolts and nuts in that range that are cast badly seems to be high for me. That seems really odd, actually.

          Kits are the way to go, usually. I have a full assortment of nuts and bolts from M1 up to about M6 at many different lengths. I started building a collection when I was modding 3D printers but use them for any other random project these days.

          Edit: Local hardware shops generally carry decent assortments from M3 and up. It’s more expensive than Amazon but is great if I only need one odd larger size for something random.

          • hglman@lemmy.ml
            link
            fedilink
            English
            arrow-up
            3
            ·
            7 months ago

            You cannot find a metric measuring tape in the US without a lot of effort.

  • Captain Aggravated@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    41
    ·
    7 months ago

    “Do Americans not own scales or something?”

    For a good long while, no they didn’t. For a large fraction of American history a typical home kitchen had no bespoke measuring equipment at all; but tea cups, tea spoons and table spoons were typically available and made to pretty similar sizes, plus if you always used the same ones the proportions would be roughly the same, so meh.

    A lot of traditional recipes were written this way, and it has remained so by tradition. A system of inexpensive, easy to manufacture measuring cups and spoons became standard equipment by the mid-20th century, and hasn’t changed to this day because it works just fine.

    The US government is in the habit of publishing recipes with a deliberately low minimum equipment list. The United States Department of Agriculture for example conducts extensive testing on home canning recipes and methods, and deliberately writes their recipes to be used in poorly equipped kitchens, because the kind of folks who rely on putting up home grown vegetables for the winter don’t tend to spend a lot of money on Sharper Image kitchen gadgetry. Flipping through my copy of the Ball Complete Book of Home Preserving, I find about a third of recipes could be made using nothing but a mason jar or two as your only measuring tool, as most mason jars (excluding the deliberately decorative ones) have graduation marks in cups, ounces and milliliters molded into them.

  • fitgse@sh.itjust.works
    link
    fedilink
    arrow-up
    39
    arrow-down
    1
    ·
    7 months ago

    Most Americans I know don’t even have a scale in their kitchen!

    I (an American) always wonder what a cup of spinach is. Like I can really pack it into a cup or not and there is a huge difference.

      • Delta_V@lemmy.world
        link
        fedilink
        arrow-up
        29
        arrow-down
        1
        ·
        7 months ago

        The things people drink out of are many different sizes of course, but when the word “cup” is used in the context of a measure of volume, then yes, they’re called “measuring cups”, and the volume is standardized.

        Same thing with teaspoons and tablespoons. They’re not just any random spoon - when talking about measurements, they have a standardized volume and you need to use a cheap and ubiquitous measuring device if you want to follow a recipe precisely.

        Most people in USA do not have a scale in their kitchen, but we do have a measuring cup and a set of measuring spoons.

      • themoonisacheese@sh.itjust.works
        link
        fedilink
        arrow-up
        20
        arrow-down
        1
        ·
        7 months ago

        “cup” is a unit of measure like a foot. It measures volume and it is approx equal to 236 ml.

        There also exist metric cups with a round 250 ml, supposedly for easier adoption of the metric system.

      • ryven@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        8
        ·
        edit-2
        7 months ago

        A measuring cup is a specific size, about 237mL. There’s a whole system of US measurements, actually:

        3 teaspoons in a tablespoon

        2 tablespoons in an ounce

        8 ounces in a cup

        2 cups in a pint

        2 pints in a quart

        4 quarts in a gallon

        Not all cups are measuring cups; if you are having a cup of coffee that doesn’t mean your cup is exactly 8oz. You just infer from context that if someone is talking about ingredients then you should measure them with a measuring cup. (Very commonly you also see cups with graduated markings, which are US Imperial on one side and metric on the other, that go up to 2 cups/500mL.)
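
That ladder of units is straightforward to encode; a minimal sketch (only the 29.5735 mL-per-fluid-ounce figure is added here, everything else follows from the list above):

```python
# The US volume ladder expressed in fluid ounces, plus a millilitre converter.
US_FLUID_OUNCES = {
    "teaspoon": 1 / 6,   # 3 tsp per tbsp, 2 tbsp per fl oz
    "tablespoon": 1 / 2,
    "fluid ounce": 1,
    "cup": 8,
    "pint": 16,
    "quart": 32,
    "gallon": 128,
}
ML_PER_FL_OZ = 29.5735  # definition of the US fluid ounce in millilitres

def to_millilitres(amount: float, unit: str) -> float:
    return amount * US_FLUID_OUNCES[unit] * ML_PER_FL_OZ

print(round(to_millilitres(1, "cup")))     # 237
print(round(to_millilitres(1, "gallon")))  # 3785
```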

        • Sinthesis@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          edit-2
          7 months ago

          fluid ounce, since most liquids used in food are nearly the same density.

          /edit to add to this, after a cup most things that are dry are not measured in pints, quarts or gallons. For example, you don’t hear anyone say “you’ll need 1 quart of flour”, they’ll just say 4 cups.

      • howrar@lemmy.ca
        link
        fedilink
        arrow-up
        2
        ·
        7 months ago

        I’ve seen “cups” used to mean anywhere between 225ml and 250ml. It’s very confusing.

    • DarkThoughts@fedia.io
      link
      fedilink
      arrow-up
      10
      arrow-down
      30
      ·
      7 months ago

      If you’d use metric, then weight & measurements on measuring cups would be basically the same. Like, 1 liter of milk or water is exactly 1 kg. Using arbitrary measurements like “cups” or “feet” is just confusing and prone to error.

      • HobbitFoot @thelemmy.club
        link
        fedilink
        English
        arrow-up
        38
        arrow-down
        3
        ·
        7 months ago

        Milk has a specific gravity slightly higher than 1, so that isn’t accurate.

        Also, “cups” and “feet” aren’t arbitrary. They aren’t part of the metric system, but a cup is a standardized unit of volume and a foot is a standardized unit of length.

        • andrewta@lemmy.world
          link
          fedilink
          arrow-up
          15
          arrow-down
          2
          ·
          7 months ago

          Exactly. How is a foot any more arbitrary than a meter?

          Or a cup any more arbitrary than an ounce?

          • SchmidtGenetics@lemmy.world
            link
            fedilink
            arrow-up
            5
            arrow-down
            12
            ·
            7 months ago

            Imperial measurements were based on arbitrary things; metric has specific scientific definitions for its units.

            1 L of water is 1 kg at sea level; why the fuck is a king’s foot size the de facto foot?

            • GamingChairModel@lemmy.world
              link
              fedilink
              arrow-up
              9
              ·
              7 months ago

              Imperial measurements were based on arbitrary things; metric has specific scientific definitions for its units.

              What do you mean? A pound is legally defined as 0.45359237 kilograms.

              And the kilogram is defined:

              The kilogram, symbol kg, is the SI unit of mass. It is defined by taking the fixed numerical value of the Planck constant h to be 6.62607015×10^−34 when expressed in the unit J⋅s, which is equal to kg⋅m^2⋅s^−1, where the metre and the second are defined in terms of c and ΔνCs.

              These are all currently defined off of the same universal constants, just with different multipliers, which are all arbitrary numbers: 6.62607015 is just about as arbitrary as 0.45359237. Hell, the number 10 is arbitrary, too, so we still use a system for time based on dividing the Earth’s day into 24 and 60.

              Like, I get that there’s some elegance in the historical water-based definitions derived from the arbitrary definition of length, but the definition of “meter” started from about as arbitrary a definition as “foot” (and the meter was generally more difficult to derive in the time of its adoption based on the Earth’s dimensions).

            • DaDragon@kbin.social
              link
              fedilink
              arrow-up
              6
              arrow-down
              2
              ·
              7 months ago

              I’ll nitpick that said definition is also arbitrary. Why is it 1 L of water at sea level, and not the molecular weight of the water? And why a liter anyway?

              Even metric units like time are somewhat arbitrary. Why is a second based on caesium frequency, and not some other element?

              • SchmidtGenetics@lemmy.world
                link
                fedilink
                arrow-up
                3
                arrow-down
                7
                ·
                edit-2
                7 months ago

                I’ll nitpick that said definition is also arbitrary. Why is it 1 L of water at sea level, and not the molecular weight of the water? And why a liter anyway?

                Why? Because 1 L is 1000 cubic centimeters of water, which takes 1000 calories to raise by one degree Celsius.

                Nothing is arbitrary with metric, everything is also directly related to every other measurement.

                • TheCannonball@lemmy.world
                  link
                  fedilink
                  arrow-up
                  7
                  ·
                  7 months ago

                  Because 1 Drakon is 1000 Cubic 100tholians, which takes 1000 Vornies to raise 100 degrees on the Flugar scale to boiling point.

                  Metric is very scientific, but it was made through arbitrary means. They chose to make it easier than imperial by using divisions of 10. But it’s all based on a single measurement that they made up through arbitrary means.

                  “We have this length called a meter. How do we define it? Let’s use it to measure something in nature and then use that measurement to define it.”

            • HobbitFoot @thelemmy.club
              link
              fedilink
              English
              arrow-up
              14
              arrow-down
              3
              ·
              7 months ago

              Until a few years ago, a kilogram was defined by a block of metal.

              From 1799 to 1960, the metre was defined by another block of metal. Before 1799, it was defined by a measurement that was hard to verify.

              That kind of sounds arbitrary.

              • SchmidtGenetics@lemmy.world
                link
                fedilink
                arrow-up
                4
                arrow-down
                7
                ·
                edit-2
                7 months ago

                On March 30, 1791, the French Academy of Sciences defined the length of a meter. Before this date, there were two definitions to this measure of length: The first was based on the length of a pendulum and the second was based on a fraction of the length of a half-meridian, or line of longitude. The French Academy chose the meridian definition. This defined one meter as one ten-millionth of the distance from the Equator to the North Pole.

                The meter is the basic unit of distance in the International System of Units (SI), the world’s standardized system of measurement. Since the 1960s, all countries have adopted or legally recognized the SI. As a universal standard of measure, the meter helped ease the exchange of commerce and scientific data.

                However, the definition of a meter has changed since 1791. In 1983, the meter got its current definition. The meter is defined as the length of the path travelled by light in a vacuum during a time interval of 1/299,792,458 of a second.

                The meter never had anything to do with metal, and every metric definition is scientifically found, not based off of someone’s foot.

                • NoIWontPickAName@kbin.earth
                  link
                  fedilink
                  arrow-up
                  9
                  arrow-down
                  3
                  ·
                  7 months ago

                  You are way overthinking this.

                  Also, a foot is just as scientific as any other definition as long as you use the same foot every time.

                  Can you get me all of the things that I would need to measure the speed of light in a vacuum, then do the math to divide all that?

                  Because that is what the average layman would need to verify what a meter is.

                • HobbitFoot @thelemmy.club
                  link
                  fedilink
                  English
                  arrow-up
                  5
                  arrow-down
                  2
                  ·
                  7 months ago

                  A fraction of the Earth’s meridian isn’t sound scientific reasoning to define a length. And after that, the definition reverted back to a definition similar to that of a foot: a fixed length of an item.

                  The two main benefits of the metric system are the decimalized behavior of its units and that the scientific community adopted it early, creating additional units from the standard and allowing for greater precision of the initially defined units over time.

                  However, the value in the meter being its length is the same as everyone agreeing the Prime Meridian goes through Greenwich, UK; it is because everyone agrees to it.

        • Lvxferre@mander.xyz
          link
          fedilink
          arrow-up
          5
          ·
          7 months ago

          Milk has a specific gravity slightly higher than 1, so that isn’t accurate.

          In this context milk is a bad example because the difference between 1.03g/ml and 1g/ml is negligible in a kitchen. Even oil (0.92g/ml) is close enough.

          This matters the most for stuff like below (with 1cup = 240ml):

          • honey: 340g/cup = 1.4g/ml
          • sugar: 200g/cup = 0.85g/ml [varies depending on granularity]
          • flour: 120g/cup = 0.5g/ml [sieved, and “properly” measured. It’s a PITA to measure it by volume.]
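
Those per-cup figures are all a volume-to-mass converter needs; a minimal sketch using the numbers above (1 cup taken as 240 ml, so everything is approximate):

```python
# Convert US cup measures to grams with rough per-cup masses (1 cup ≈ 240 mL).
APPROX_GRAMS_PER_CUP = {
    "water": 240,
    "milk": 247,   # ~1.03 g/mL
    "oil": 221,    # ~0.92 g/mL
    "honey": 340,
    "sugar": 200,
    "flour": 120,  # sieved/spooned; highly variable in practice
}

def cups_to_grams(cups: float, ingredient: str) -> float:
    return cups * APPROX_GRAMS_PER_CUP[ingredient]

print(cups_to_grams(0.5, "honey"))  # 170.0
print(cups_to_grams(0.5, "flour"))  # 60.0 -- same half cup, very different mass
```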

          Also, “cups” and “feet” aren’t arbitrary.

          All units are arbitrary, be they metric or esoteric.

          • HobbitFoot @thelemmy.club
            link
            fedilink
            English
            arrow-up
            5
            arrow-down
            1
            ·
            7 months ago

            In this context milk is a bad example because the difference between 1.03g/ml and 1g/ml is negligible in a kitchen. Even oil (0.92g/ml) is close enough.

            The context is that if you are going to hand wave away a 3% difference in a quantity, then having to weigh everything probably isn’t important.

            • Lvxferre@mander.xyz
              link
              fedilink
              arrow-up
              1
              ·
              7 months ago

              The context [SIC - rationale] is that if you are going to hand wave away a 3% difference in a quantity, then having to weigh everything probably isn’t important.

              That’s poor reasoning; ignoring a tiny difference doesn’t imply ignoring larger ones. I myself mentioned three cases where the difference matters, with one (flour) being highly variable.

              A better argument to defend your point would be that most differences in the kitchen are tiny.

              • HobbitFoot @thelemmy.club
                link
                fedilink
                English
                arrow-up
                1
                ·
                7 months ago

                I’ve been making that argument in other comments. If I had to argue the nuances of this argument in every comment, I’d be copying and pasting pages long comments that no one would read.

        • BlameThePeacock@lemmy.ca
          link
          fedilink
          English
          arrow-up
          3
          ·
          7 months ago

          It’s close enough for home cooking; the density of milk is around 1030 g/L, so unless your recipe calls for multiple liters of milk the small difference isn’t going to affect the result.

          • HobbitFoot @thelemmy.club
            link
            fedilink
            English
            arrow-up
            8
            arrow-down
            1
            ·
            7 months ago

            It’s close enough for home cooking

            And now you are getting to the reason why American use volume for recipes. If I don’t need the precision of mass for recipes as it won’t appreciably affect the taste, then why break out the scale?

              • NoIWontPickAName@kbin.earth
                link
                fedilink
                arrow-up
                5
                arrow-down
                1
                ·
                7 months ago

                It’s really mainly only flour though, because it can be compacted; most of the things that you’re using in the kitchen like baking powder or sugar aren’t going to be compacted to any appreciable level.

                For flour, you pour it into your measuring cup and then run the spine of a knife or something over it to get rid of the excess flour and get a level cup

                • BlameThePeacock@lemmy.ca
                  link
                  fedilink
                  English
                  arrow-up
                  2
                  arrow-down
                  1
                  ·
                  edit-2
                  7 months ago

                  There are many other things that can be compacted or have different volume to weight ratios.

                  Corn starch is like flour, you can pack it down.

                  Salt (table vs. kosher): a given volume of kosher salt weighs about half as much as the same volume of table salt.

                  Shredded Cheese (this one always bugs me. Is it 3 cups after shredding, or before… how packed in should it be), etc.

              • HobbitFoot @thelemmy.club
                link
                fedilink
                English
                arrow-up
                2
                ·
                7 months ago

                In my other responses, I’ve noted that I don’t bake. In other people’s responses, they’ve noted that there are still a lot of baking recipes out there that don’t require precision.

                • ryathal@sh.itjust.works
                  link
                  fedilink
                  arrow-up
                  3
                  ·
                  7 months ago

                  Precision in baking is massively overstated. The earliest recipes are in parts if you’re lucky. More likely they’re “mix in these ingredients until it looks right.”

          • SchmidtGenetics@lemmy.world
            link
            fedilink
            arrow-up
            2
            arrow-down
            3
            ·
            edit-2
            7 months ago

            Elevation changes everything though, and if you don’t adjust, the measurements change more.

            If you’re at sea level, sure.

      • teft@lemmy.world
        link
        fedilink
        arrow-up
        7
        arrow-down
        1
        ·
        edit-2
        7 months ago

        1 liter of milk weighs more than 1 kilo. Milk is denser than water, therefore 1 liter of it has to weigh more than a liter of water.

        Edit: I just looked it up and 1 liter of milk is 1.03 kilos.

        • tunetardis@lemmy.ca
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          5
          ·
          7 months ago

          Well since we’re nitpicking, a kilogram is a unit of mass, not weight. So unless by “kilo” you meant kilonewton…

      • tate@lemmy.sdf.org
        link
        fedilink
        arrow-up
        3
        ·
        7 months ago

        Water isn’t the only ingredient. One liter of flour is not nearly one kilogram. More importantly, the mass of one liter of flour varies a lot depending on how much it settled in the container. That’s why weight is always the better way to measure ingredients.

  • scoobford@lemmy.zip
    link
    fedilink
    arrow-up
    27
    ·
    7 months ago

    Volumetric measurements, like the imperial system, are largely in place due to tradition.

    But no, most people do not own good food scales. They aren’t pricey (I think mine was $25), but they are very uncommon. I don’t think I’ve ever seen one in a store.

    • Chris@feddit.ukOP
      link
      fedilink
      English
      arrow-up
      5
      ·
      7 months ago

      I’m amazed they are that uncommon. Here (UK) you can walk into a supermarket and pick them up for less than £20.

      • palebluethought@lemmy.world
        link
        fedilink
        English
        arrow-up
        12
        ·
        edit-2
        7 months ago

        “uncommon” is an overstatement, you can get them pretty much anywhere that has pots and pans. It’s uncommon in that most people don’t bother owning one, not that they’re hard to get

    • r0ertel@lemmy.world
      link
      fedilink
      arrow-up
      4
      arrow-down
      1
      ·
      7 months ago

      I think you’re right about tradition. I have a set of recipes from 3 generations ago. It’s been converted over the generations from a list of ingredients to “a fistful of flour” to “a juice glass of broth” to “1/3 cup of butter” as it was passed to me. Maybe my contribution will be to convert it to weight and pass it to my kids for them to finally convert it to metric weights.

  • esc27@lemmy.world
    link
    fedilink
    arrow-up
    25
    arrow-down
    1
    ·
    edit-2
    7 months ago

    Cups, teaspoons, and tablespoons in this context are standardized units of measure. It is very common to find at least one set of measuring cups and spoons in a US kitchen. Scales are uncommon.

    I use both. For flour, scales are far, far superior. For sugar, it does not really seem to matter. For small amounts, I suspect my tea/tablespoons might be more accurate than my scale…

    Not that accuracy matters that much in a recipe using eggs. Chickens aren’t necessarily known for precision…

  • HobbitFoot @thelemmy.club
    link
    fedilink
    English
    arrow-up
    26
    arrow-down
    4
    ·
    7 months ago

    Do Americans not own scales or something?

    I do not own a kitchen scale. Outside of baking, volume works well enough.

      • HobbitFoot @thelemmy.club
        link
        fedilink
        English
        arrow-up
        11
        arrow-down
        1
        ·
        7 months ago

        It works well enough for me. Salt is relatively uniform, so there isn’t that much variance. With spices, the variance of the spice strength is greater than the variance caused by compaction.

        Outside of baking, the tolerance required to get a dish to taste good is rather wide.

        • echo64@lemmy.world
          link
          fedilink
          arrow-up
          1
          arrow-down
          5
          ·
          7 months ago

          Salt is the definition of not uniform.

          Try a spoonful of table salt instead of sea salt next time and see how well that goes. In grams it does not matter.

          • HobbitFoot @thelemmy.club
            link
            fedilink
            English
            arrow-up
            3
            ·
            7 months ago

            I have one kind of salt that I cook with. I know that if I use different kinds of salt, it can affect the flavor and concentration, so I just go with one kind of salt.

        • echo64@lemmy.world
          link
          fedilink
          arrow-up
          2
          arrow-down
          8
          ·
          7 months ago

          Oh okay well if you’ve not had an issue then it can’t be one.

          Honestly, what is wrong with the people left on lemmy? Why is everyone like this? There were a few months there where you could talk and have a conversation. Then all the good people left and we just get… this.

          • ickplant@lemmy.world
            link
            fedilink
            arrow-up
            12
            ·
            7 months ago

            I know it’s easy to misread the tone of text, but I certainly didn’t mean to offend you. It’s true that it is a complete non-issue for me and *millions* of Americans. I’m simply stating a fact and in no way judging you or criticizing you for asking the question. I’m giving you an answer from my perspective.

          • chunkystyles@sopuli.xyz
            link
            fedilink
            English
            arrow-up
            3
            ·
            7 months ago

            what is wrong with the people left on lemmy

            You’re the one causing problems here. The others are conversing normally. Lemmy is fine, chill.

      • howrar@lemmy.ca
        link
        fedilink
        arrow-up
        4
        ·
        7 months ago

        Salt tends to be used in such small quantities that you’ll get much larger errors on the typical kitchen scale than with measuring spoons.

      • whoreticulture@lemmy.blahaj.zone
        link
        fedilink
        arrow-up
        1
        ·
        7 months ago

        Never have I ever thought “oh it would be so much easier to pour my flour into a bowl on a scale” rather than just scoop out a few cups.

        • Solemn@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          1
          ·
          7 months ago

          Unironically, I do in fact do this all the time. I make large batches when I bake, so it’s easier to just tare and measure everything directly in the stand mixer bowl instead of scooping 16 cups. It’s also less clean up afterwards!

  • Paraneoptera@sopuli.xyz
    link
    fedilink
    arrow-up
    21
    ·
    7 months ago

    I think it goes back to Fannie Farmer in 1896, who wrote the first major and comprehensive cookbook in English that used any kind of standard measurements. European cookbooks mostly used vague instructions without any standardized weights or numbers before that. At this point in the industrialized world standardized cup measures were relatively cheap and available. Scales were relatively bulky, expensive, and inaccurate in 1896. So the whole tradition got started, and most of the major cookbooks owed something to Fannie Farmer. Cookbooks that used standardized weights probably got started in other countries much later, when scales were becoming commonplace.

  • Lvxferre@mander.xyz
    link
    fedilink
    arrow-up
    20
    ·
    7 months ago

    I’m not American but this is likely due to tradition. Recipe measures it in cups, you follow recipe, you get used to cups, then when writing your own recipe down you do it by cups.

  • Squirrel@thelemmy.club
    link
    fedilink
    English
    arrow-up
    20
    arrow-down
    1
    ·
    edit-2
    7 months ago

    As an American who has recently learned to love his scale, I’m with you 100%. With that being said, no, many Americans do not have kitchen scales.

    • GiddyGap@lemm.ee
      link
      fedilink
      arrow-up
      7
      arrow-down
      1
      ·
      7 months ago

      Just another one of those things where the rest of the world looks at the US and shakes its head. There seem to be a lot of things in the US kept in place purely based on tradition, and logic goes out the window.

      • subtext@lemmy.world
        link
        fedilink
        arrow-up
        5
        ·
        7 months ago

        But also, there’s no real incentive to change… my brownies taste just fine with a 1/3 cup of oil and a 1/3 cup of water. I am sure they would taste just as good with 80 g of each, but if it works, why change it?

        What logic is there in saying grams are better than cups if both work well for the intended task? If I were a professional baker, it’s entirely possible I would have a different opinion, but I (like 99% of Americans) am not.

        • Litron3000@feddit.de
          link
          fedilink
          arrow-up
          3
          ·
          7 months ago

          Oil and water are fine, but flour already starts to be a problem. How densely is it packed?
          Then we go on to salt, which can have a lot of different grain sizes (although that is annoying with a scale as well, because most kitchen scales are not very accurate with single-digit grams).
          Then it gets really weird when they say to use a cup of grated cheese, because depending on how you grate it, it has very different densities.

          • subtext@lemmy.world
            link
            fedilink
            arrow-up
            2
            arrow-down
            2
            ·
            7 months ago

            But what I’m saying is I’m plenty accurate enough with cups… there would be no appreciable difference for my box of brownies.

            • GiddyGap@lemm.ee
              link
              fedilink
              arrow-up
              1
              arrow-down
              3
              ·
              7 months ago

              You’re maybe plenty accurate for the brownies of your preference, but probably not for professional cooking or other activities that require accuracy.

      • Cryophilia@lemmy.world
        link
        fedilink
        arrow-up
        1
        arrow-down
        3
        ·
        7 months ago

        Why should I take an extra step to weigh everything out? Why should I give up some valuable counter space for a food scale? That’s just extra work for no reason.

        • Squirrel@thelemmy.club
          link
          fedilink
          English
          arrow-up
          1
          ·
          7 months ago

          Precision. Volume varies by how tightly something is packed, how finely something is diced, etc. I’ve seen recipes that recommend spooning flour into the measuring cup to ensure it’s not packed in tightly, so you don’t use too much. How much simpler is it to just weigh it?

          • Cryophilia@lemmy.world
            link
            fedilink
            arrow-up
            1
            arrow-down
            1
            ·
            7 months ago

            Unless you’re a professional chef it does not matter if you use 65 grams or 70 grams of something in a recipe. Makes zero difference.

  • ButWhatDoesItAllMean@sh.itjust.works
    link
    fedilink
    arrow-up
    16
    arrow-down
    1
    ·
    7 months ago

    American here, but I do a lot of baking. I do own a scale and prefer to weigh ingredients because I’m amazed at the different quantities of flour I can get from cup to cup depending on how packed the flour is or how I scoop it.

    • Chris@feddit.ukOP
      link
      fedilink
      English
      arrow-up
      7
      arrow-down
      1
      ·
      edit-2
      7 months ago

      Yeah, that’s the sort of thing which worries me. I suppose if it’s a recipe which doesn’t need precise measurements it doesn’t matter.

      • wjrii@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        1
        ·
        7 months ago

        Baking can be troublesome, but it’s usually only flour that gets compacted to a problematic degree. Most good recipes will at least specify “sifted”. Otherwise, volume works about as well, and the cups and spoons will be standardized measurements with only a dim historical connection to the kind of insanity you may be picturing.

        Mass would probably still be better of course, it’s just not quite the literal madness that some think.

        • TheBananaKing@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          7 months ago

          There’s also volume measurements of things like butter and honey, which are their own nightmare. Yeah, I’m totally going to carve 17 3/4 tablespoons of butter…

    • Zess@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      7 months ago

      General rule when measuring by volume is to not pack it unless the recipe says to.

  • EfreetSK@lemmy.world
    link
    fedilink
    arrow-up
    15
    ·
    7 months ago

    I’m with you but I get it that sometimes it’s convenient. My wife likes what we call “cup recipes” in baking where everything is measured in cups/glasses (this was a new thing a couple of years ago where I live). It’s very fast and convenient.

    But yes, it gets out of hand. I mean “a cup of celery”? … How? Why?

    • Halosheep@lemm.ee
      link
      fedilink
      arrow-up
      6
      ·
      7 months ago

      I much prefer when they just estimate how many of the particular vegetable I should probably use. A cup of celery? Like 1-2 celeries?

    • jbrains@sh.itjust.works
      link
      fedilink
      arrow-up
      5
      arrow-down
      2
      ·
      7 months ago

      Take a 1-cup measuring cup, chop celery until it’s full. That doesn’t sound difficult to me. I infer it’s merely not what you’re used to.

      I tend to prefer to weigh ingredients, but I also have measuring cups and spoons and using them is not so onerous. 🤷‍♂️

      • Strykker@programming.dev
        link
        fedilink
        arrow-up
        8
        arrow-down
        1
        ·
        7 months ago

        But celery is blocky and has gaps and doesn’t pack well; the amount you get changes drastically depending on how fine you chop it and on random packing.

        • jbrains@sh.itjust.works
          link
          fedilink
          arrow-up
          5
          arrow-down
          1
          ·
          7 months ago

          I’m not arguing that it’s wise. I’m merely arguing that it’s not nearly as inexplicable as that comment made it seem.

        • Cryophilia@lemmy.world
          link
          fedilink
          arrow-up
          3
          arrow-down
          1
          ·
          7 months ago

          Why do you care about the tiny variations in volume? Recipe measurements very rarely need to be precise.