While peat regenerates, it does so slowly, so they might actually have burned off some measurable elevation (on average)
At first it looked to me like he quickly sat up and had half the ammunition within a half-kilometer radius sent his way, with a grenade for good measure. It makes sense that the grenade was already there when he got up.
That’s what major versions are for: breaking changes. Regardless, you should probably be able to fix this with some regex hackery. Something along the lines of
new_file_content = re.sub(r'\bprint[ \t]+(?!\()(.*)', r'print(\1)', old_file_content)
should do the trick.
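For anyone who wants to try it, here’s a self-contained sketch of the same idea (hypothetical function name; it only handles simple one-line print statements, not print >> f, trailing commas, or multi-line calls):

```python
import re

def py2_print_to_py3(source: str) -> str:
    """Wrap bare Python 2 print statements in parentheses.

    Only covers simple one-line statements; `print >> f`,
    trailing commas, and multi-line arguments are not handled.
    """
    # Match "print", horizontal whitespace, then anything that isn't
    # already an opening parenthesis, up to the end of the line.
    return re.sub(r'\bprint[ \t]+(?!\()(.*)', r'print(\1)', source)

old_file_content = 'print "hello"\nprint(42)\n'
new_file_content = py2_print_to_py3(old_file_content)
print(new_file_content)  # the bare statement gets wrapped; print(42) is untouched
```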
For someone starting out, I would say that a major advantage of Python over any compiled language is that you can just create a file and start writing and running code. With C++ (which I’m also a heavy user of), you first need to get over the hurdle of setting up a build system, which is simple enough once you know it, but can quickly be a high bar for an absolute beginner. That’s before you start looking at things like including and linking other libraries, which in Python is a simple import, but which in C++ requires setting up your build system properly.
Honestly, I’m still kind of confused that the beginner course at my old university still insists on handing out a pre-written makefile and VS Code config files to everyone instead of spending the first week showing people how to actually write and compile hello world with CMake. I remember my major hurdle when leaving that course: I knew how to write basic C++, I just had no idea how to compile and link it once I could no longer use the makefile we were explicitly told never to touch…
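For reference, the setup being skipped is genuinely small; a minimal sketch (file and target names are arbitrary):

```cmake
# CMakeLists.txt -- minimal hello-world project
cmake_minimum_required(VERSION 3.16)
project(hello CXX)
add_executable(hello main.cpp)

# Then, from the project directory:
#   cmake -B build        # configure: generate the build system in ./build
#   cmake --build build   # compile and link
#   ./build/hello         # run it
```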
I would also like to know what the slur is
Idk why you guys are so passionate about this whole rounding thing? Rounding off 107 to 100 doesn’t change the information, only the precision. It’s not easier to interpret 200 than 212 or anything?
If you want quick conversion, just
F ≈ 2 * C + 30
Centi = 1e-2, deci = 1e-1
Regards,
Non-American
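P.S. The shortcut holds to within a few degrees over everyday temperatures; a quick check in Python:

```python
def f_exact(c):
    """Exact Celsius-to-Fahrenheit conversion."""
    return c * 9 / 5 + 32

def f_approx(c):
    """The mental shortcut: F ≈ 2*C + 30."""
    return 2 * c + 30

for c in (0, 10, 20, 30):
    print(f"{c} C -> exact {f_exact(c)} F, approx {f_approx(c)} F")
# The two agree exactly at 10 C and drift by about 1 F per 5 C away from it.
```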
What does that have to do with non-Euclidean geometry?
Yes, it’s a field. Specifically, a field containing human-readable information about what is going on in adjacent fields, much like a comment. I see no issue with putting such information in a JSON file.
As for “you don’t comment by putting information in variables”: in Python, your objects have the __doc__ attribute, which exists specifically for this purpose.
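For illustration, both halves of the argument side by side (field and function names here are made up):

```python
import json

# A human-readable "comment" field sitting next to the data it describes.
config = json.loads("""
{
  "_comment": "retry_limit applies to network errors only",
  "retry_limit": 3
}
""")
print(config["_comment"])

# Python's closest analogue: the docstring, stored on the object itself.
def retry(operation):
    """Retry `operation` up to retry_limit times."""
    ...

print(retry.__doc__)
```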
I never understood that. Apparently they use it as a primary way of messaging each other? At least that’s what younger relatives have told me. I’ve tried to have them explain what makes the app designed to hide/delete stuff after it’s been read better for communication, but so far haven’t gotten an explanation I could make any sense of.
“Enshittification will continue until revenue improves”
I’ve found that regex is maybe the programming-related thing GPT is best at, which makes sense given that it’s a language model, and regex is just a compact language with weird syntax for describing patterns. Translating between an English description of a pattern and regex shouldn’t be any harder for that kind of model than any other translation.
In general I agree: ChatGPT sucks at writing code. However, when I want to throw together some simple stuff in a language I rarely write, I find it can save me quite some time. Typical examples would be something like
“Write a bash script to rename all the files in the current directory according to <pattern>”, “Give me a regex pattern for <…>”, or “write a JavaScript function to do <stupid simple thing, but I never bothered to learn JS>”
Using it as a regex pattern generator is especially nice. It can also help when you’re learning a new language and just need to check the syntax for something, which is often quicker than wading through some GeeksforGeeks blog about why you should know how to do what you’re trying to do.
My test suite takes quite a bit of time, not because the code base is huge, but because it consists of a variety of mathematical models that should work under a range of conditions.
Since the models are meant to be interchangeable, this makes it very quick to write a test that’s basically “check that every pair of models gives the same output for the same conditions” or “check that re-ordering the inputs in a certain way does not change the output”.
If you have 10 models, with three inputs that can be ordered 6 ways, you now suddenly have 60 tests that take maybe 2-3 sec each.
Scaling up: It becomes very easy to write automated testing for a lot of stuff, so even if each individual test is relatively quick, they suddenly take 10-15 min to run total.
The test suite is now ≈2000 unit/integration tests, and I have seen a single one of them fail and thereby uncover an obscure bug.
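A minimal sketch of that pattern in plain Python (the models here are made-up stand-ins; in practice this would be parametrized over the real ones, e.g. with pytest):

```python
import itertools

# Hypothetical stand-ins for the real mathematical models: each takes
# the same three inputs and should produce the same output.
def model_a(x, y, z):
    return x + y + z

def model_b(x, y, z):
    return z + y + x

MODELS = [model_a, model_b]
CONDITIONS = [(1.0, 2.0, 3.0), (0.5, -1.0, 4.0)]

# "Every pair of models gives the same output for the same conditions."
for m1, m2 in itertools.combinations(MODELS, 2):
    for cond in CONDITIONS:
        assert m1(*cond) == m2(*cond)

# "Re-ordering the inputs does not change the output."
for model in MODELS:
    for cond in CONDITIONS:
        for perm in itertools.permutations(cond):
            assert model(*perm) == model(*cond)

print("all model-consistency checks passed")
```

The number of checks multiplies automatically: every extra model or condition adds a whole row of tests for free, which is exactly how a modest suite balloons to thousands of cases.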
Ngl you had me until the 1772 bit
First of all, that speech is awesome.
But I want to comment on something regarding modding, and ask an honest question: shouldn’t reiteration of historical speeches or texts be exempt from rules about slurs? I mean, reiterating a speech, or a section of Huckleberry Finn, is obviously not the same thing as devaluing someone by calling them a slur. We actually have quite a hot debate going on in my country about this right now, where some teachers were harassed for “being racist” because, in class, they read aloud a famous poem written by an immigrant about racism, in which he quotes some of the things that were shouted at him. The whole point of the poem, and of reading it in class, is of course to show how bad racism is and to educate about it. Still, these teachers have been branded “racists” because they reiterated specific words in the poem.
For the honest question (I’m not American or a native English speaker): isn’t there a historical difference between the word “Negro” and a certain similar word I’ll refrain from reiterating? The way I’ve understood it, the former is a historically more neutral form that was simply used the way we today would use “black person”, while the latter has more or less always carried some kind of devaluing undertone. I’ve gotten that interpretation, among other things, from reading speeches where people promote equal rights and use “Negro” to refer to black people, while clearly not believing they are inferior in any way (hence the promotion of equal rights). Of course, today both words are considered unacceptable, but I would like to know if I’ve misunderstood, as it helps in interpreting things that were said or written in the past.
I would love to see Harris just stop for a second, turn towards Trump, and say something like “Your mic is turned off, you know, could you stop yelling for a moment?”, and have the cameras cut to a silent video of Trump furiously yelling at his turned-off mic.
This is a very “yes, but still no” thing in my experience. Typically, I find that if I write “naive” C++ code, making no effort to optimise anything, I’ll outperform Python code that I’ve spent time optimising by a factor of 10-30 (given that the code is reasonably complex; this obviously isn’t true for a simple matrix multiplication where you can use numpy). If I spend some time on optimisation, I’ll typically outperform Python by a factor of 50+.
In the end, I’ve found it’s mostly about what kind of data structures you’re working with, and how you’re passing them around. If you’re primarily working with arrays of some sort and doing simple math on them, some numpy and scipy magic can get you speeds that beat naive C++ code. On the other hand, when you have custom data structures that you want to avoid unnecessarily copying, just rewriting the exact same code in C++ and passing things by reference can give you massive speedups.
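To make the array case concrete, a small sketch (sizes are arbitrary; the point is an interpreter round-trip per element versus one call into native code):

```python
import numpy as np

a = np.arange(1_000, dtype=np.float64)
b = np.arange(1_000, dtype=np.float64)

# Pure-Python loop: every index, multiply, and add goes through the interpreter.
def dot_loop(u, v):
    total = 0.0
    for i in range(len(u)):
        total += u[i] * v[i]
    return total

# The numpy version: a single call into optimised native code.
def dot_numpy(u, v):
    return float(np.dot(u, v))

# Same result (up to floating-point rounding), wildly different speed
# on large arrays.
assert np.isclose(dot_loop(a, b), dot_numpy(a, b))
```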
When I choose C++ over Python, it’s not only because of speed. It’s also because I want a more explicitly typed language (which is easier to maintain), overloaded functions, and some actual knowledge of the memory layout of what I’m working with.
I would say “debunked” in the sense that quantum mechanics correctly predicts phenomena that don’t exist in classical physics, and relies on the idea that quantum particles obey a probability distribution, rather than deterministic mechanics.
Quantum mechanics appears to work so well for these phenomena compared to deterministic mechanics that it’s tempting to say that the actual universe is in fact governed by probabilities rather than determinism.
I would argue that all physical models of the universe are just that: models. We can get asymptotically closer to a perfect description of the universe, but no model can ever tell us the true nature of the underlying system it describes; it can only be an arbitrarily good description of it.