I've not had enough coffee today, so I asked ChatGPT to help me do some basic maths (i.e. calculate the volume of a cylinder which is 38 cm long and 18 cm in diameter). This is the response:
To calculate the volume of a cylindrical object, you can use the formula: V = π × r^2 × h, where r is the radius and h is the height (or length in this case) of the cylinder.
First, let's find the radius (r) by dividing the diameter (18 cm) by 2:
r = 18 cm / 2 = 9 cm
Next, we'll convert the length (38 cm) and the radius (9 cm) to meters for consistency in units:
Length (h) = 38 cm = 0.38 meters
Radius (r) = 9 cm = 0.09 meters
Now, we can calculate the volume (V) using the formula:
V = π × 0.09^2 × 0.38 ≈ 0.0386 cubic meters
Finally, to convert the volume to liters, remember that 1 cubic meter is equal to 1000 liters:
Volume ≈ 0.0386 cubic meters × 1000 ≈ 38.6 liters
So, the volume of the object is approximately 38.6 liters.
That seems wrong to me, and I calculate it at 9.669 litres. Who is right?
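For anyone who wants to sanity-check it, the whole sum fits in a few lines of Python (just a quick sketch of the formula quoted above, nothing clever):

```python
import math

# Cylinder: 18 cm diameter, 38 cm long
diameter_cm = 18
length_cm = 38
radius_cm = diameter_cm / 2  # 9 cm

# V = pi * r^2 * h, in cubic centimetres
volume_cm3 = math.pi * radius_cm**2 * length_cm

# 1000 cm^3 = 1 litre
volume_litres = volume_cm3 / 1000
print(f"{volume_litres:.3f} litres")  # about 9.670
```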
Even without doing the maths, I don't think an object of that size could hold 38 litres
ChatGPT is talking bollocks.
Even without doing the maths, you can tell that a small cylinder 18 x 38 cm is nowhere near 38 litres - that's just nonsense.
I don't think it could hold 9 litres either.
My bet is on 0.38l
It's 9 litres.
ChatGPT has used (PI*r)^2 instead of PI*r^2.
edit: no, that's close but not exactly what's gone wrong. As above/below, the text is correct but the calculation uses 18 instead of 9 for some reason. I always think of a 10 x 10 x 10 cm box holding 1 litre.
He's taken r as 18, not 9, doh.
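That does seem to be it: plugging the diameter in where the radius should go reproduces ChatGPT's number almost exactly. A quick check in Python, using the figures from the quoted answer:

```python
import math

length_m = 0.38
radius_m = 0.09
diameter_m = 0.18

correct = math.pi * radius_m**2 * length_m   # ~0.00967 m^3, i.e. about 9.67 litres
buggy = math.pi * diameter_m**2 * length_m   # ~0.0387 m^3, i.e. about 38.7 litres
```

The second result is within rounding of the 0.0386 m³ it printed, so it squared 0.18 rather than 0.09 despite writing the formula out correctly.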
You're right, and as above, visualising a cylinder 18 cm x 38 cm, it wouldn't hold 38 litres.
Damn it! 20 seconds!
I'm very much with you. Not quite 10 litres, which feels about right for something 7" in diameter and a bit over a foot long. So a couple of gallons, roughly speaking.
Normally, I'd accept ChatGPT results but even I realized 38 litres was way off!
V = π × 0.09^2 × 0.38 ≈ 0.0386 cubic meters
The inputs are correct and the equation is correct. The actual calculation is where it goes wrong.
Do that sum on a calculator and you get 0.0097 m^3, i.e. 9.7 litres.
What's odd is that it sets out the calculation correctly, at a glance, but manages to get the actual sum wrong, straightforward multiplication as it is. So Vlad is completely correct - 9.67 litres.
So that's it for chatgpt. OP I'm about to paste the minutes of a work meeting I'd like rendered as a set of Limericks please if that's okay?...
...I was trying out Bard and it gave me the wrong answer to something tedious about timings for a trip, altering it when I pointed out flaws but still getting it vastly wrong. I asked it why it had been giving me repeated wrong answers and got the great reply "because that's what I thought you wanted to hear". (And then I was told off by my wife, who felt sorry for it.)
I am a science teacher and I know that I will have to put up with more and more of this slapdash laziness. My colleagues in the languages departments are already up against it with Google Translate. All departments have to deal with copy+paste on a daily basis. Teaching kids that taking shortcuts is not always a good thing is a very difficult thing.
PS. BIDMAS and a quick bit of mental arithmetic give a value of about two gallons.
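For the metric-to-Imperial bit, the conversion is a one-liner (assuming UK gallons at 4.54609 litres each):

```python
LITRES_PER_IMPERIAL_GALLON = 4.54609

volume_litres = 9.67
gallons = volume_litres / LITRES_PER_IMPERIAL_GALLON  # about 2.13 gallons
```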
Quite a few examples of ChatGPT getting calculations like this wrong. It's quite interesting and shows something about how these LLMs work.
ChatGPT is predictive text, nothing more. It's not a search engine, it is not intelligent, it just generates phrases by putting together words that are commonly found in close proximity.
If you believe that the stuff it comes out with has any factual validity, you deserve what you get. It's shocking that it's been so widely misunderstood and misused.
You might as well use predictive text on your phone.
It's a good day to have to go back in a few days and I don't know if you are going up north north of the hill and the sun and the other people were talking to the other about that and I think I have to do it for myself or a bit of the other one.
(That last bit was actually predictive text.)
ChatGPT got it wrong when asked what relation my daughter is to my cousin.
It initially said 2nd cousin, I then followed up by asking whether it's actually 1st cousin once removed and it agreed that it had been wrong, then gave a detailed answer about why I was right.
Doesn't instil confidence 😁
ChatGPT is predictive text, nothing more
It's pretty good predictive text. "Write a research protocol for x,y,z" "revise as an interventional/observational study" ... It's not that it does it brilliantly, but it does pretty well and in seconds not hours.
Like Google translate which may be bad for language learning (I've no idea), but is a pretty handy tool for making sense of documents in languages you don't know (that chatgpt could summarise in bullet points) etc etc.
Obv you triangulate all this stuff and it's your lookout if you're just passing on stuff you don't understand/can't verify (the OP is a good illustration).
Anyway I for one welcome our new overlords and am trying to make sure I use this stuff rather than get used by it (but I suspect I know how it's going to go...)
US lawyers have been censured by the Courts there for using AI to research legal precedent. When asked where the case State of Michigan v Sue, Grabbit and Run* could be referenced it replied using proper legal database names, but the lawyers didn't check further
*I might have made the case up, but the story is true
timba
US lawyers have been censured by the Courts there for using AI to research legal precedent. When asked where the case State of Michigan v Sue, Grabbit and Run* could be referenced it replied using proper legal database names, but the lawyers didn't check further.
*I might have made the case up, but the story is true
iirc the real story (which is no better than what you've written) is covered in episode 547 of Freakonomics Radio. Even if that's the wrong episode, it's a really good listen: "Satya Nadella's Intelligence Is Not Artificial", a discussion with the Microsoft CEO that certainly covers ChatGPT.
You only have to look at the size of a 20L Camelbak to know GPT's answer is bollocks.
I always think of a 10x10x10cm box holding 1litre
Always a useful argument against those demanding a return to Imperial Units - so if a box is 4"x4"x4", approximately what volume of water will it hold? 🙂
64 cubic inches.
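And the metric answer to the same question, sketched out (at 2.54 cm to the inch):

```python
CM_PER_INCH = 2.54

side_in = 4
cubic_inches = side_in**3        # 64 cubic inches
side_cm = side_in * CM_PER_INCH  # 10.16 cm
litres = side_cm**3 / 1000       # about 1.05 litres
```

So a 4" cube holds about 1.05 litres - awkwardly close to, but not exactly, the 1 litre a 10 cm cube gives you.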
Much swearing has just been shouted.
Have spent the last week working through first order modeling and differentiating trig equations.
Thought I was on the right path yesterday but it's been a struggle.
Totally forgot about ChatGPT and it's just broken down the method for me.
I was on the right path but going very slowly.
Well, today's work will be to see if I can get this to work.
* I'm not great at maths but can apply methods to different tasks, though I don't really fully understand it unless I apply it to a problem/project.