Since being laid off from a few teaching positions, I've gone back to tutoring students in math, stats, chem, physics, etc.
I've had lots of students come in for help with their intro stats classes. It's abundantly obvious to me that their profs are pure mathematicians merely going through the motions of teaching stats. I've also had to modify how I teach my own stats classes to help my students get better grades on their online HW sets.
All of my students are expected to have a TI-83 or better calculator. If they don't, I have 18 I can lend out, bought used with my meager salary for about $10-15 each. I show them how to use their calculators to the fullest extent and give them test problems where they need those skills. However, I keep having to take time away from teaching useful things to explain to my students, both in class and in tutoring, that the problems they see in the online HW and on other profs' tests assume it's 1923, NOT 2023.
For example, yesterday I had to walk four different students through calculating confidence intervals. The problem that came up each time read: Part A) Find the point estimate, B) Find the margin of error, C) Find the xx% confidence interval. The online problems HAD to be done in that order. Each time, I had to discuss the use of the Z-table or the (woefully inadequate) t-table in the back of the book. We had to discuss how, when n1 + n2 > 30 (or 45, or 60, depending on the book), they should just use a Z-value instead of a t-value. Then we worked on online HW problems where the accepted answers were computed with typical software or a TI-83/84 calculator and NOT with what the textbook teaches. Other times, we had to use the Z approximation to match the answer key.
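To make the "order matters" point concrete, here's the textbook-order workflow sketched in Python. The sample numbers (xbar, s, n) are made up for illustration, and 1.96 is the rounded critical value as read off a Z-table:

```python
# The textbook ("1923") workflow: look up a rounded critical value in a
# Z-table, then build the three parts in the required order.
from math import sqrt

xbar, s, n = 64.2, 8.5, 49     # hypothetical sample mean, std dev, size
z_table = 1.96                 # 95% critical value as read from a Z-table
                               # (n > 30, so the book says use Z, not t)

point_estimate = xbar                                  # Part A
margin_of_error = z_table * s / sqrt(n)                # Part B
ci = (xbar - margin_of_error, xbar + margin_of_error)  # Part C

print(point_estimate)                # 64.2
print(round(margin_of_error, 2))     # 2.38
print(round(ci[0], 2), round(ci[1], 2))
```

Three lookups and hand calculations for what a single ZInterval call on the calculator produces at once.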
Meanwhile, their calculators can answer the same questions with the built-in confidence interval functions. But then they'd have to answer it as: A) What is the point estimate? B) What is the xx% confidence interval? C) What is the margin of error? Doing the problem in this order lets students get through it in seconds instead of minutes, with a lot less frustration too.
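The calculator-order workflow looks like this in Python (same hypothetical sample numbers as above; `statistics.NormalDist` plays the role of the calculator's interval function): build the interval first, then read the point estimate and margin of error straight off its endpoints.

```python
# Calculator-order workflow: compute the interval in one shot, then the
# point estimate and margin of error fall out of the endpoints.
from math import sqrt
from statistics import NormalDist

xbar, s, n = 64.2, 8.5, 49     # hypothetical sample mean, std dev, size
conf = 0.95

z = NormalDist().inv_cdf((1 + conf) / 2)   # exact z*, ~1.95996, not the rounded 1.96
lo = xbar - z * s / sqrt(n)
hi = xbar + z * s / sqrt(n)                # B) the 95% confidence interval

point_estimate = (lo + hi) / 2             # A) midpoint of the interval
margin_of_error = (hi - lo) / 2            # C) half the interval's width

print(round(lo, 3), round(hi, 3))
print(point_estimate, round(margin_of_error, 4))
```

Note the margin of error here (~2.3800) differs slightly from the table-based 2.38 only because the table rounds z* to 1.96, which is exactly the "software answer vs. textbook answer" mismatch the online HW keys trip over.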
Along with those issues, there are also the normal approximation methods for the binomial and Poisson distributions. The exact probabilities are basic functions on a TI-83; there's no need for the approximations anymore. But in order for my students to get the points on their online HW, I have to waste class time saying, "Sometimes we have to assume it's 1923, and this is how we will answer these questions." Which almost always brings up the question, "Does <programming language/package> solve problems this way?" Since the answer is "NO", I then have to explain why they will be tested on, and given HW problems based upon, both the approximation methods and the exact values from the calculator/software. Which adds to their frustration AND increases the amount of time they have to waste on exams and HW.
There are dozens more things I could complain about, too. But for now, I think this is enough.
Does anyone have any good, factually based reasons why we insist students assume it's the 1920s?
Statistician, Chemist, HPC Abuser;-)