Some recent Facebook postings brought up an old notion that's been something of a hot-button issue for me. In this case, it was a comment by an IT guy about how stupid the people s/he was supposed to be helping were. I understand perfectly how frustrating it might be to see computer-challenged people doing really dumb things - like not trying a reboot before calling the IT folks, or forgetting to plug something in, or not backing up important files. I can see how that might breed such disregard. However, IT folks are a type of administrative support for the workers in the organization. They don't actually do the productive work of their organization - their job is to help the people who do the actual work for which the organization exists in the first place.
Is there anyone who hasn't been frustrated with some snarky bureaucrat who rejects your applications because you're not filling out the paperwork correctly, or who stonewalls you in your efforts to do something because there's some sort of a rule against it? What they do is give you all the reasons why you can't do something, rather than offering to work with you to find a way you can accomplish your intentions. This shows an obvious contempt for the people who come to them for help. From small businesses to giant corporations and massive government agencies, the presence of the bored, sarcastic, sneering admin type is a near universal. Not everyone in such positions is that way, of course, but a widespread culture of contempt exists in the workplace (and elsewhere), at least in my experience, and I know many others have experienced it.
When I was in the Naval Reserve, my monthly drill weekend job was to work in the training office. Our office existed to help all the sailors in the unit set up their training plans - mostly arranging for their annual (usually in the summer) period of "active duty for training" and obtaining the materials for the correspondence courses they needed for rank advancement. I had no prior experience or training for this job (it was all OJT), so I found myself emulating the other guys in the training office and their contempt for their customers - rejecting paperwork because it hadn't been filled out correctly, and so on. When our division officer (a very good officer and a friend) got wind of what we were doing, he called us all in and proceeded to read us the riot act about our attitudes. I was filled with shame over what I'd been doing. How many times had I been on the other side, encountering that contempt from some officious prick of a bureaucrat? Needless to say, the other guys in the training office and I changed our ways and tried to be as helpful as possible. The other sailors didn't know our job very well, of course, so it became our job to help them succeed in their goals, no matter how silly their efforts might seem to us.
One more anecdote in a related vein: when I was a grad student, I wrote a computer program that, because I was such an incompetent coder, took about 12 hours to run on the machine we had at the time. [Later, I was able to change the code to make it at least 10 times faster, but that's another story.] The students who were the computer operators asked me to stand by for a while in the evening as the job ran, so that if it crashed early, I might be able to fix my mistakes and still get it done that night. So I had a lot of time to sit and chat with the student operators. One night they told me that one of the senior scientists in the lab was always dissatisfied with what the operators did, complaining to their supervisor about them all the time, often for things that weren't actually their fault. So, naturally, the students found many creative ways to sabotage his jobs! Tit for tat, baby! Considering I could see for myself that the students were doing everything they could to make my project successful, I was horrified to hear about a self-centered scientist who treated them so badly. In fact, I made it a point to mention how much I appreciated their efforts when I talked with their supervisor. It makes no sense to create unnecessary trouble for people you depend on to get your projects done. If for no other reason than self-interest (to say nothing of being a decent human being), you should always treat your "subordinates" with respect, because your success depends on their help. So the contempt culture can work both ways.
Very few of us work in a vacuum. Virtually all of us depend on others to get our work done (right down to the maintenance staff), or we serve others who need our help. There's no good reason to treat others with undeserved disrespect. Every person in an organization has a job to do that is important to that organization - otherwise, that job wouldn't exist. Why treat co-workers with routine contempt? There's no valid reason to feel a sense of superiority about yourself and your work while looking down on whatever anyone else does. This sort of disregard for the work of others is unfortunately pretty widespread. Personally, I find that if I treat everyone as an important part of the work we all are trying to do, it pays dividends for me in many ways. Furthermore, it simply feels good and seems to be the right way to behave. You only get respect if you give it!
Wednesday, November 4, 2015
Strange bedfellows? - Meteorology and Social Science
I recently gave a talk at the National Weather Association's Annual Meeting in OKC on the role for social science in the weather forecasting business. This is something I've long been saying is needed, as a result of my long friendship with the late Al Moller. It was Al who first made it clear to me that just putting out a good forecast is only the start of a chain of events that must take place if the forecast is to be of any value to the users. By the way, I should give credit to the late Allan Murphy for his insight that the value of a forecast is never determined by the forecaster, but by the user of the forecast.
Anyway, I've presented this concept in several talks, usually with a tweak or two based on my most recent thoughts regarding the topic. For a forecast to be effective, it must be:
1. Honest
2. Accurate
3. Received
4. Understood
5. Believed
6. Helpful in making a decision
Only the first two items are under the control of forecasters. Once a forecaster transmits a forecast, the rest of the steps depend on others. Forecasters are paid to forecast, not to do all these other things; meteorology is what they're educated and trained to do, and we shouldn't expect them to become social scientists as well! Moreover, most forecasters know little or nothing about how to help users make good decisions. Note that most users have to account for many factors besides the weather when making a decision, and most forecasters know little or nothing about those other factors. Weather information is just one part of the decision-making process. Continued progress on the meteorological side is more or less built into the system, so while this post is mostly about the social sciences, the meteorology will keep forging ahead, without doubt.
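To make Murphy's point concrete, here's a minimal sketch of the classic cost-loss decision model that often comes up in discussions of forecast value. The user names and numbers below are purely illustrative assumptions of mine, not anything from an operational system: a user who can protect against an event at cost C, or risk a loss L if the event happens unprotected, minimizes expected expense by protecting whenever the forecast probability p exceeds C/L.

```python
# Illustrative sketch of the cost-loss decision model.
# C = cost of taking protective action; L = loss if the event occurs
# and no action was taken. A user minimizing expected expense should
# protect whenever p * L > C, i.e., whenever p > C/L.
# All numbers here are made up for illustration.

def should_protect(p: float, cost: float, loss: float) -> bool:
    """True if protecting (expense = cost) beats doing nothing
    (expected expense = p * loss)."""
    return p * loss > cost

forecast_p = 0.3  # the same forecast goes to every user

users = {
    "orchard owner": {"cost": 500.0, "loss": 20000.0},   # C/L = 0.025
    "roofing crew":  {"cost": 2000.0, "loss": 5000.0},   # C/L = 0.4
}

for name, cl in users.items():
    decision = should_protect(forecast_p, cl["cost"], cl["loss"])
    print(f"{name}: protect = {decision} (C/L = {cl['cost'] / cl['loss']:.3f})")
```

Run it and the orchard owner protects while the roofing crew does not, even though both received the identical forecast - the value is realized (or not) by the user, not the forecaster.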
The existing watch-warning system, which began to take form in the mid-1960s, has saved many lives. It's not a "broken" system, despite the fact that it likely is far from perfect. I think I can justifiably assert that no one actually knows just how effective it is, but there are obvious declines in fatality and injury rates that seem to justify the existing system. If we intend to change it, let's follow the famous dictum (often erroneously attributed to the Hippocratic Oath): First, do no harm! Don't make stupid bureaucratic decisions in haste, just for the sake of doing something, without first having a clear picture of the shortcomings of the existing system and having a proposed change that has been given something like peer review - some sort of vetting process that includes participation by both forecasters and users, who after all are the ones with the most at stake in any changes.
We meteorologists need help from social scientists: experts in communications, psychology, economics, and so on - fields in which we have no expertise. We need their participation so that any studies reviewing the current situation, and any proposals for change, take human factors properly into account. We need surveys of what the public knows and actually does under the current system, across a very broad spectrum of users, since "the public" is far from a monolithic block. Ideas for changes need thorough testing to see whether they actually improve on the existing system by helping users make their own decisions. We do not necessarily need to be telling users what to do! What we really need is to learn how to make our products more effective at helping users make good decisions with the information we can realistically provide for them (including uncertainty information!).
Although the movement to get social science into meteorology has been percolating for quite some time, and has become something of an "in" thing to advocate, what's been absent is much real collaboration between meteorologists and social scientists to produce actionable results. We don't need more conferences, workshops, and other "feel good" exercises. We need folks to roll up their sleeves and start getting some useful results to provide a scientific basis upon which to move forward. No more kumbaya songs around the campfire, please. Let's make something real and substantive, not just endless palaver. When we make changes, they need to be tested to make sure they're doing what we wanted them to do. The process should be one of never-ending evaluation and revision (see below).
If social science is to have a role in weather forecasting, and I think it should, then what might that look like? Does it make sense to have a token social scientist in every forecast office? I think that makes little sense. What about a Social Science Center, comparable to the Storm Prediction Center? I suspect the NWS would have heartburn setting up something like that. What does the agency know about the skills needed to make it effective? How would it know how to pick the right people, or what resources to provide for them? I think the best path for integrating social science into weather forecasting is for the NWS to have a budget with funding specifically designated to support ad hoc collaborative efforts with social scientists, who would stay at their own institutions but form working partnerships with meteorologists to answer specific questions. This gives the most flexibility and avoids creating isolated "lone wolves" or some bureaucratic agency that would be a stranger in the strange land of weather forecasting.
Further, the NWS needs to understand this effort is not some sort of one-time project. It must be a continuous process, because the social, cultural, technological, and meteorological landscapes are constantly changing. We shouldn't have to wait decades for it to become painfully clear the system needs to evolve in the face of change. However, it will take some time to gather data about what works and what doesn't work with the current system, and even more time to develop and test proposed changes. Finally, before implementing anything, there should be an extensive public education campaign to help users understand the forthcoming changes. Let's not make the same mistake we made when probability of precipitation (PoP) was implemented in the mid-1960s. We're still paying the price in credibility for that one!
If this is done right, it will be a boon to weather forecasting and our whole society will reap the benefits. Please, let's not screw it up this time!