Nanoscientists and Ethics - Paper of the Week - June 16th

This week’s paper can be found here: Scientists’ Ethical Obligations and Social Responsibility for Nanotechnology Research

Note: Yes, I’m late this week. Normally, I try to have things out by Wednesday, but this week got a little crazy with planning for the Student Summit (if this is your first time on my blog, WELCOME and more details on the student summit can be found here), so I’m just now getting to all the other things. Planning to get back on track this week!

Also, this is another paper that is behind a paywall. In the future, I'm going to do my best to pick papers that are open access, but you don't need the full text to get the idea for this one.

Welcome back to Paper of the Week!

For the first time ever, I did not find this paper on Twitter. In fact, it is not a new paper - it was published in 2016. It’s also not on “hard science” - instead, we’re talking about science advocacy.

Why am I writing about this? Well, in an era where new technology touches the average person's life well before governments can regulate it, scientists are one of the few groups with the power to explain the possible consequences of using that technology, positive or not. However, communication and advocacy have not historically been a big part of academic culture, even between scientists, so we haven't seen scientists step up to the plate as much as (I believe) they could. Similarly, science is often driven by whether or not we can do something, not whether or not we should, and I think the latter question should become a bigger part of the scientific process. Just because we can develop new technology doesn't mean we should ignore the ways that technology can be misused.

On to the paper: Today, we’re focusing on nanotechnology. This paper, titled “Scientists’ Ethical Obligations and Social Responsibility for Nanotechnology Research,” was written by Dr. Elizabeth Corley, Dr. Youngjae Kim, and Dr. Dietram Scheufele at Arizona State University and the University of Wisconsin-Madison.

Corley, Kim, and Scheufele note right off the bat that policy change often lags behind emerging technology, allowing new science to go unchecked by the government, and that there has been some progress within the scientific community with regard to policy advocacy. Similarly, there have been studies on how the average person’s opinions on new technology depend on their personal values. However, there had not been much research into how scientists’ and engineers’ opinions on new technology are affected by their values. Luckily for us, Corley, Kim, and Scheufele decided to ask some nano-scientists, and they report their findings.

What did they find out? Well, a lot of things:

  • Female nano-scientists had a stronger sense of social and ethical responsibility for their research than male scientists.

  • Academic scientists were more likely than non-academic scientists to say that lab directors have an ethical obligation to keep their staff safe.

  • Academic scientists were more likely than non-academic scientists to say that scientists should be allowed to do research as long as they respect ethical standards.

  • Non-academic scientists were more likely than academic scientists to say that federal funding agencies should require the labs they fund to have guidelines to protect their workers.

  • Sense of social and ethical responsibility did not vary with political affiliation.

  • Scientists who supported nanotechnology regulations at the local or state level had a weaker sense of responsibility.

  • Scientists who paid more attention to media coverage of social/ethical issues were more likely to think that authorities should require scientists to respect ethical standards.

  • Scientists who had a higher risk perception - that is, who felt that they might experience negative consequences due to research activities - were less likely to believe that scientists should be allowed to do whatever research they want to.

  • And much more!

This paper had several hypotheses and even more results, so I’m going to cut it short there. There are a few things to keep in mind when you think about these results:

  • The survey they sent out was answered fully by 444 people. Given that there are far more than 444 nano-scientists in the world (let alone scientists in general), these results can’t necessarily be generalized, and probably shouldn’t be.

  • 82.5% of the people who answered the survey were men. So, when I say that female scientists had a stronger sense of responsibility than men, keep in mind that the results may be a bit biased because of the small number of women who responded.

  • Similarly, 2% of the people who responded were Hispanic, and 0.5% were African American. 63.5% identified as White, and 31.8% identified as Asian. In other words, racial diversity was not well represented in this survey, and a more diverse set of survey respondents might lead to different results.

  • 74% of the people who responded were academic researchers, so there may also be some bias in those comparisons.

What can we take from this? Well, it probably doesn’t surprise anyone that people’s values affect their opinions on social and ethical issues. It is worth noting, though, that funding sources seem to shape perceptions of responsibility, even though where your funding comes from isn’t necessarily related to issues of social and ethical responsibility. In addition, “ethical standards” isn’t really defined in this paper, so its meaning probably varies between scientists and between fields. We should make an effort to broadly define ethical standards that could apply to all fields, as well as standards for specific areas of science, so that we can mount a united front as policy advocates and incorporate social and ethical responsibility into science funding standards. Although government regulation may not always be able to keep up with scientific advancements, we can all do our part to help policymakers support emerging technology without allowing potentially dangerous advancements to go unchecked.