20 similar documents found; search took 46 ms
1.
This article addresses a key question emerging from this project based at the University of Minnesota: the fundamental capacity
of government to engage in “dynamic oversight” of emergent technologies. This conception of oversight requires additional
or new types of capacity for government agencies that must arbitrate conflicts and endow any outcomes with necessary democratic
legitimacy. Rethinking oversight thus also requires consideration of the fundamental design and organizational capacity of
the regulatory regime in the democratic state.
2.
John Howard, Journal of Nanoparticle Research, 2011, 13(4):1427-1434
Nanotechnology is touted as a transformative technology in that it is predicted to improve many aspects of human life. There
are hundreds of products in the market that utilize nanostructures in their design, such as composite materials made out of
carbon or metal oxides. Potential risks to consumers, to the environment, and to workers from the most common passive nanomaterial—carbon
nanotubes—are emerging through scientific research. Newer more active nanostructures—such as cancer therapies and targeted
drug systems—are also increasing in use and are raising similar risk concerns. Governing the risks to workers is the subject
of this commentary. The Occupational Safety and Health Act of 1970 grants the Occupational Safety and Health Administration
the legal authority to set occupational health standards to insure that no worker suffers material impairment of health from
work. However, setting a standard to protect workers from nanotechnology risks may occur some time in the future because the
risks to workers have not been well characterized scientifically. Alternative risk governances—such as dynamic oversight through
stakeholder partnerships, “soft law” approaches, and national adoption of international consensus standards—are evaluated
in this article.
3.
The basic procedures and rules for oversight of U.S. human subjects research have been in place since 1981. Certain types
of human subjects research, however, have provoked creation of additional mechanisms and rules beyond the Department of Health
& Human Services (DHHS) Common Rule and Food and Drug Administration (FDA) equivalent. Now another emerging domain of human
subjects research—nanomedicine—is prompting calls for extra oversight. However, in 30 years of overseeing research on human
beings, we have yet to specify what makes a domain of scientific research warrant extra oversight. This failure to systematically
evaluate the need for extra measures, the type of extra measures appropriate for different challenges, and the usefulness
of those measures hampers efforts to respond appropriately to emerging science such as nanomedicine. This article evaluates
the history of extra oversight, extracting lessons for oversight of nanomedicine research in human beings. We argue that a
confluence of factors supports the need for extra oversight, including heightened uncertainty regarding risks, fast-evolving
science yielding complex and increasingly active materials, likelihood of research on vulnerable participants including cancer
patients, and potential risks to others beyond the research participant. We suggest the essential elements of the extra oversight
needed.
4.
Robbin S. Johnson, Journal of Nanoparticle Research, 2011, 13(4):1467-1476
This article uses lessons from biotechnology to help inform the design of oversight for nanobiotechnology. Those lessons suggest
the following: first, oversight needs to be broadly defined, encompassing not just regulatory findings around safety and efficacy,
but also public understanding and acceptance of the technology and its products. Second, the intensity of scrutiny and review
should reflect not just risks but also perceptions of risk. Finally, a global marketplace argues for uniform standards or
commercially practical solutions to differences in standards. One way of designing oversight to achieve these purposes is
to think about it in three phases—precaution, prudence, and promotion. Precaution comes early in the technology or product’s
development and reflects real and perceived uncertainties. Prudence governs when risks and hazards have been identified, containment
approaches established, and benefits broadly defined. Transparency and public participation rise to the fore. The promotional
phase moves toward shaping public understanding and acceptance and involves marketing issues rather than safety ones. This
flexible, three-phase approach to oversight would have avoided some of the early regulatory problems with agricultural biotechnology.
It also would have led to a more risk-adjusted pathway to regulatory approval. Furthermore, it would avoid some of the arbitrary,
disruptive marketing issues that have arisen.
5.
M. I. Sokolovsky, S. N. Petukhov, Yu. P. Semyonov, B. A. Sokolov, Thermophysics and Aeromechanics, 2008, 15(4):671-677
The successful experience of RSC “Energy” and SPA “Iskra” in developing a carbon-carbon nozzle extension for an oxygen-kerosene
liquid-fuel rocket motor is summarized. The methodological approach that allowed the carbon-carbon extension to be developed
in full and at comparatively low cost is described. Results of the practical application of the carbon-carbon extension in the
11D58M liquid-fuel rocket motor within the framework of the international “Sea Launch” program are presented.
6.
S. S. Sannikov-Proskuryakov, Russian Physics Journal, 1997, 40(10):982-984
The value of the “bare” fine-structure constant α0 is calculated within the framework of a new approach to the problem of elementary particles.
Khar'kov Physics and Technology Institute. Translated from Izvestiya Vysshikh Uchebnykh Zavedenii, Fizika, No. 10. pp. 60–63,
October, 1997.
7.
Despite the widespread commercial use of nanomaterials, regulators currently have a limited ability to characterize and manage
risks. There is a paucity of data available on the current production and use of nanomaterials and extreme scientific uncertainty
on most aspects of the risk assessment “causal chain.” Regulatory decisions will need to be made in the near term in the absence of
formal quantitative risk assessments. The article draws on examples from three different regulatory contexts—baseline data
monitoring efforts of the U.S. Environmental Protection Agency and California Department of Toxic Substances Control, prioritization
of risk information in the context of environmental releases, and mitigation of occupational risks—to argue for the use of
decision-analytic tools in lieu of formal risk assessment to help regulatory bodies. We advocate a “horses for courses” approach
whereby existing analytical tools (such as risk ranking, multi-criteria decision analysis, and “control banding” approaches)
might be adapted to regulators’ goals in particular decision contexts. While efforts to build new and modify existing tools
are underway, they need greater support from funding and regulatory agencies because innovative approaches are needed for
the “extreme” uncertainty problems that nanomaterials pose.
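The “control banding” approach named above can be illustrated with a minimal sketch. Everything here is a hypothetical placeholder: the 1–4 band scales, the multiplicative score, the cutoffs, and the recommended control levels do not come from any regulator's actual scheme.

```python
# Hypothetical control-banding sketch: combine a hazard band and an
# exposure band (each 1 = low .. 4 = high) into a recommended control level.
# Bands, the multiplicative score, and the cutoffs are illustrative only.
def control_band(hazard_band: int, exposure_band: int) -> str:
    score = hazard_band * exposure_band  # simple combined risk score
    if score <= 2:
        return "general ventilation"
    if score <= 6:
        return "engineering controls"
    if score <= 9:
        return "containment"
    return "seek specialist advice"
```

A real scheme would derive the bands from toxicological and exposure data; the point of banding is that a defensible, auditable recommendation can be produced even when a full quantitative risk assessment is not yet possible.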
8.
Despite the theory of neutrino oscillations being rather old, some of its basic issues are still being debated in the literature.
We discuss a number of such issues, including the relevance of the “same energy” and “same momentum” assumptions, the role
of quantum-mechanical uncertainty relations in neutrino oscillations, the dependence of the coherence and localization conditions
that ensure the observability of neutrino oscillations on neutrino energy and momentum uncertainties, the question of (in)dependence
of the oscillation probabilities on the neutrino production and detection processes, and the applicability limits of the stationary-source
approximation. We also develop a novel approach to calculation of the oscillation probability in the wave-packet approach,
based on the summation/integration conventions different from the standard one, which allows a new insight into the “same
energy” vs. “same momentum” problem. We also discuss a number of apparently paradoxical features of the theory of neutrino
oscillations.
The text was submitted by the authors in English.
9.
J. Reichardt, D. R. White, The European Physical Journal B - Condensed Matter and Complex Systems, 2007, 60(2):217-224
We present a framework for automatically decomposing (“block-modeling”) the functional classes of agents within a complex
network. These classes are represented by the nodes of an image graph (“block model”) depicting the main patterns of connectivity
and thus functional roles in the network. Using a first principles approach, we derive a measure for the fit of a network
to any given image graph allowing objective hypothesis testing. From the properties of an optimal fit, we derive how to find
the best fitting image graph directly from the network and present a criterion to avoid overfitting. The method can handle
both two-mode and one-mode data, directed and undirected as well as weighted networks and allows for different types of links
to be dealt with simultaneously. It is non-parametric and computationally efficient. The concepts of structural equivalence
and modularity are found as special cases of our approach. We apply our method to the world trade network and analyze the
roles individual countries play in the global economy.
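The idea of scoring how well a network fits an image graph can be sketched with a toy agreement measure; this is a stand-in for the authors' first-principles fit measure, not their actual formula, and the role names and example network are hypothetical.

```python
# Toy block-model fit: +1 for every ordered node pair whose edge / non-edge
# agrees with what the image graph allows for the nodes' roles, -1 otherwise.
def fit_score(edges, roles, image):
    """edges: set of directed pairs (u, v); roles: node -> role label;
    image: set of allowed (role_u, role_v) links in the image graph."""
    nodes = list(roles)
    score = 0
    for u in nodes:
        for v in nodes:
            if u == v:
                continue
            has_edge = (u, v) in edges
            allowed = (roles[u], roles[v]) in image
            score += 1 if has_edge == allowed else -1
    return score
```

Searching over role assignments (and over candidate image graphs) for the highest score is then the block-modeling step; the paper's contribution is a principled version of this measure together with a criterion that keeps the search from overfitting.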
10.
V. V. Siksin, Bulletin of the Lebedev Physics Institute, 2020, 47(1):33-35
It is urgent to use a “warm liquid” TMS in large massive calorimeters (with a volume of several hundred liters). This direction in modern nuclear physics is referred to as “non-accelerator” experiments with low-background detectors. Such experiments address some of the most important problems in understanding the structure of the Universe and searching for new particles: the well-known searches for “dark matter” in the form of new weakly interacting particles (WIMPs) and observations of coherent scattering of reactor neutrinos. Such an experiment can also test the standard model of electroweak interactions. The fully developed technology for producing large amounts of “warm liquid” TMS (in collaboration with the State Research Institute of Chemistry and Technology of Organoelement Compounds) makes it possible to perform such experiments.
11.
We introduce a new coupled map lattice model in which the weak interaction takes place via rare “collisions”. By “collision”
we mean a strong (possibly discontinuous) change in the system. For such models we prove uniqueness of the SRB measure and
exponential space-time decay of correlations.
12.
E. S. Pikina, Journal of Experimental and Theoretical Physics, 2009, 109(5):885-898
The Li-Kardar field theory approach is generalized to wetting smectic films and the “elastic” fluctuation-induced interaction
is obtained between the external flat bounding surface and distorted IA (isotropic liquid-smectic A) interface acting as an
“internal” (bulk) boundary of the wetting smectic film under the assumption that the IA interface is essentially “softer”
than the surface smectic layer. This field theory approach allows calculating the fluctuation-induced corrections in Hamiltonians
of the so-called “correlated” liquids confined by two surfaces, in the case where one of the bounding surfaces is “rough”
and with different types of surface smectic layer anchoring. We find that, in practice, accounting for thermal displacements
of the smectic layers in a wetting smectic film reduces to adding two contributions to the IA interface Hamiltonian.
The first, so-called local contribution describes the long-range thermal “elastic” repulsion of the fluctuating IA interface
from the flat bounding surface. The second, so-called nonlocal contribution is connected with the occurrence of an “elastic”
fluctuation-induced correction to the stiffness of the IA interface. An analytic expression for this correction is obtained.
13.
S. R. Bogdanov, Technical Physics, 2009, 54(1):25-32
Simple inequalities obtained from the spectral representation of pressure-strain-rate correlations are used for analyzing
nonlinear (in anisotropy parameter and mean strain rate) models of these correlations. These inequalities can be treated as
a new criterion of the validity of the models relative to the known realizability conditions. It is shown in particular that
the requirements following from this criterion are not met in the entire physically accessible domain even for models automatically
satisfying the “strong realizability” conditions. In many cases, violations are observed at a high, but not exotic, degree
of anisotropy corresponding, for example, to the wall regions of a flow in a channel. On the other hand, the new criterion
turns out to be constructive in model “calibration.” In particular, simple constraints on the values of the constants are
obtained for some types of models from the requirement that this criterion be fulfilled.
14.
Anastasios Mallios, International Journal of Theoretical Physics, 2008, 47(7):1929-1948
The sort of approach claimed by the title of this article is realizable, at least within the framework of ADG, where we do not assume any “spacetime” supplying the dynamics we employ. The latter, classical type of argument can naturally be included herewith, along with the concomitant impediments emanating therefrom, which are essentially “absorbed,” technically speaking, by the proposed mechanism. Our approach, being “manifoldless” (hence involving no smoothness in the standard sense), does not contain any such issue and is thus, by the very definitions, “singularities”-free. As a consequence, the equations that one would formulate within the present set-up will be, by the very essence
of the matter, already the quantized ones.
Dedicated to Professor Rafael D. Sorkin on the occasion of his 60th birthday with much friendship and recognition of his creative
pursuit in theoretical physics.
15.
16.
A. C. Mueller, The European Physical Journal Special Topics, 2009, 176(1):179-191
While a considerable and world-wide growth of the nuclear share in the global energy mix is desirable for many reasons, a
major concern or objection is the long-term burden that is constituted by the radiotoxic waste from the spent fuel. The concept
of Partitioning & Transmutation, a scientific and technological answer, is therefore of high interest. Its deployment may
use dedicated “Transmuter” or “Burner” reactors, using a fast neutron spectrum. For the transmutation of waste with a large
content (up to 50%) of (very long-lived) Minor Actinides, a sub-critical reactor, using an external neutron source is a solution
of high interest. It is constituted by coupling a proton accelerator, a spallation target and a subcritical core. This promising
new technology is named ADS, for accelerator-driven system. The present paper provides an introduction to the field and then
focuses, in its later part, on the development of the required accelerator technology.
17.
The “high strength-low plasticity resource” dilemma associated with the macrolocalization of plastic deformation in the form
of a neck in a stretched specimen, which leads to ductile failure of the specimen, has been theoretically discussed in the
framework of the dislocation-kinetic approach. It has been quantitatively demonstrated using micro- and nanocrystalline metals
as an example that their low plasticity resource (a small value of uniform strain before the beginning of the neck formation)
and quasi-embrittlement result from the strong increase in the yield strength with a decrease in the grain size and the strain-hardening
coefficient due to the annihilation of dislocations in the boundaries and bulk of grains.
18.
A. Abragam, Hyperfine Interactions, 1986, 31(1-4):3-10
The purpose of this talk is to place μSR within the general framework of what I shall term “Spin spectroscopy” and, more specifically,
“Resonance spin spectroscopy”.
19.
Rafael D. Sorkin, International Journal of Theoretical Physics, 1997, 36(12):2759-2781
In seeking to arrive at a theory of “quantum gravity,” one faces several choices among alternative approaches. I list some
of these “forks in the road” and offer reasons for taking one alternative over the other. In particular, I advocate the following:
the sum-over-histories framework for quantum dynamics over the “observable and state-vector” framework; relative probabilities
over absolute ones; spacetime over space as the gravitational “substance” (4 over 3+1); a Lorentzian metric over a Riemannian
(“Euclidean”) one; a dynamical topology over an absolute one; degenerate metrics over closed timelike curves to mediate topology
change; “unimodular gravity” over the unrestricted functional integral; and taking a discrete underlying structure (the causal
set) rather than the differentiable manifold as the basis of the theory. In connection with these choices, I also mention
some results from unimodular quantum cosmology, sketch an account of the origin of black hole entropy, summarize an argument
that the quantum mechanical measurement scheme breaks down for quantum field theory, and offer a reason why the cosmological
constant of the present epoch might have a magnitude of around 10⁻¹²⁰ in natural units.
This paper is the text of a talk given at the symposium on Directions in General Relativity held at the University of Maryland,
College Park, Maryland, in May 1993 in honor of Dieter Brill and Charles Misner.
20.
R. P. Malik, Physics of Particles and Nuclei Letters, 2011, 8(3):244-250
We derive absolutely anticommuting Becchi-Rouet-Stora-Tyutin (BRST) and anti-BRST symmetry transformations for the 4D free
Abelian 2-form gauge theory by exploiting the superfield approach to BRST formalism. The antisymmetric tensor gauge field
of the above theory was christened the “notoph” (i.e., the opposite of “photon”) gauge field by Ogievetsky and Polubarinov
way back in 1966–67. We briefly outline the problems involved in obtaining the absolute anticommutativity of the (anti-)
BRST transformations and their resolution within the framework of geometrical superfield approach to BRST formalism. One of
the highlights of our results is the emergence of a Curci-Ferrari type of restriction in the context of 4D Abelian 2-form
(notoph) gauge theory, which renders the nilpotent (anti-)BRST symmetries of the theory absolutely anticommutative in
nature.