975 results for "harm minimization"


Relevance:

20.00%

Abstract:

An introduction to Fourier series based on minimizing the least-squares error between a truncated series approximation and the exact function.
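
As a worked sketch of the idea (the standard derivation, not quoted from the abstract): truncating the series at N terms and minimizing the squared error over the coefficients recovers the usual Euler formulas by orthogonality.

```latex
% Least-squares error of an N-term trigonometric approximation to f:
E_N(a_0,\dots,a_N,b_1,\dots,b_N) = \int_{-\pi}^{\pi}
  \Big[f(x) - \frac{a_0}{2}
       - \sum_{n=1}^{N}\big(a_n\cos nx + b_n\sin nx\big)\Big]^{2}\,dx
% Setting \partial E_N/\partial a_n = 0 (and likewise for b_n), the
% orthogonality of cos nx and sin nx on [-\pi,\pi] gives the
% Fourier coefficients:
a_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\cos nx\,dx, \qquad
b_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\sin nx\,dx
```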

Relevance:

20.00%

Abstract:

Statement of the problem and public health significance. Hospitals were designed to be a safe haven and a respite from disease and illness. However, a large body of evidence points to preventable errors in hospitals as the eighth leading cause of death among Americans. Twelve percent of Americans, over 33.8 million people, are hospitalized each year, representing a significant population of at-risk citizens exposed to hospital medical errors. Since annual deaths due to hospital medical errors are estimated to exceed 44,000, the magnitude of this tragedy makes it a significant public health problem.

Specific aims. The specific aims of this study were threefold. First, to analyze the state of the states' mandatory hospital medical error reporting six years after the release of the influential IOM report, "To Err Is Human." Second, to identify barriers to the reporting of medical errors by hospital personnel. Third, to identify hospital safety measures implemented to reduce medical errors and enhance patient safety.

Methods. A descriptive, longitudinal, retrospective design was used to address the first objective. The study data came from the twenty-one states with mandatory hospital reporting programs that publish aggregate hospital error data on publicly accessible state websites. The analysis included calculating the expected number of medical errors for each state according to IOM rates; where possible, state-reported data were compared with the calculated IOM-expected number of errors. A literature review was performed to achieve the second aim, identifying barriers to reporting medical errors. The third aim was accomplished through telephone interviews with the principal patient safety/quality officers of five Texas hospitals with more than 700 beds.

Results. The state medical error data suggest vast underreporting of hospital medical errors to the states. The telephone interviews suggest that hospitals are working to reduce medical errors and create safer environments for patients. The literature review suggests that the underreporting of medical errors at the state level stems from underreporting at the delivery level.
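
A minimal sketch of the comparison described in the Methods, assuming the expected count simply scales a state's admissions by the national IOM rate; the constants are the figures cited above, and the example state is hypothetical:

```python
# Illustrative sketch (not the study's actual calculation): estimate the
# IOM-expected number of fatal medical errors for a state from its
# annual hospital admissions, using the national lower-bound estimate.

IOM_DEATHS_LOW = 44_000        # lower-bound annual deaths (IOM estimate)
US_ADMISSIONS = 33_800_000     # annual US hospitalizations cited above

DEATH_RATE = IOM_DEATHS_LOW / US_ADMISSIONS   # ~0.13% of admissions

def expected_errors(state_admissions: int) -> float:
    """IOM-expected fatal medical errors for a state's admission count."""
    return state_admissions * DEATH_RATE

# A hypothetical state with 2.5 million annual admissions:
print(round(expected_errors(2_500_000)))  # ~3254 expected deaths
```

Comparing such calculated expectations against the handfuls of errors states actually report is what makes the underreporting in the Results so stark.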

Relevance:

20.00%

Abstract:

The implementation of abstract machines involves complex decisions regarding, e.g., data representation, opcodes, or instruction specialization levels, all of which affect the final performance of the emulator and the size of the bytecode programs in ways that are often difficult to foresee. Besides, studying alternatives by implementing abstract machine variants is a time-consuming and error-prone task because of the level of complexity and optimization of competitive implementations, which makes them generally difficult to understand, maintain, and modify. This also makes it hard to generate specific implementations for particular purposes. To ameliorate those problems, we propose a systematic approach to the automatic generation of implementations of abstract machines. Different parts of their definition (e.g., the instruction set or the internal data and bytecode representation) are kept separate and automatically assembled in the generation process. Alternative versions of the abstract machine are therefore easier to produce, and variants of their implementation can be created mechanically, with specific characteristics for a particular application if necessary. We illustrate the practicality of the approach by reporting on an implementation of a generator of production-quality WAMs which are specialized for executing a particular fixed (set of) program(s). The experimental results show that the approach is effective in reducing emulator size.
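
A toy sketch of the generation idea, not the paper's actual generator (which targets production-quality WAMs): the instruction set is kept separate as declarative data, and an emulator variant is assembled from it mechanically. All names below are hypothetical.

```python
# Toy illustration: instruction semantics live apart from the emulator,
# and a VM variant is generated from whatever subset is passed in.

INSTRUCTION_SET = {
    # opcode name -> (operand count, semantics)
    "push":  (1, lambda vm, n: vm.stack.append(n)),
    "add":   (0, lambda vm: vm.stack.append(vm.stack.pop() + vm.stack.pop())),
    "print": (0, lambda vm: print(vm.stack[-1])),
}

def generate_emulator(instruction_set):
    """Assemble a bytecode emulator from a declarative instruction set.

    A specialized variant is produced by passing only the instructions a
    particular program actually uses: a smaller dispatch table gives a
    smaller emulator, mirroring the paper's specialized WAMs.
    """
    names = list(instruction_set)
    opcodes = {name: i for i, name in enumerate(names)}

    class VM:
        def __init__(self):
            self.stack = []

        def run(self, bytecode):
            pc = 0
            while pc < len(bytecode):
                arity, semantics = instruction_set[names[bytecode[pc]]]
                semantics(self, *bytecode[pc + 1: pc + 1 + arity])
                pc += 1 + arity

    return VM, opcodes

# Usage: assemble a VM and run "push 2; push 3; add; print"  -> 5
VM, op = generate_emulator(INSTRUCTION_SET)
VM().run([op["push"], 2, op["push"], 3, op["add"], op["print"]])
```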

Relevance:

20.00%

Abstract:

Energy consumption in data centers is now a critical concern because of its significant environmental and economic impact. Over recent years, several approaches have been proposed to tackle the energy/cost optimization problem, but most have failed to provide an analytical model that targets both the static and dynamic optimization domains of complex heterogeneous data centers. This paper formulates and solves an optimization problem for the energy-driven configuration of a heterogeneous data center. It also proposes a new mechanism for task allocation and workload distribution. The combination of the two approaches outperforms previously published results in the field of energy minimization in heterogeneous data centers and opens a promising area of research.
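
A minimal sketch of what an energy-driven allocation mechanism might look like, not the paper's model: each task goes to the heterogeneous server with the lowest marginal energy cost, so an idle machine is only powered on when its static cost is worth paying. The server parameters and the greedy policy are assumptions for illustration.

```python
# Illustrative greedy energy-driven task allocation over heterogeneous
# servers. Parameters and policy are hypothetical, not the paper's.

from dataclasses import dataclass

@dataclass
class Server:
    name: str
    idle_power: float      # static power draw once switched on (W)
    power_per_load: float  # dynamic power per unit of load (W)
    capacity: float        # maximum load units
    load: float = 0.0

    def marginal_cost(self, task_load: float) -> float:
        """Extra power to accept this task; switching on pays idle cost."""
        turn_on = self.idle_power if self.load == 0 else 0.0
        return turn_on + self.power_per_load * task_load

def allocate(tasks, servers):
    """Assign each task to the feasible server with lowest marginal cost."""
    plan = []
    for t in tasks:
        feasible = [s for s in servers if s.load + t <= s.capacity]
        best = min(feasible, key=lambda s: s.marginal_cost(t))
        best.load += t
        plan.append((t, best.name))
    return plan

servers = [Server("big", idle_power=200, power_per_load=5, capacity=100),
           Server("small", idle_power=60, power_per_load=9, capacity=40)]
print(allocate([10, 25, 30], servers))
# -> [(10, 'small'), (25, 'small'), (30, 'big')]
```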

Relevance:

20.00%

Abstract:

The molten globule, a widespread protein-folding intermediate, can attain a native-like backbone topology, even in the apparent absence of rigid side-chain packing. Nonetheless, mutagenesis studies suggest that molten globules are stabilized by some degree of side-chain packing among specific hydrophobic residues. Here we investigate the importance of hydrophobic side-chain diversity in determining the overall fold of the α-lactalbumin molten globule. We have replaced all of the hydrophobic amino acids in the sequence of the helical domain with a representative amino acid, leucine. Remarkably, the minimized molecule forms a molten globule that retains many structural features characteristic of a native α-lactalbumin fold. Thus, nonspecific hydrophobic interactions may be sufficient to determine the global fold of a protein.
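
A minimal sketch of the sequence simplification described above, assuming one particular hydrophobic alphabet; the study applied the substitution to the helical domain of α-lactalbumin, and the residue set and example sequence here are purely illustrative.

```python
# Illustrative sketch: map every hydrophobic residue in a sequence to
# leucine. The hydrophobic alphabet and toy sequence are assumptions.

HYDROPHOBIC = set("AVILMFWC")  # one commonly used hydrophobic set

def minimize_hydrophobics(seq: str) -> str:
    """Replace each hydrophobic residue with leucine (L)."""
    return "".join("L" if aa in HYDROPHOBIC else aa for aa in seq)

print(minimize_hydrophobics("KQFTKAELSQ"))  # -> "KQLTKLELSQ"
```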

Relevance:

20.00%

Abstract:

Objective: To identify and synthesise the findings from all randomised controlled trials that have examined the effectiveness of treatments for patients who have deliberately harmed themselves.

Relevance:

20.00%

Abstract:

The hierarchical properties of potential energy landscapes have been used to gain insight into thermodynamic and kinetic properties of protein ensembles. It also may be possible to use them to direct computational searches for thermodynamically stable macroscopic states, i.e., computational protein folding. To this end, we have developed a top-down search procedure in which conformation space is recursively dissected according to the intrinsic hierarchical structure of a landscape's effective-energy barriers. This procedure generates an inverted tree similar to the disconnectivity graphs generated by local minima-clustering methods, but it fundamentally differs in the manner in which the portion of the tree that is to be computationally explored is selected. A key ingredient is a branch-selection algorithm that takes advantage of statistically predictive properties of the landscape to guide searches down the tree branches that are most likely to lead to the physically relevant macroscopic states. Using the computational folding of a β-hairpin-forming peptide as an example, we show that such predictive properties indeed exist and can be used for structure prediction by free-energy global minimization.
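
A toy sketch of the top-down idea, not the authors' algorithm: starting from the root of a barrier tree, repeatedly descend into the child branch whose predictor score suggests the lowest reachable effective energy. The node fields and scoring rule below are assumptions for illustration.

```python
# Illustrative top-down descent of a barrier tree: at each level, follow
# the branch predicted to contain the lowest-energy macroscopic state.

from dataclasses import dataclass, field

@dataclass
class Basin:
    energy_estimate: float          # predicted lowest energy in this basin
    children: list = field(default_factory=list)

def descend(root: Basin) -> Basin:
    """Follow the most promising branch down to a leaf basin."""
    node = root
    while node.children:
        node = min(node.children, key=lambda b: b.energy_estimate)
    return node

tree = Basin(0.0, [Basin(-3.1, [Basin(-4.2), Basin(-3.5)]),
                   Basin(-2.7)])
print(descend(tree).energy_estimate)  # -> -4.2
```

In practice the branch selection would use statistically predictive properties of the landscape rather than a single stored estimate, but the recursive dissection has this shape.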

Relevance:

20.00%

Abstract:

On September 17, 2015, the Federal Circuit issued another decision in the epic Apple v. Samsung smartphone war, the fourth decision in the ongoing saga to deal with injunctions. Apple IV explained the level of proof necessary to satisfy the "causal nexus" requirement. This requirement emerged in response to patent litigation involving products with thousands of features, the vast majority of which are unrelated to the asserted patent. To prove a causal nexus, a patentee seeking an injunction must do more than show that the infringing product caused it irreparable harm; the harm must be specifically attributable to the infringing feature. In Apple IV, the Federal Circuit noted that proving causation was "nearly impossible" in these multicomponent cases. So it decided to water down the causal nexus requirement, holding that it was enough for Apple to show that the infringing features were "important" and that customers sought those particular features. This lower standard is an ill-advised mistake that leaves multicomponent product manufacturers more susceptible to patent holdup. My critique has two parts. First, I argue that a single infringing feature rarely, if ever, "causes" consumers to buy the infringer's multicomponent products; the minor features at issue in Apple IV illustrate this point vividly. Thus, the new causal nexus standard does not accurately reflect how causation and harm operate in a multicomponent world. Second, I explain why the court was so willing to accept such thin evidence of real injury: it improperly applied notions of traditional property law to patents. Specifically, the court viewed patent infringement as harmful regardless of any concrete consequences. This view may resonate for other forms of property, where an owner's rights are paramount and a trespass is considered offensive in and of itself, but the same concepts do not apply to patent law, where the Supreme Court has consistently said that private interests must take a back seat to the public good. Based on these principles, the courts should restore the "causal nexus" requirement and not presume causation.

Relevance:

20.00%

Abstract:

As the European Commission's antitrust investigation of Google approaches its final stages, its contours and likely outcome remain obscured by a plethora of arguments unrelated to antitrust. At the same time, the initial focus on search neutrality as an antitrust principle seems to have been abandoned by the European Commission in favour of a more standard allegation of "exclusionary abuse" likely to generate anticompetitive foreclosure of Google's rivals. This paper discusses search neutrality as an antitrust principle and then comments on the current investigation based on publicly available information. It provides a critical assessment of the likely tests for defining the relevant product market, the criteria for a finding of dominance, the anticompetitive foreclosure test, and the possible remedies the European Commission might choose. Overall, and regardless of the outcome of the Google case, the paper argues that the treatment of exclusionary abuses in Internet markets has stood in urgent need of a number of important clarifications for more than a decade. The hope is that the European Commission will resist the temptation to imbue the antitrust case with an emphasis and meaning that have nothing to do with antitrust (from industrial-policy motives to privacy, copyright, or media-law arguments) and that, on the contrary, it will devote its efforts to sharpening its understanding of dynamic competition in cyberspace and of the tools that should be applied in the analysis of these peculiar, fast-changing, and often elusive settings.

Relevance:

20.00%