976 results for threshold voltage model
Abstract:
Hafnium oxide (HfO2) is a promising dielectric for future microelectronic applications. HfO2 thin films (10-75 nm) were deposited on Pt/SiO2/Si substrates by pulsed DC magnetron reactive sputtering. Top electrodes of Pt were formed by e-beam evaporation through an aperture mask on the samples to create MIM (Metal-Insulator-Metal) capacitors. Various processing conditions (Ar/O2 ratio, DC power and deposition rate) and post-deposition annealing conditions (time and temperature) were investigated. The structure of the HfO2 films was characterized by X-ray diffraction (XRD) and the roughness was measured by a profilometer. The electrical properties were characterized in terms of their relative permittivity (εr(T) and εr(f)) and leakage behavior (I-V, I-T and I-time). The electrical measurements were performed over a temperature range from -5 to 200°C. For the samples with the best experimental results, the relative permittivity of HfO2 was found to be ~27 after anneal and increased by 0.027%/°C with increasing temperature over the measured temperature range. At 25°C, the leakage current density was below 10^-8 A/cm² at 1 volt. The leakage current increased with temperature above a specific threshold temperature, below which it changed little. The leakage current also increased with voltage: at voltages below 1 volt the conduction is ohmic, while at higher voltages it follows the Schottky emission model. The breakdown field is ~1.82×10^6 V/cm. The optical bandgap, measured on samples deposited on quartz substrates, was 5.4 eV after anneal.
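The two leakage regimes named above are conventional conduction models; as a hedged illustration (standard textbook forms, not equations quoted from this thesis), they can be written as:

```latex
% Low-field (below ~1 V) ohmic regime: current density proportional to field
J_{\text{ohmic}} = \sigma E
% Higher-field Schottky (thermionic) emission over a barrier lowered by the
% image force; A^* is the effective Richardson constant, \phi_B the barrier
% height and \varepsilon_r the dynamic relative permittivity of the film
J_{\text{Schottky}} = A^{*} T^{2}
  \exp\!\left[-\,\frac{q\bigl(\phi_B - \sqrt{qE/(4\pi\varepsilon_0\varepsilon_r)}\bigr)}{k_B T}\right]
```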
Abstract:
The doctrine of fair use allows unauthorized copying of original works of art, music, and literature for limited purposes like criticism, research, and education, based on the rationale that copyright holders would consent to such uses if bargaining were possible. This paper develops the first formal analysis of fair use in an effort to derive the efficient legal standard for applying the doctrine. The model interprets copies and originals as differentiated products and defines fair use as a threshold separating permissible copying from infringement. Application of the analysis to several key cases (including the recent Napster case) shows that this interpretation is consistent with actual legal reasoning. The analysis also underscores the role of technology in shaping the efficient scope of fair use.
Abstract:
The doctrine of fair use allows limited copying of creative works based on the rationale that copyright holders would consent to such uses if bargaining were possible. This paper develops a formal model of fair use in an effort to derive the efficient legal standard for applying the doctrine. The model interprets copies and originals as differentiated products and defines fair use as a threshold separating permissible copying from infringement. The analysis highlights the role of technology in shaping the efficient standard. Discussion of several key cases illustrates the applicability of the model.
Abstract:
ACCURACY OF THE BRCAPRO RISK ASSESSMENT MODEL IN MALES PRESENTING TO MD ANDERSON FOR BRCA TESTING. Publication No. _______ Carolyn A. Garby, B.S. Supervisory Professor: Banu Arun, M.D. Hereditary Breast and Ovarian Cancer (HBOC) syndrome is due to mutations in the BRCA1 and BRCA2 genes. Women with HBOC have high risks of developing breast and ovarian cancers. Males with HBOC are commonly overlooked because male breast cancer is rare and other male cancer risks, such as prostate and pancreatic cancers, are relatively low. BRCA genetic testing is indicated for men, as it is currently estimated that 4-40% of male breast cancers result from a BRCA1 or BRCA2 mutation (Ottini, 2010), and management recommendations can be made based on genetic test results. Risk assessment models are available to provide the individualized likelihood of carrying a BRCA mutation. Only one study to date has evaluated the accuracy of BRCAPro in males; it was based on a cohort of Italian males and used an older version of BRCAPro. The objective of this study is to determine whether BRCAPro5.1 is a valid risk assessment model for males who present to MD Anderson Cancer Center for BRCA genetic testing. BRCAPro has previously been validated for determining the probability of carrying a BRCA mutation; however, it has not been examined further, particularly in males. The total cohort consisted of 152 males who had undergone BRCA genetic testing. The cohort was stratified by indication for genetic counseling: having a known familial BRCA mutation, having a personal diagnosis of a BRCA-related cancer, or having a family history suggestive of HBOC. Overall there were 22 (14.47%) BRCA1+ males and 25 (16.45%) BRCA2+ males. Receiver operating characteristic (ROC) curves were constructed for the overall cohort, for each indication, and for each cancer subtype. Our findings revealed that the BRCAPro5.1 model had perfect discriminating ability at a threshold of 56.2 for males with breast cancer; however, only 2 (4.35%) of the 46 males with breast cancer were found to have BRCA2 mutations. This rate is significantly lower than the high approximation (40%) reported in previous literature. BRCAPro does perform well in certain situations for men. Future investigation of male breast cancer and men at risk for BRCA mutations is necessary to provide more accurate risk assessment.
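For readers who want to reproduce this kind of validation, the sketch below shows how a ROC curve and a decision threshold can be computed from BRCAPro-style carrier probabilities and observed mutation status. It is a generic illustration using scikit-learn; the variable names and data are invented, and it is not the study's actual analysis code.

```python
# Hedged sketch: ROC-based evaluation of a risk score against observed
# mutation status, in the spirit of the validation described above.
# Data values and names are illustrative, not from the study.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# y_true: 1 if a BRCA mutation was found, 0 otherwise
# y_score: model carrier probability (0-100 scale assumed here)
y_true = np.array([0, 0, 1, 0, 1, 1, 0, 1])
y_score = np.array([3.2, 11.0, 62.5, 20.1, 58.0, 71.3, 9.4, 55.0])

auc = roc_auc_score(y_true, y_score)               # overall discriminating ability
fpr, tpr, thresholds = roc_curve(y_true, y_score)  # one ROC point per cutoff

# Pick the cutoff that maximizes Youden's J = sensitivity + specificity - 1
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.2f}, best threshold = {thresholds[best]:.1f}")
```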
Abstract:
Developing a Model. Interruption is a known human factor that contributes to errors and catastrophic events in healthcare as well as other high-risk industries. The landmark Institute of Medicine (IOM) report, To Err is Human, brought attention to the significance of preventable errors in medicine and suggested that interruptions could be a contributing factor. Previous studies of interruptions in healthcare did not offer a conceptual model by which to study interruptions. Given the serious consequences of interruptions investigated in other high-risk industries, there is a need for a model to describe, understand, explain, and predict interruptions and their consequences in healthcare. Therefore, the purpose of this study was to develop a model grounded in the literature and to use the model to describe and explain interruptions in healthcare, specifically interruptions occurring in a Level One Trauma Center. A trauma center was chosen because this environment is characterized as intense, unpredictable, and interrupt-driven. The first step in developing the model was a review of the literature, which revealed that the concept of interruption did not have a consistent definition in either the healthcare or non-healthcare literature. Walker and Avant's method of concept analysis was used to clarify and define the concept. The analysis led to the identification of five defining attributes: (1) a human experience, (2) an intrusion of a secondary, unplanned, and unexpected task, (3) discontinuity, (4) externally or internally initiated, and (5) situated within a context. However, before an interruption can commence, five conditions known as antecedents must occur: (1) an intent to interrupt is formed by the initiator, (2) a physical signal must pass a threshold test of detection by the recipient, (3) the sensory system of the recipient is stimulated to respond to the initiator, (4) an interruption task is presented to the recipient, and (5) the interruption task is either accepted or rejected by the recipient. An interruption was determined to be quantifiable by (1) the frequency of occurrence of an interruption, (2) the number of times the primary task has been suspended to perform an interrupting task, (3) the length of time the primary task has been suspended, and (4) the frequency of returning or not returning to the primary task. As a result of the concept analysis, a definition of an interruption was derived from the literature. An interruption is defined as a break in the performance of a human activity, initiated internally or externally to the recipient and occurring within the context of a setting or location. This break results in the suspension of the initial task by initiating the performance of an unplanned task, with the assumption that the initial task will be resumed. The definition is inclusive of all the defining attributes of an interruption and is a standard definition that can be used by the healthcare industry. From the definition, a visual model of an interruption was developed. The model was used to describe and explain the interruptions recorded during an instrumental case study of physicians and registered nurses (RNs) working in a Level One Trauma Center. Five physicians were observed for a total of 29 hours, 31 minutes. Eight registered nurses were observed for a total of 40 hours, 9 minutes.
Observations were made on either the 0700-1500 or the 1500-2300 shift using the shadowing technique and were recorded in field-note format. The field notes were analyzed by a hybrid method of categorizing activities and interruptions, developed using both a deductive a priori classification framework and the inductive process of line-by-line coding and constant comparison as described in Grounded Theory. The following categories were identified as relevant to this study: Intended Recipient – the person to be interrupted; Unintended Recipient – not the intended recipient of an interruption, e.g., receiving a phone call that was incorrectly dialed; Indirect Recipient – the incidental recipient of an interruption, e.g., talking with another person, thereby suspending the original activity; Recipient Blocked – the intended recipient does not accept the interruption; Recipient Delayed – the intended recipient postpones an interruption; Self-interruption – a person, independent of another person, suspends one activity to perform another, e.g., while walking, stops abruptly and talks to another person; Distraction – briefly disengaging from a task; Organizational Design – the physical layout of the workspace that causes a disruption in workflow; Artifacts Not Available – supplies and equipment that are not available in the workspace, causing a disruption in workflow; Initiator – a person who initiates an interruption. Interruption by Organizational Design and Artifacts Not Available were identified as two new categories of interruption that had not previously been cited in the literature. Analysis of the observations indicated that physicians performed slightly fewer activities per hour than RNs; this variance may be attributed to differing roles and responsibilities. Physicians had more activities interrupted than RNs, but RNs experienced more interruptions per hour. Other people were the most common medium through which an interruption was delivered; additional mediums included the telephone, pager, and one's self. Both physicians and RNs were observed to resume an original interrupted activity more often than not, and in most cases they performed only one or two interrupting activities before returning to the original interrupted activity. In conclusion, the model was found to explain all interruptions observed during the study; however, the model will require a more comprehensive study in order to establish its predictive value.
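As a hedged illustration of the four quantifiable measures listed in the abstract (frequency of interruptions, number of suspensions of the primary task, time suspended, and rate of return to the primary task), the following sketch computes them from a list of coded observation events; the record format is invented for illustration and is not the study's instrument.

```python
# Hedged sketch: computing the four interruption measures named above from
# coded observation records. The record structure is illustrative only.
from dataclasses import dataclass

@dataclass
class Interruption:
    minutes_suspended: float   # how long the primary task was suspended
    returned_to_primary: bool  # whether the primary task was resumed

def summarize(interruptions, observation_hours):
    n = len(interruptions)
    return {
        "interruptions_per_hour": n / observation_hours,
        "primary_task_suspensions": n,
        "total_minutes_suspended": sum(i.minutes_suspended for i in interruptions),
        "return_rate": sum(i.returned_to_primary for i in interruptions) / n if n else 0.0,
    }

events = [Interruption(2.5, True), Interruption(0.5, True), Interruption(4.0, False)]
print(summarize(events, observation_hours=29.5))
```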
Abstract:
Long-term potentiation (LTP) is a rapidly induced and long-lasting increase in synaptic strength and is the leading cellular model for learning and memory in the mammalian brain. LTP was first identified in the hippocampus, a structure implicated in memory formation. LTP induction is dependent on postsynaptic Ca2+ increases mediated by N-methyl-D-aspartate (NMDA) receptors. Activation of other postsynaptic routes of Ca2+ entry, such as voltage-dependent Ca2+ channels (VDCCs), has subsequently been shown to induce a long-lasting increase in synaptic strength. However, it was unknown whether VDCC-induced LTP uses cellular mechanisms similar to those of classical NMDA receptor-dependent LTP and whether these two forms of LTP display similar properties. This dissertation determines the similarities and differences between VDCC- and NMDA receptor-dependent LTP in area CA1 of hippocampal slices and demonstrates that VDCCs and NMDA receptors activate similar cellular mechanisms, such as protein kinases, to induce LTP. However, VDCC- and NMDA receptor-activated LTP induction mechanisms are compartmentalized in the postsynaptic neuron such that they do not interact. Consistent with the activation properties of NMDA receptors and VDCCs, NMDA receptor- and VDCC-dependent LTP have different induction properties. In contrast to NMDA receptor-dependent LTP, VDCC-induced potentiation does not require evoked presynaptic stimulation or display input specificity. These results indicate that there are two different routes of postsynaptic Ca2+ entry that can induce LTP, and that the compartmentation of VDCCs and NMDA receptors and/or their resulting Ca2+ increases may account for the distinction between these LTP induction mechanisms. One of the molecular targets for postsynaptic Ca2+ that is required for the induction of LTP is protein kinases. Evidence for the role of protein kinase activity in LTP expression is either correlational or controversial. We have utilized a broad range of potent protein kinase inhibitors to systematically examine the temporal requirement for protein kinases in the induction and expression of LTP. Our results indicate that there is a critical period of persistent protein kinase activity required for LTP induction, activated by high-frequency (tetanic) stimulation (HFS) and extending until 20 min after HFS. In addition, our results suggest that protein kinase activity during and immediately after HFS is not sufficient for LTP induction. These results provide evidence for the involvement of persistent and/or Ca2+-independent protein kinase activity in LTP induction.
Neocortical hyperexcitability defect in a mutant mouse model of spike-wave epilepsy, stargazer
Abstract:
Single-locus mutations in mice can express epileptic phenotypes and provide critical insights into the naturally occurring defects that alter excitability and mediate synchronization in the central nervous system (CNS). One such recessive mutation (on chromosome (Chr) 15), stargazer (stg/stg), expresses frequent bilateral 6-7 cycles per second (c/sec) spike-wave seizures associated with behavioral arrest, and provides a valuable opportunity to examine the inherited lesion associated with spike-wave synchronization. The existence of distinct and heterogeneous defects mediating spike-wave discharge (SWD) generation has been demonstrated by the presence of multiple genetic loci expressing generalized spike-wave activity and the differential effects of pharmacological agents on SWDs in different spike-wave epilepsy models. Attempts at understanding the different basic mechanisms underlying spike-wave synchronization have focused on γ-aminobutyric acid (GABA) receptor-, low-threshold T-type Ca2+ channel-, and N-methyl-D-aspartate receptor (NMDA-R)-mediated transmission. It is believed that defects in these modes of transmission can mediate the conversion of normal oscillations in a trisynaptic circuit, which includes the neocortex, reticular nucleus and thalamus, into spike-wave activity. However, the underlying lesions involved in spike-wave synchronization have not been clearly identified. The purpose of this research project was to locate and characterize a distinct neuronal hyperexcitability defect favoring spike-wave synchronization in the stargazer brain. One experimental approach for anatomically locating areas of synchronization and hyperexcitability involved an attempt to map patterns of hypersynchronous activity with antibodies to activity-induced proteins. A second approach to characterizing the neuronal defect involved examining the neuronal responses in the mutant following application of pharmacological agents with well-known sites of action. In order to test the hypothesis that an NMDA receptor-mediated hyperexcitability defect exists in stargazer neocortex, extracellular field recordings were used to examine the effects of CPP and MK-801 on coronal neocortical brain slices of stargazer and wild type perfused with 0 Mg2+ artificial cerebrospinal fluid (aCSF). To study how NMDA receptor antagonists might promote increased excitability in stargazer neocortex, two basic hypotheses were tested: (1) NMDA receptor antagonists directly activate deep-layer principal pyramidal cells in the neocortex of stargazer, presumably by opening NMDA receptor channels altered by the stg mutation; and (2) NMDA receptor antagonists disinhibit the neocortical network by blocking recurrent excitatory synaptic inputs onto inhibitory interneurons in the deep layers of stargazer neocortex. In order to test whether CPP might disinhibit the 0 Mg2+ bursting network in the mutant by acting on inhibitory interneurons, the inhibitory inputs were pharmacologically removed by application of GABA receptor antagonists to the cortical network, and the effects of CPP under 0 Mg2+ aCSF perfusion in layer V of stg/stg were then compared with those found in +/+ neocortex using in vitro extracellular field recordings. (Abstract shortened by UMI.)
Abstract:
In this paper, a new digital elevation model (DEM) is derived for the ice sheet in western Dronning Maud Land, Antarctica. It is based on differential interferometric synthetic aperture radar (SAR) from the European Remote Sensing 1/2 (ERS-1/2) satellites, in combination with ICESat's Geoscience Laser Altimeter System (GLAS). A DEM mosaic is compiled out of 116 scenes from the ERS-1 ice phase in 1994 and the ERS-1/2 tandem mission between 1996 and 1997, with the GLAS data acquired in 2003 serving as ground control. Using three different SAR processors, uncertainties in phase stability and baseline model, resulting in height errors of up to 20 m, are exemplified. Atmospheric influences of the same order of magnitude are demonstrated, and corresponding scenes are excluded. For validation of the DEM mosaic, covering an area of about 130,000 km² on a 50-m grid, independent ICESat heights (2004-2007), ground-based kinematic GPS (2005), and airborne laser scanner data (ALS, 2007) are used. Excluding small areas with low phase coherence, the DEM differs in mean and standard deviation by 0.5 ± 10.1, 1.1 ± 6.4, and 3.1 ± 4.0 m from ICESat, GPS, and ALS, respectively. The excluded data points may deviate by more than 50 m. In order to suppress the spatially variable noise below a 5-m threshold, 18% of the DEM area is selectively averaged to a final product at varying horizontal spatial resolution. Apart from mountainous areas, the new DEM outperforms other currently available DEMs and may serve as a benchmark for future elevation models, such as those from the TanDEM-X mission, to spatially monitor ice sheet elevation.
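A hedged sketch of the kind of selective averaging described above (coarsening only where local noise exceeds the 5-m threshold) is given below; the window size, the noise estimator, and the array layout are assumptions for illustration, not the authors' processing chain.

```python
# Hedged sketch: suppress spatially variable DEM noise below a 5 m threshold
# by averaging only where the local height variability is too high.
# The 9x9 window and the std-dev noise proxy are illustrative assumptions.
import numpy as np
from scipy.ndimage import uniform_filter

def selective_average(dem, noise_threshold_m=5.0, window=9):
    mean = uniform_filter(dem, size=window)
    mean_sq = uniform_filter(dem**2, size=window)
    local_std = np.sqrt(np.maximum(mean_sq - mean**2, 0.0))  # local noise proxy
    noisy = local_std > noise_threshold_m
    out = dem.copy()
    out[noisy] = mean[noisy]        # coarsen only the noisy cells
    return out, noisy.mean()        # fraction of the area that was averaged

dem = np.random.normal(1200.0, 8.0, size=(500, 500))  # synthetic grid heights
smoothed, fraction_averaged = selective_average(dem)
print(f"{fraction_averaged:.0%} of the area selectively averaged")
```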
Abstract:
These data are provided to allow users to reproduce the results of the open-source tool 'automated Accumulation Threshold computation and RIparian Corridor delineation (ATRIC)'.
Abstract:
A two-dimensional finite element model of current flow in the front surface of a PV cell is presented. To validate this model, an experimental test is performed. Particular attention is then paid to the effects of non-uniform illumination along the finger direction, which is typical of linear concentrator systems. Fill factor, open-circuit voltage and efficiency are shown to decrease with an increasing degree of non-uniform illumination. It is shown that these detrimental effects can be mitigated significantly by re-optimizing the number of front-surface metallization fingers to suit the degree of non-uniformity. The behavior of current flow in the front surface of a cell operating at open-circuit voltage under non-uniform illumination is discussed in detail.
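The trade-off behind re-optimizing the finger count can be illustrated with a simple lumped front-grid loss model (shading loss grows with the number of fingers, lateral resistive loss falls with it). This is a hedged, back-of-the-envelope sketch, not the paper's two-dimensional finite element model, and all parameter values are assumptions.

```python
# Hedged sketch: lumped trade-off between finger shading loss and lateral
# emitter resistive loss versus the number of front-surface fingers.
# All parameter values are illustrative; the paper itself uses a 2-D FEM.
import numpy as np

cell_width = 0.02        # m, width over which the fingers are distributed
finger_width = 100e-6    # m, width of one metallization finger
sheet_res = 100.0        # ohm/sq, emitter sheet resistance
j_mp, v_mp = 300.0, 0.55 # A/m^2 and V at the maximum power point

def fractional_loss(n_fingers):
    shading = n_fingers * finger_width / cell_width   # shaded area fraction
    spacing = cell_width / n_fingers                  # finger pitch
    # simplified lateral-resistance loss fraction over one finger pitch
    # (order-of-magnitude textbook form)
    resistive = sheet_res * j_mp * spacing**2 / (12.0 * v_mp)
    return shading + resistive

n = np.arange(2, 121)
losses = np.array([fractional_loss(k) for k in n])
print("optimal finger count in this toy model:", n[np.argmin(losses)])
```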
Abstract:
This paper shows that today's modelling of electrical noise as coming from noisy resistances is nonsensical, contradicting the nature of resistors as systems bearing electrical noise. We present a new model for electrical noise that includes the work of Johnson and Nyquist and also agrees with the quantum-mechanical description of noisy systems given by Callen and Welton, in which electrical energy fluctuates and is dissipated over time. Through the two currents that the admittance function links with their common voltage in the frequency domain, this new model shows the cause-effect connection that exists between fluctuation and dissipation of energy in the time domain. In spite of its radical departure from today's view of electrical noise in resistors, this complex model for electrical noise is obtained from Nyquist's result using basic concepts of circuit theory and thermodynamics that also apply to capacitors and inductors.
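For reference, the Nyquist result and the Callen-Welton quantum-mechanical description that the paper builds on are usually written as follows (standard textbook forms, not the paper's own notation):

```latex
% Classical Nyquist result: one-sided voltage-noise spectral density of a
% resistance R at temperature T
S_V(f) = 4 k_B T R
% Callen-Welton (quantum fluctuation-dissipation) generalization, which
% reduces to the Nyquist form when h f \ll k_B T
S_V(f) = 4 R\, h f \left( \frac{1}{2} + \frac{1}{e^{h f / k_B T} - 1} \right)
       = 2 R\, h f \coth\!\left( \frac{h f}{2 k_B T} \right)
```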
Abstract:
To improve percolation modelling of soils, the geometrical properties of the pore space must be understood; this includes porosity, particle and pore size distribution, and connectivity of the pores. A study was conducted with a soil at different bulk densities, based on 3D grey images acquired by X-ray computed tomography. The objective was to analyze the effect of aspects of pore-network geometry on percolation and to discuss the influence of the grey threshold applied to the images. A model based on random walk algorithms was applied to the images, combining five bulk densities with up to six threshold values per density. This allowed for a dynamical perspective of soil structure in relation to water transport through the inclusion of percolation speed in the analyses. To evaluate connectivity separately and isolate the effect of the grey threshold, a critical value of 35% porosity was selected for every density. This value was the smallest at which total-percolation walks appeared for all images of the same porosity and may represent a situation of percolation comparable among bulk densities. This criterion avoided an arbitrary decision on grey thresholds. In addition, a random-matrix simulation at 35% porosity was compared with the real images to test whether the observed pore connectivity is a consequence of a non-random soil structure.
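A hedged sketch of the porosity-based threshold selection described above (choosing, for each image, the grey value at which the segmented pore fraction reaches 35%) is shown below; it assumes darker voxels are pores and uses a simple quantile, which may differ from the authors' exact procedure.

```python
# Hedged sketch: pick the grey threshold of a 3-D CT image so that the
# segmented pore space occupies a target porosity (35% here), instead of
# choosing the threshold arbitrarily. Assumes pores are the darker voxels.
import numpy as np

def threshold_for_porosity(grey_volume, target_porosity=0.35):
    # Grey value below which `target_porosity` of the voxels fall
    t = np.quantile(grey_volume, target_porosity)
    pores = grey_volume <= t          # boolean pore mask
    return t, pores

rng = np.random.default_rng(0)
volume = rng.integers(0, 256, size=(128, 128, 128)).astype(np.uint8)  # synthetic CT
t, pores = threshold_for_porosity(volume)
print(f"grey threshold = {t:.0f}, porosity = {pores.mean():.2f}")
```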
Abstract:
In recent decades, there has been increasing interest in systems composed of several autonomous mobile robots, and as a result, there has been substantial development in the field of Artificial Intelligence, especially in Robotics. Several studies in the literature focus on the creation of intelligent machines and devices capable of imitating the functions and movements of living beings. Multi-Robot Systems (MRS) can often deal with tasks that are difficult, if not impossible, to accomplish with a single robot. In the context of MRS, one of the main challenges is the need to control, coordinate and synchronize the operation of multiple robots to perform a specific task. This requires the development of new strategies and methods which allow us to obtain the desired system behavior in a formal and concise way. This PhD thesis aims to study the coordination of multi-robot systems and, in particular, addresses the problem of the distribution of heterogeneous multi-tasks. The main interest in these systems is to understand how, from simple rules inspired by the division of labor in social insects, a group of robots can perform tasks in an organized and coordinated way. We are mainly interested in truly distributed or decentralized solutions in which the robots themselves, autonomously and individually, select a particular task so that all tasks are optimally distributed. In general, to distribute multiple tasks among a team of robots, the robots have to synchronize their actions and exchange information. Under this approach we speak of multi-task selection instead of multi-task assignment, meaning that the agents or robots select the tasks themselves instead of being assigned a task by a central controller. The key element in these algorithms is the estimation of the stimuli and the adaptive update of the thresholds: each robot performs this estimate locally, depending on the load or the number of pending tasks to be performed. In addition, the results of each approach are evaluated by introducing noise into the number of pending loads, in order to simulate the robots' error in estimating the real number of pending tasks. The main contribution of this thesis is the approach based on self-organization and division of labor in social insects. An experimental scenario for the coordination problem among multiple robots, the robustness of the approaches, and the generation of dynamic tasks are presented and discussed. The particular issues studied are: 
Threshold models: experiments conducted to test the response threshold model, analyzing the system performance index for the problem of the distribution of heterogeneous multi-tasks in multi-robot systems; additive noise in the number of pending loads was introduced and dynamic tasks were generated over time (a minimal sketch of the response threshold rule is given after this list). 
Learning automata methods: experiments to test the learning automata-based probabilistic algorithms. The approach was evaluated in terms of the system performance index, with additive noise and with dynamic task generation, for the same problem of the distribution of heterogeneous multi-tasks in multi-robot systems.
Ant colony optimization: experiments to test the ant colony optimization-based deterministic algorithms for achieving the distribution of heterogeneous multi-tasks in multi-robot systems. In the experiments performed, the system performance index is evaluated by introducing additive noise and dynamic task generation over time.
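The response threshold rule referred to above has a well-known insect-inspired generic form. The sketch below uses that generic form with a noisy local estimate of the pending load and an adaptive threshold update; its parameters and structure are illustrative assumptions rather than the thesis's actual algorithm.

```python
# Hedged sketch: insect-inspired response threshold rule for decentralized
# task selection. Each robot engages task j with probability
# s_j^2 / (s_j^2 + theta_j^2), where s_j is its (noisy) local estimate of the
# task stimulus and theta_j its threshold, lowered while performing the task
# and raised otherwise (specialization). Generic form, not the thesis's code.
import random

class Robot:
    def __init__(self, n_tasks, learn=0.1, forget=0.05, noise=0.2):
        self.thresholds = [1.0] * n_tasks
        self.learn, self.forget, self.noise = learn, forget, noise

    def select_task(self, pending_loads):
        for j, load in enumerate(pending_loads):
            s = max(0.0, load * (1.0 + random.gauss(0.0, self.noise)))  # noisy estimate
            p = s**2 / (s**2 + self.thresholds[j]**2) if s > 0 else 0.0
            if random.random() < p:
                self._adapt(j)
                return j
        return None  # stay idle this step

    def _adapt(self, chosen):
        for j in range(len(self.thresholds)):
            if j == chosen:
                self.thresholds[j] = max(0.01, self.thresholds[j] - self.learn)
            else:
                self.thresholds[j] = min(10.0, self.thresholds[j] + self.forget)

robots = [Robot(n_tasks=3) for _ in range(5)]
loads = [4.0, 1.0, 2.0]                        # pending loads per task type
choices = [r.select_task(loads) for r in robots]
print("task chosen by each robot:", choices)
```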