810 results for ease of discovery
Abstract:
Because the recommendation to use flowables for posterior restorations is still a matter of debate, the objective of this study was to determine, in a nationwide survey in Germany, how frequently, for which indications, and for what reasons German dentists use flowable composites in posterior teeth. In addition, the acceptance of a simplified filling technique for posterior restorations using a low-stress flowable composite was evaluated. Completed questionnaires from all over Germany were returned by 1,449 dentists, for a response rate of 48.5%; 78.6% of the respondents regularly used flowable composites for posterior restorations. The most frequent indications were cavity lining (80.1%) and small Class I fillings (74.2%). Flowables were less frequently used for small Class II fillings (22.7%) or other indications (13.6%). The most frequent reasons given for the use of flowables in posterior teeth were the prevention of voids (71.7%) and superior adaptation to cavity walls (72.9%), whereas saving time was considered less important (13.8%). Based on the subjective opinion of the dentists, the simplified filling technique seemed to deliver advantages compared with the methods used to date, particularly with regard to good cavity adaptation and ease of use. In conclusion, resin composites are the standard material type used for posterior restorations by general dental practitioners in Germany, and most dentists use flowable composites as liners.
Abstract:
The study of musical timbre by Bailes (2007) raises important questions concerning the relative ease of imaging complex perceptual attributes such as timbre, compared to more unidimensional attributes. I also raise the issue of individual differences in auditory imagery ability, especially for timbre.
Abstract:
More than 250,000 hip fractures occur annually in the United States, and the most common fracture location is the femoral neck, the weakest region of the femur. Hip fixation surgery is conducted to repair hip fractures by using a Kirschner (K-) wire as a temporary guide for permanent bone screws. Variation has been observed in the force required to extract the K-wire from the femoral head during surgery. It is hypothesized that a relationship exists between the K-wire pullout force and the bone quality at the site of extraction. Currently, bone mineral density (BMD) is used as a predictor of bone quality and strength. However, BMD characterizes the entire skeletal system and does not account for localized bone quality or factors such as lifestyle, nutrition, and drug use. A patient’s BMD may therefore not accurately describe the quality of bone at the site of fracture. This study aims to investigate a correlation between the force required to extract a K-wire from femoral head specimens and the quality of the bone. A procedure to measure K-wire pullout force was developed and tested with pig femoral head specimens. The procedure was then applied to 8 human osteoarthritic femoral head specimens, and the average pullout force for each specimen ranged from 563.32 ± 240.38 N to 1041.01 ± 346.84 N. The data exhibited substantial variation within and between specimens, and no statistically significant relationships were found between pullout force and patient age, weight, height, BMI, inorganic-to-organic matter ratio, or BMD. A new testing fixture was designed and manufactured to merge the clinical and research environments by enabling the physician to extract the K-wire from each bone specimen himself. The new device allows the physician to gather tactile feedback on the relative ease of extraction while the load history is recorded, as in the previous data acquisition procedure. Future work will include testing human bones with the new device to further investigate correlations for predicting bone quality.
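The correlation screening described above (pullout force against age, weight, height, BMI, inorganic-to-organic ratio, and BMD) amounts to a set of standard Pearson tests; the sketch below is illustrative only, with hypothetical column names and placeholder data rather than the study's measurements.

```python
# Illustrative sketch of the per-specimen correlation screening described above.
# Column names and values are hypothetical placeholders, not the study's dataset.
import pandas as pd
from scipy import stats

specimens = pd.DataFrame({
    "pullout_force_N": [563.3, 702.1, 845.6, 1041.0, 910.4, 788.9, 654.2, 980.7],
    "age_yr":          [68, 72, 75, 63, 70, 81, 66, 74],
    "bmd_g_cm2":       [0.71, 0.83, 0.78, 0.95, 0.88, 0.69, 0.74, 0.91],
})

for predictor in ["age_yr", "bmd_g_cm2"]:
    r, p = stats.pearsonr(specimens["pullout_force_N"], specimens[predictor])
    print(f"pullout force vs {predictor}: r = {r:.2f}, p = {p:.3f}")
```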
Abstract:
Anaerobic digestion of food scraps has the potential to accomplish waste minimization, energy production, and compost or humus production. At Bucknell University, removal of food scraps from the waste stream could reduce municipal solid waste transportation costs and landfill tipping fees, and provide methane and humus for use on campus. To determine the suitability of food waste produced at Bucknell for high-solids anaerobic digestion (HSAD), a year-long characterization study was conducted. Physical and chemical properties, waste biodegradability, and annual production of biodegradable waste were assessed. Bucknell University food and landscape waste was digested at pilot-scale for over a year to test performance at low and high loading rates, ease of operation at 20% solids, benefits of codigestion of food and landscape waste, and to provide digestate for studies to assess the curing needs of HSAD digestate. A laboratory-scale curing study was conducted to assess the curing duration required to reduce microbial activity, phytotoxicity, and odors to acceptable levels for subsequent use of humus. The characteristics of Bucknell University food and landscape waste were tested approximately weekly for one year, to determine chemical oxygen demand (COD), total solids (TS), volatile solids (VS), and biodegradability (from batch digestion studies). Fats, oil, and grease and total Kjeldahl nitrogen were also tested for some food waste samples. Based on the characterization and biodegradability studies, Bucknell University dining hall food waste is a good candidate for HSAD. During batch digestion studies Bucknell University food waste produced a mean of 288 mL CH4/g COD with a 95% confidence interval of 0.06 mL CH4/g COD. The addition of landscape waste for digestion increased methane production from both food and landscape waste; however, because the landscape waste biodegradability was extremely low the increase was small. Based on an informal waste audit, Bucknell could collect up to 100 tons of food waste from dining facilities each year. The pilot-scale high-solids anaerobic digestion study confirmed that digestion of Bucknell University food waste combined with landscape waste at a low organic loading rate (OLR) of 2 g COD/L reactor volume-day is feasible. During low OLR operation, stable reactor performance was demonstrated through monitoring of biogas production and composition, reactor total and volatile solids, total and soluble chemical oxygen demand, volatile fatty acid content, pH, and bicarbonate alkalinity. Low OLR HSAD of Bucknell University food waste and landscape waste combined produced 232 L CH4/kg COD and 229 L CH4/kg VS. When OLR was increased to high loading (15 g COD/L reactor volume-day) to assess maximum loading conditions, reactor performance became unstable due to ammonia accumulation and subsequent inhibition. The methane production per unit COD also decreased (to 211 L CH4/kg COD fed), although methane production per unit VS increased (to 272 L CH4/kg VS fed). The degree of ammonia inhibition was investigated through respirometry in which reactor digestate was diluted and exposed to varying concentrations of ammonia. Treatments with low ammonia concentrations recovered quickly from ammonia inhibition within the reactor. The post-digestion curing process was studied at laboratory-scale, to provide a preliminary assessment of curing duration. Digestate was mixed with woodchips and incubated in an insulated container at 35 °C to simulate full-scale curing self-heating conditions. Degree of digestate stabilization was determined through oxygen uptake rates, percent O2, temperature, volatile solids, and Solvita Maturity Index. Phytotoxicity was determined through observation of volatile fatty acid and ammonia concentrations. Stabilization of organics and elimination of phytotoxic compounds (after 10–15 days of curing) preceded significant reductions of volatile sulfur compounds (hydrogen sulfide, methanethiol, and dimethyl sulfide) after 15–20 days of curing. Bucknell University food waste has high biodegradability and is suitable for high-solids anaerobic digestion; however, it has a low C:N ratio which can result in ammonia accumulation under some operating conditions. The low biodegradability of Bucknell University landscape waste limits the amount of bioavailable carbon that it can contribute, making it unsuitable for use as a cosubstrate to increase the C:N ratio of food waste. Additional research is indicated to determine other cosubstrates with higher biodegradabilities that may allow successful HSAD of Bucknell University food waste at high OLRs. Some cosubstrates to investigate are office paper, field residues, or grease trap waste. A brief curing period of less than 3 weeks was sufficient to produce viable humus from digestate produced by low OLR HSAD of food and landscape waste.
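As a point of reference for the loading and yield figures quoted above, the organic loading rate (OLR) and specific methane yield follow directly from the feed COD, reactor volume, and measured gas volume; the short sketch below illustrates the arithmetic with hypothetical feed values, not data from the Bucknell reactors.

```python
# Hypothetical arithmetic illustrating the quantities used above (OLR and methane yield).
# Values are placeholders, not measurements from the Bucknell pilot reactor.
feed_cod_g_per_day = 200.0      # g COD fed per day (assumed)
reactor_volume_L   = 100.0      # working volume (assumed)
methane_L_per_day  = 46.4       # measured CH4 production (assumed)

olr = feed_cod_g_per_day / reactor_volume_L                          # g COD / L reactor volume-day
specific_yield = methane_L_per_day / (feed_cod_g_per_day / 1000.0)   # L CH4 / kg COD fed

print(f"OLR = {olr:.1f} g COD/L-day")                    # 2.0 g COD/L-day (low-OLR condition)
print(f"Yield = {specific_yield:.0f} L CH4/kg COD fed")  # ~232 L CH4/kg COD
```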
Abstract:
Laurentide glaciation during the early Pleistocene (~970 ka) dammed the southeast-flowing West Branch of the Susquehanna River (WBSR), scouring bedrock and creating 100-km-long glacial Lake Lesley near the Great Bend at Muncy, Pennsylvania (Ramage et al., 1998). Local drill logs and well data indicate that subsequent paleo-outwash floods and modern fluvial processes have deposited as much as 30 meters of alluvium in this area, but little is known about the valley fill architecture and the bedrock-alluvium interface. By gaining a greater understanding of the bedrock-alluvium interface, the project not only supplements existing depth-to-bedrock information but also provides information pertinent to the evolution of the Muncy Valley landscape. This project determined whether variations in the thickness of the valley fill were detectable using micro-gravity techniques to map the bedrock-alluvium interface. The gravity method was deemed appropriate due to the scale of the study area (~30 km2), ease of operation by a single person, and the available geophysical equipment. A LaCoste and Romberg Gravitron unit was used to collect gravitational field readings at 49 locations over 5 transects across the Muncy Creek and Susquehanna River valleys (approximately 30 km2), with at least two gravity base stations per transect. Precise latitude, longitude, and ground surface elevation at each location were measured using an OPUS-corrected Trimble RTK-GPS unit. Base stations were chosen based on ease of access due to the necessity of repeat measurements. Gravity measurement locations were selected and marked to provide easy access and repeat measurements. The gravimeter was returned to a base station within every two hours, and a looping procedure was used to determine drift and maximize confidence in the gravity measurements. A two-minute calibration reading at each station was used to minimize any tares in the data. The Gravitron digitally recorded finite-impulse-response-filtered gravity measurements every 20 seconds at each station. A measurement period of 15 minutes was used for each base station occupation and a minimum of 5 minutes at all other locations. Longer or multiple measurements were used at some sites if drift or other externalities (e.g., train or truck traffic) were affecting readings. The average, median, standard deviation, and 95% confidence interval were calculated for each station. Tidal, drift, latitude, free-air, Bouguer, and terrain corrections were then applied. The results show that the gravitational field decreases as alluvium thickness increases across the axes of the Susquehanna River and Muncy Creek valleys. However, the location of the gravity low does not correspond with the present-day location of the WBSR, suggesting that the WBSR may have been constrained along Bald Eagle Mountain by a glacial lobe originating from the Muncy Creek Valley to the northeast. Using a 3-D inversion model, the topography of the bedrock-alluvium interface was determined over the extent of the study area using a density contrast of -0.8 g/cm3. Our results are consistent with the bedrock geometry of the area and provide a low-cost, non-invasive, and efficient method for exploring the subsurface and supplementing existing well data.
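For reference, the reduction steps named above follow the standard gravity-correction formulas; the display below is a schematic summary under textbook constants, not the survey's exact processing parameters.

```latex
% Schematic free-air and simple Bouguer-slab terms
% (station elevation h in m, density \rho in g/cm^3, results in mGal)
g_{\mathrm{FA}} = g_{\mathrm{obs}} + 0.3086\,h, \qquad
\Delta g_{\mathrm{B}} = 2\pi G \rho h \approx 0.04193\,\rho\,h
```

Under the assumed density contrast of -0.8 g/cm3, a first-order slab estimate gives roughly 0.034 mGal of anomaly per meter of fill, so on the order of 30 m of alluvium produces about 1 mGal of residual anomaly; this is the intuition behind the 3-D inversion of the residuals for bedrock topography.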
Abstract:
DNA sequence copy number has been shown to be associated with cancer development and progression. Array-based Comparative Genomic Hybridization (aCGH) is a recent development that seeks to identify the copy number ratio at large numbers of markers across the genome. Due to experimental and biological variations across chromosomes and across hybridizations, current methods are limited to analyses of single chromosomes. We propose a more powerful approach that borrows strength across chromosomes and across hybridizations. We assume a Gaussian mixture model, with a hidden Markov dependence structure, and with random effects to allow for intertumoral variation, as well as intratumoral clonal variation. For ease of computation, we base estimation on a pseudolikelihood function. The method produces quantitative assessments of the likelihood of genetic alterations at each clone, along with a graphical display for simple visual interpretation. We assess the characteristics of the method through simulation studies and through analysis of a brain tumor aCGH data set. We show that the pseudolikelihood approach is superior to existing methods both in detecting small regions of copy number alteration and in accurately classifying regions of change when intratumoral clonal variation is present.
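As a schematic of the model class described above (the general form only, not the authors' exact parameterization), the observed log-ratio for clone t in hybridization c can be written with a hidden copy-number state that follows a Markov chain along the genome, a tumor-level random effect for intertumoral variation, and estimation based on a product of conditional densities (the pseudolikelihood) rather than the full joint likelihood:

```latex
% Schematic formulation only; k indexes copy-number classes (e.g., loss, neutral, gain)
y_{ct} \mid z_{ct}=k,\; u_c \;\sim\; \mathcal{N}\!\left(\mu_k + u_c,\ \sigma_k^2\right),
\qquad u_c \sim \mathcal{N}(0, \tau^2),
\qquad \Pr\!\left(z_{c,t+1}=l \mid z_{ct}=k\right) = a_{kl},

% and estimation maximizes a pseudolikelihood built from low-dimensional conditionals
\widetilde{L}(\theta) \;=\; \prod_{c}\prod_{t} f\!\left(y_{ct}\mid y_{c,t-1};\,\theta\right).
```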
Abstract:
Equivalence testing is growing in use in scientific research outside of its traditional role in the drug approval process. Largely due to its ease of use and its recommendation in United States Food and Drug Administration guidance, the most common statistical method for testing (bio)equivalence is the two one-sided tests procedure (TOST). Like classical point-null hypothesis testing, TOST is subject to multiplicity concerns as more comparisons are made. In this manuscript, a condition that bounds the family-wise error rate (FWER) under TOST is given. This condition then leads to a simple solution for controlling the FWER. Specifically, we demonstrate that if all pairwise comparisons of k independent groups are being evaluated for equivalence, then simply scaling the nominal Type I error rate down by (k - 1) is sufficient to maintain the family-wise error rate at the desired value or less. The resulting rule is much less conservative than the equally simple Bonferroni correction. An example of equivalence testing in a non-drug-development setting is given.
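To make the proposed correction concrete, the sketch below runs a mean-difference TOST for every pair of k independent groups at a working level of alpha/(k - 1), as described above; the equivalence margin and the data are hypothetical placeholders.

```python
# Sketch of all-pairwise TOST with the alpha/(k-1) adjustment described above.
# Data and equivalence margin are hypothetical placeholders.
import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
groups = {name: rng.normal(10.0, 1.0, size=30) for name in ["A", "B", "C", "D"]}
delta = 1.0                        # equivalence margin (assumed)
alpha = 0.05 / (len(groups) - 1)   # nominal level scaled down by (k - 1)

def tost(x, y, margin):
    """Two one-sided t-tests (unpooled SE, simple df) for |mean(x) - mean(y)| < margin."""
    diff = x.mean() - y.mean()
    se = np.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    df = len(x) + len(y) - 2                          # Welch-Satterthwaite df could be used instead
    p_lower = stats.t.sf((diff + margin) / se, df)    # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)   # H0: diff >= +margin
    return max(p_lower, p_upper)                      # TOST p-value

for a, b in itertools.combinations(groups, 2):
    p = tost(groups[a], groups[b], delta)
    print(f"{a} vs {b}: TOST p = {p:.3f} -> {'equivalent' if p < alpha else 'not shown'}")
```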
Abstract:
The exsolution of volatiles from magma exerts an important control on volcanic eruption styles. The nucleation, growth, and connectivity of bubbles during magma ascent provide the driving force behind eruptions, and the rate, volume, and ease of gas exsolution can affect eruptive activity. Volcanic plumes are the observable consequence of this magmatic degassing, and remote sensing techniques allow us to quantify changes in gas exsolution. However, until recently the methods used to measure volcanic plumes were not capable of detecting rapid changes in degassing on the scale of standard geophysical observations. The advent of the UV camera now makes high-sample-rate gas measurements possible. This type of dataset can then be compared to other volcanic observations to provide an in-depth picture of degassing mechanisms in the shallow conduit. The goals of this research are to develop a robust methodology for UV camera field measurements of volcanic plumes and to utilize these data in conjunction with seismoacoustic records to illuminate degassing processes. Field and laboratory experiments were conducted to determine the effects of imaging conditions, vignetting, exposure time, calibration technique, and filter usage on UV camera sulfur dioxide measurements. Using the best practices determined from these studies, a field campaign was undertaken at Volcán de Pacaya, Guatemala. Coincident plume sulfur dioxide measurements, acoustic recordings, and seismic observations were collected and analyzed jointly. The results provide insight into the small explosive features, variations in degassing rate, and plumbing system of this complex volcanic system. This research provides useful information for determining volcanic hazard at Pacaya and demonstrates the potential of the UV camera in multiparameter studies.
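For orientation, a common two-filter UV-camera retrieval computes an apparent absorbance from on-band and off-band images and converts it to SO2 column density via calibration cells; the sketch below illustrates that generic scheme under assumed values and is not the processing chain developed in this study.

```python
# Sketch of a generic two-filter UV-camera SO2 retrieval (apparent absorbance plus
# calibration-cell regression). Arrays and calibration values are hypothetical; this is
# not the study's calibrated processing chain.
import numpy as np

def apparent_absorbance(plume_on, sky_on, plume_off, sky_off):
    """On-band (~310 nm) absorbance corrected with the off-band (~330 nm) image."""
    aa_on = -np.log10(plume_on / sky_on)
    aa_off = -np.log10(plume_off / sky_off)
    return aa_on - aa_off

# Hypothetical calibration: cells of known SO2 column density vs. measured absorbance
cell_columns = np.array([0.0, 475.0, 1010.0])     # ppm·m (assumed cell values)
cell_absorbance = np.array([0.0, 0.11, 0.23])     # measured apparent absorbance (assumed)
slope, intercept = np.polyfit(cell_absorbance, cell_columns, 1)

# Convert placeholder plume/background images to SO2 column density
sky_on, sky_off = np.full((4, 4), 800.0), np.full((4, 4), 900.0)
plume_on, plume_off = np.full((4, 4), 560.0), np.full((4, 4), 880.0)
aa_image = apparent_absorbance(plume_on, sky_on, plume_off, sky_off)
so2_column = slope * aa_image + intercept          # ppm·m
print(so2_column.mean())
```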
Abstract:
File system security is fundamental to the security of UNIX and Linux systems, since in these systems almost everything is in the form of a file. To protect system files and other sensitive user files from unauthorized access, different organizations choose and use particular security schemes in their computer systems. A file system security model provides a formal description of a protection system. Each security model is associated with specified security policies which focus on one or more of the security principles: confidentiality, integrity, and availability. A security policy is not only about “who” can access an object, but also about “how” a subject can access an object. To enforce the security policies, each access request is checked against the specified policies to decide whether it is allowed or rejected. The current protection schemes in UNIX/Linux systems focus on access control. Besides the basic access control scheme of the system itself, which includes permission bits, the setuid and seteuid mechanisms, and the root account, there are other protection models, such as Capabilities, Domain Type Enforcement (DTE), and Role-Based Access Control (RBAC), supported and used in certain organizations. These models protect the confidentiality of the data directly. The integrity of the data is protected indirectly by only allowing trusted users to operate on the objects. The access control decisions of these models depend on either the identity of the user or the attributes of the process the user can execute, and on the attributes of the objects. Adoption of these sophisticated models has been slow; this is likely due to the enormous complexity of specifying controls over a large file system and the need for system administrators to learn a new paradigm for file protection. We propose a new security model: the file system firewall. It adapts the familiar network firewall protection model, used to control the data that flows between networked computers, to file system protection. This model can support access control decisions based on any system-generated attributes of the access request, e.g., the time of day. Access control decisions are not based on a single entity, such as the account in traditional discretionary access control or the domain name in DTE. In the file system firewall, access decisions are made based on situations involving multiple entities. A situation is programmable with predicates on the attributes of the subject, the object, and the system. The file system firewall specifies the appropriate actions for these situations. We implemented a prototype of the file system firewall on SUSE Linux. Preliminary results of performance tests on the prototype indicate that the runtime overhead is acceptable. We compared the file system firewall with TE in SELinux to show that the firewall model can accommodate many other access control models. Finally, we show the ease of use of the firewall model. When the firewall is restricted to a specified part of the system, all other resources are unaffected. This enables a relatively smooth adoption. This fact, and the fact that it is a familiar model to system administrators, will facilitate adoption and correct use. The user study we conducted on traditional UNIX access control, SELinux, and the file system firewall confirmed this. Beginner users found the firewall easier to use and faster to learn than the traditional UNIX access control scheme and SELinux.
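A minimal sketch of the rule style this model implies is given below: each rule is a predicate over attributes of the subject, the object, and the system (e.g., time of day), and the first matching rule decides the access. The attribute names and example policy are hypothetical illustrations, not the prototype's actual policy language.

```python
# Minimal sketch of firewall-style file access rules: predicates over subject, object,
# and system attributes, evaluated in order. Attribute names and the example policy are
# hypothetical; this is not the SUSE Linux prototype's policy language.
from dataclasses import dataclass, field
from datetime import time
from typing import Callable, List

@dataclass
class Request:
    user: str
    path: str
    operation: str          # e.g. "read", "write"
    time_of_day: time       # system-generated attribute

@dataclass
class Rule:
    predicate: Callable[[Request], bool]
    action: str             # "allow" or "deny"

@dataclass
class FileSystemFirewall:
    rules: List[Rule] = field(default_factory=list)
    default: str = "deny"

    def decide(self, req: Request) -> str:
        for rule in self.rules:               # first matching rule wins
            if rule.predicate(req):
                return rule.action
        return self.default

fw = FileSystemFirewall(rules=[
    # Deny writes to /etc outside business hours, regardless of user identity.
    Rule(lambda r: r.path.startswith("/etc") and r.operation == "write"
         and not (time(8) <= r.time_of_day <= time(18)), "deny"),
    # Allow everything else for the backup account.
    Rule(lambda r: r.user == "backup", "allow"),
])

print(fw.decide(Request("backup", "/etc/passwd", "write", time(23, 0))))       # deny
print(fw.decide(Request("backup", "/home/alice/a.txt", "read", time(23, 0))))  # allow
```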
Abstract:
The Mike Horse mine, in the Huddelston mining district, is fifty-two miles northwest of Helena, Montana. The mine was discovered in 1898 by Joseph Heitmiller. There was only minor production from the date of discovery until 1915, the main drawback being the lack of a good road.
Abstract:
BACKGROUND Recommendations from international task forces on geriatric assessment emphasize the need for research including validation of cancer-specific geriatric assessment (C-SGA) tools in oncological settings. The objective of this study was to evaluate the feasibility of the SAKK Cancer-Specific Geriatric Assessment (C-SGA) in clinical practice. METHODS A cross-sectional study of cancer patients >=65 years old (N = 51) with pathologically confirmed cancer presenting for initiation of chemotherapy treatment (07/01/2009-03/31/2011) at two oncology departments in Swiss cantonal hospitals: Kantonsspital Graubunden (KSGR, N = 25) and Kantonsspital St. Gallen (KSSG, N = 26). Data were collected using three instruments: the SAKK C-SGA plus physician and patient evaluation forms. The SAKK C-SGA includes six measures covering five geriatric assessment domains (comorbidity, function, psychosocial, nutrition, cognition) using a mix of medical record abstraction (MRA) and patient interview. Five individual domain scores and one overall SAKK C-SGA score were calculated and dichotomized as below/above literature-based cut-offs. The SAKK C-SGA was evaluated by patient- and physician-estimated time to complete, ease of completion, and difficult or unanswered questions. RESULTS Time to complete the patient questionnaire was considered acceptable by almost all (>=96%) patients and physicians. Patients reported slightly shorter times to complete the questionnaire than physicians (17.33 +/- 7.34 vs. 20.59 +/- 6.53 minutes, p = 0.02). Both groups rated the patient questionnaire as easy/fairly easy to complete (91% vs. 84%, respectively, p = 0.14), with few difficult or unanswered questions. The MRA took on average 8.32 +/- 4.72 minutes to complete. All physicians (100%) considered the time to complete the MRA acceptable, and 96% rated it as easy/fairly easy to complete. The two study site populations differed on health-related characteristics (excellent/good physician-rated general health: KSGR 71% vs. KSSG 32%, p = 0.007). The overall mean C-SGA score was 2.4 +/- 1.12. Patients at KSGR had lower C-SGA scores (2.00 +/- 1.19 vs. 2.81 +/- 0.90, p = 0.009) and a smaller proportion (28% vs. 65%, p = 0.008) was above the C-SGA cut-off score compared to KSSG. CONCLUSIONS These results suggest the SAKK C-SGA is a feasible, practical tool for use in clinical practice. It demonstrated discriminative ability based on objective geriatric assessment measures, but additional investigations on its use for clinical decision-making are warranted. The SAKK C-SGA also provides important usable domain information for interventions to optimize outcomes in older cancer patients.
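The scoring logic described above (domain measures dichotomized at literature-based cut-offs and combined into one overall score) can be illustrated as follows; the domain names follow the abstract, but the cut-off values and the example patient are hypothetical placeholders, not the SAKK C-SGA's published thresholds.

```python
# Illustration of dichotomizing domain measures at cut-offs and summing the flags into
# an overall score. Cut-off values and the example patient are hypothetical.
domain_cutoffs = {          # a domain is flagged (1) when its measure exceeds the cut-off
    "comorbidity": 2.0,
    "function": 1.0,
    "psychosocial": 1.0,
    "nutrition": 1.0,
    "cognition": 1.0,
}

patient = {"comorbidity": 3.0, "function": 0.5, "psychosocial": 2.0,
           "nutrition": 0.0, "cognition": 1.5}

domain_flags = {d: int(patient[d] > cut) for d, cut in domain_cutoffs.items()}
overall_score = sum(domain_flags.values())
print(domain_flags, "overall score:", overall_score)
```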
Abstract:
The ectoparasitic mite Varroa destructor acting as a virus vector constitutes a central mechanism for losses of managed honey bee, Apis mellifera, colonies. This creates demand for an easy, accurate and cheap diagnostic tool to estimate the impact of viruliferous mites in the field. Here we evaluated whether the clinical signs of the ubiquitous and mite-transmitted deformed wing virus (DWV) can be predictive markers of winter losses. In fall and winter 2007/2008, A.m. carnica workers with apparent wing deformities were counted daily in traps installed on 29 queenright colonies. The data show that colonies which later died had a significantly higher proportion of workers with wing deformities than did those which survived. There was a significant positive correlation between V. destructor infestation levels and the number of workers displaying DWV clinical signs, further supporting the mite's impact on virus infections at the colony level. A logistic regression model suggests that colony size, the number of workers with wing deformities and V. destructor infestation levels constitute predictive markers for winter colony losses in this order of importance and ease of evaluation.
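The logistic regression described above, with colony size, the number of deformed-wing workers, and V. destructor infestation as predictors of winter loss, can be fitted as in the sketch below; the data frame and column names are hypothetical stand-ins, not the 2007/2008 field records.

```python
# Sketch of a logistic regression for winter colony loss with the predictors named above.
# Column names and data are hypothetical stand-ins for the study's field records.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 29
colonies = pd.DataFrame({
    "colony_size": rng.normal(15000, 4000, n),        # adult bees (assumed scale)
    "deformed_wing_workers": rng.poisson(20, n),       # trapped workers with wing deformities
    "varroa_infestation": rng.uniform(0, 10, n),       # mites per 100 bees (assumed scale)
})
colonies["died"] = rng.integers(0, 2, n)               # placeholder winter-loss outcome

X = sm.add_constant(colonies[["colony_size", "deformed_wing_workers", "varroa_infestation"]])
model = sm.Logit(colonies["died"], X).fit(disp=False)
print(model.summary())
```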
Abstract:
Discrepancies in finite-element model predictions of bone strength may be attributed to the simplified modeling of bone as an isotropic structure due to the resolution limitations of clinical-level Computed Tomography (CT) data. The aim of this study is to calculate the preferential orientations of bone (the principal directions) and the extent to which bone is deposited more in one direction compared to another (the degree of anisotropy). Using 100 femoral trabecular samples, the principal directions and degree of anisotropy were calculated with a Gradient Structure Tensor (GST) and a Sobel Structure Tensor (SST) using clinical-level CT. The results were compared against those calculated with the gold-standard Mean-Intercept-Length (MIL) fabric tensor using micro-CT. There was no significant difference between the GST and SST in the calculation of the main principal direction (median error=28°), and the error was inversely correlated with the degree of transverse isotropy (r=−0.34, p<0.01). The degree of anisotropy measured using the structure tensors was weakly correlated with the MIL-based measurements (r=0.2, p<0.001). Combining the principal directions with the degree of anisotropy resulted in a significant increase in the correlation of the tensor distributions (r=0.79, p<0.001). Both structure tensors were robust against simulated noise, kernel sizes, and bone volume fraction. We recommend the use of the GST because of its computational efficiency and ease of implementation. This methodology shows promise for predicting the structural anisotropy of bone in areas with a high degree of anisotropy and may improve the in vivo characterization of bone.
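A gradient structure tensor is straightforward to compute from an image volume: average the smoothed outer products of the intensity gradient and take the eigen-decomposition, with the eigenvectors giving the principal directions and an eigenvalue ratio giving a degree-of-anisotropy measure. The sketch below is a generic GST implementation under assumed parameters, not the paper's exact pipeline (the Sobel variant would replace np.gradient with Sobel derivatives).

```python
# Generic gradient structure tensor (GST) sketch: principal direction and a simple
# anisotropy measure from a grayscale CT volume. The smoothing kernel size is assumed.
import numpy as np
from scipy.ndimage import gaussian_filter

def structure_tensor(volume, sigma=1.5):
    gz, gy, gx = np.gradient(volume.astype(float))
    grads = [gx, gy, gz]
    T = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            # average the smoothed gradient outer products over the whole sample
            T[i, j] = gaussian_filter(grads[i] * grads[j], sigma).mean()
    return T

volume = np.random.rand(32, 32, 32)            # placeholder trabecular sample
T = structure_tensor(volume)
eigvals, eigvecs = np.linalg.eigh(T)            # eigenvalues in ascending order
main_direction = eigvecs[:, 0]                  # least gradient variation ~ dominant fabric axis
degree_of_anisotropy = eigvals[-1] / eigvals[0] # simple eigenvalue-ratio measure
print(main_direction, degree_of_anisotropy)
```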
Abstract:
Both deepening sleep and evolving epileptic seizures are associated with increasing slow-wave activity. Larger-scale functional networks derived from electroencephalogram indicate that in both transitions dramatic changes of communication between brain areas occur. During seizures these changes seem to be 'condensed', because they evolve more rapidly than during deepening sleep. Here we set out to assess quantitatively functional network dynamics derived from electroencephalogram signals during seizures and normal sleep. Functional networks were derived from electroencephalogram signals from wakefulness, light and deep sleep of 12 volunteers, and from pre-seizure, seizure and post-seizure time periods of 10 patients suffering from focal onset pharmaco-resistant epilepsy. Nodes of the functional network represented electrical signals recorded by single electrodes and were linked if there was non-random cross-correlation between the two corresponding electroencephalogram signals. Network dynamics were then characterized by the evolution of global efficiency, which measures ease of information transmission. Global efficiency was compared with relative delta power. Global efficiency significantly decreased both between light and deep sleep, and between pre-seizure, seizure and post-seizure time periods. The decrease of global efficiency was due to a loss of functional links. While global efficiency decreased significantly, relative delta power increased except between the time periods wakefulness and light sleep, and pre-seizure and seizure. Our results demonstrate that both epileptic seizures and deepening sleep are characterized by dramatic fragmentation of larger-scale functional networks, and further support the similarities between sleep and seizures.
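Global efficiency, the summary measure used above, is the average inverse shortest-path length over all node pairs; the sketch below builds a functional network by thresholding a channel cross-correlation matrix and computes the measure with networkx. The fixed threshold and the random data are placeholders for the study's test of non-random cross-correlation.

```python
# Sketch: functional network from thresholded channel cross-correlations, then global
# efficiency. The fixed threshold stands in for the study's non-random correlation test.
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
eeg = rng.normal(size=(19, 2000))            # 19 channels x samples (placeholder EEG)

corr = np.corrcoef(eeg)                      # channel-by-channel cross-correlation
threshold = 0.3                              # assumed cutoff (placeholder)
adjacency = (np.abs(corr) > threshold) & ~np.eye(len(corr), dtype=bool)

G = nx.from_numpy_array(adjacency.astype(int))
print("global efficiency:", nx.global_efficiency(G))
```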
Abstract:
BACKGROUND The majority of radiological reports lack a standard structure. Even within a specialized area of radiology, each report has its individual structure with regard to detail and order, often containing non-relevant information that the referring physician is not interested in. For gathering relevant clinical key parameters in an efficient way, or to support long-term therapy monitoring, structured reporting might be advantageous. OBJECTIVE Despite new technologies in medical information systems, medical reporting is still not dynamic. To improve the quality of communication in radiology reports, a new structured reporting system was developed for abdominal aortic aneurysms (AAA), intended to enhance professional communication by providing the pertinent clinical information in a predefined standard. METHODS A current-state analysis was performed within the departments of radiology and vascular surgery by developing a Technology Acceptance Model. The SWOT (strengths, weaknesses, opportunities, and threats) analysis focused on optimization of the radiology reporting of patients with AAA. Definition of clinical parameters was achieved by interviewing experienced clinicians in radiology and vascular surgery. For evaluation, a focus group (4 radiologists) reviewed the reports of 16 patients. The usability and reliability of the method was validated in a real-world test environment in the field of radiology. RESULTS A Web-based application for radiological "structured reporting" (SR) was successfully standardized for AAA. Its organization comprises three main categories: characteristics of the pathology and adjacent anatomy, measurements, and additional findings. Different graphical widgets (eg, drop-down menus) in each category facilitate predefined data entry. Measurement parameters shown in a diagram can be defined for clinical monitoring and consulted for quick decisions. Figures for optional use to guide and standardize the reporting are embedded. Analysis of variance shows a decreased average time required to obtain a radiological report with SR compared to free-text reporting (P=.0001). Questionnaire responses confirm a high acceptance rate by the users. CONCLUSIONS The new SR system may support efficient radiological reporting for initial diagnosis and follow-up of AAA. A perceived advantage of our SR platform is its ease of use, which may lead to more accurate decision support. The new system can communicate not only with clinical partners but also with Radiology Information Systems and Hospital Information Systems.
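One way to picture the three-category report structure described above is as a typed data model whose constrained fields back the drop-down widgets; the sketch below is a hypothetical illustration, not the schema of the deployed Web application.

```python
# Hypothetical data model illustrating the three report categories described above
# (pathology/anatomy characteristics, measurements, additional findings). Field names
# and value sets are illustrative, not the deployed application's schema.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class AneurysmShape(Enum):        # example of a constrained drop-down value set
    FUSIFORM = "fusiform"
    SACCULAR = "saccular"

@dataclass
class Characteristics:
    shape: AneurysmShape
    thrombus_present: bool
    involves_iliac_arteries: bool

@dataclass
class Measurements:
    max_diameter_mm: float        # key parameter tracked in the follow-up diagram
    length_mm: Optional[float] = None

@dataclass
class StructuredAAAReport:
    characteristics: Characteristics
    measurements: Measurements
    additional_findings: List[str] = field(default_factory=list)

report = StructuredAAAReport(
    Characteristics(AneurysmShape.FUSIFORM, thrombus_present=True, involves_iliac_arteries=False),
    Measurements(max_diameter_mm=58.0),
    additional_findings=["incidental renal cyst"],
)
print(report.measurements.max_diameter_mm)
```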