911 results for Technicolor and Composite Models


Relevance:

100.00%

Publisher:

Abstract:

In this research the supportive role of the family in coping with everyday problems was studied using two large data sets. The results show the importance of the structural aspect of social support. Mapping individual preferences to support referents showed the crucial role of spouse and parents in solving everyday problems. The individual choices of particular support referents could be predicted fairly accurately from the composition of the family, in both categorical regression and logit models. A wide range of socioeconomic, social and demographic indicators predicted the criterion variable far less accurately; residence in small cities and indicators of extreme occupational strata were particularly predictive of the choice of support referent. The supportive role of the family was also traced in the personal projects of young adults, which were seen as ecological, natural and dynamic middle-level units of analysis of personality. Different aspects of personal projects, including reliance on social support referents, turned out to be highly interrelated. On the one hand, expectations of support were determined by the content of the project; on the other, expected social support also influenced the content of the project. Sivuha sees this as one of the ways others can enter self-structures.
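A logit model of the kind mentioned above can be sketched in a few lines. The sketch below is illustrative only: the data are synthetic, the outcome (spouse chosen as support referent or not) and the family-composition indicators are invented stand-ins for the study's variables, and the fit uses plain Newton-Raphson on the logistic likelihood.

```python
import numpy as np

# Synthetic illustration of a logit model for support-referent choice.
# Outcome: 1 = spouse chosen as support referent, 0 = someone else.
# Predictors (invented): living with a spouse, having living parents.
rng = np.random.default_rng(0)
n = 200
married = rng.integers(0, 2, n)
parents_alive = rng.integers(0, 2, n)
# Invented generating model: married respondents overwhelmingly pick the spouse.
true_logit = -2.0 + 4.0 * married + 0.5 * parents_alive
p = 1 / (1 + np.exp(-true_logit))
y = (rng.random(n) < p).astype(float)

# Fit the logit MLE by Newton-Raphson (equivalent to IRLS).
X = np.column_stack([np.ones(n), married, parents_alive])
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (y - mu)                      # score vector
    H = (X * (mu * (1 - mu))[:, None]).T @ X   # observed information
    beta += np.linalg.solve(H, grad)

pred = (1 / (1 + np.exp(-X @ beta)) > 0.5).astype(float)
accuracy = (pred == y).mean()
```

With family composition driving the outcome this strongly, the fitted coefficient on `married` is large and positive and in-sample accuracy is high, mirroring the paper's point that family composition predicts referent choice well.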


Class II cavities were prepared in extracted lower molars and filled and cured in three 2-mm increments using a metal matrix. Three composites (Spectrum TPH A4, Ceram X mono M7 and Tetric Ceram A4) were cured with both the SmartLite PS LED LCU and the Spectrum 800 continuous-cure halogen LCU using curing cycles of 10, 20 and 40 seconds. Each increment was cured before the next was added. After a seven-day incubation period, the composite specimens were removed from the teeth, embedded in self-curing resin and ground to half the orofacial width. Knoop microhardness was determined 100, 200, 500, 1000, 1500, 2500, 3500, 4500 and 5500 µm from the occlusal surface at distances of 150 µm and 1000 µm from the metal matrix. The total degree of polymerization of a composite specimen for any given curing time and curing light was determined by calculating the area under the hardness curve. Hardness values 150 µm from the metal matrix never reached maximum values and were generally lower than those 1000 µm from the matrix. The hardest composite was usually found between 200 µm and 1000 µm from the occlusal surface. For every composite-curing time combination, microhardness increased at the top of each increment (measurements at 500, 2500 and 4500 µm) and decreased towards the bottom of each increment (measurements at 1500, 3500 and 5500 µm). Longer curing times generally yielded harder composite samples. Spectrum TPH was the only composite showing a satisfactory degree of polymerization for all three curing times and both LCUs. Multiple linear regression showed that only curing time (p < 0.001) and composite material (p < 0.001) had a significant association with the degree of polymerization. The degree of polymerization achieved by the LED LCU was not significantly different from that achieved by the halogen LCU (p = 0.54).
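The "area under the hardness curve" measure used above is a straightforward trapezoidal integration of Knoop hardness over depth. A minimal sketch, using the study's measurement depths but invented hardness values:

```python
import numpy as np

# Measurement depths from the abstract; hardness values are invented.
depth_um = np.array([100, 200, 500, 1000, 1500, 2500, 3500, 4500, 5500], dtype=float)
khn = np.array([52, 58, 60, 57, 48, 55, 46, 53, 41], dtype=float)

# Trapezoidal rule over unevenly spaced depths: sum of (mean of adjacent
# hardness values) * (depth interval).
auc = np.sum((khn[1:] + khn[:-1]) / 2 * np.diff(depth_um))

# Dividing by the depth span gives a mean hardness, which is easier to
# compare across specimens measured over the same range.
mean_hardness = auc / (depth_um[-1] - depth_um[0])
```

Comparing `auc` (or `mean_hardness`) across curing times and lights is what the abstract's regression analysis operates on.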


AIM: To assess survival rates and complications of root-filled teeth restored with or without post-and-core systems over a mean observation period of ≥4 years. METHODOLOGY: A total of 325 single- and multirooted teeth in 183 subjects treated in a private practice were root filled and restored with either a cast post-and-core or with a prefabricated titanium post and composite core. Root-filled teeth without post-retained restorations served as controls. The restored teeth served as abutments for single-unit metal-ceramic or composite crowns or fixed bridges. Teeth supporting cantilever bridges, overdentures or telescopic crowns were excluded. RESULTS: Seventeen teeth in 17 subjects were lost to follow-up (17/325: 5.2%). The mean observation period was 5.2 ± 1.8 (SD) years for restorations with titanium posts, 6.2 ± 2.0 (SD) years for cast post-and-cores and 4.4 ± 1.7 (SD) years for teeth without posts. Overall, 54% of build-ups included the incorporation of a titanium post and 26.5% the cementation of a cast post-and-core. The remaining 19.5% of the teeth were restored without intraradicular retention. The adjusted 5-year tooth survival rate amounted to 92.5% for teeth restored with titanium posts, 97.1% for teeth restored with cast post-and-cores and 94.3% for teeth without post restorations. The most frequent complications included root fracture (6.2%), recurrent caries (1.9%), post-treatment periradicular disease (1.6%) and loss of retention (1.3%). CONCLUSION: Provided that high-quality root canal treatment and restorative protocols are implemented, high survival and low complication rates of single- and multirooted root-filled teeth used as abutments for fixed restorations can be expected after a mean observation period of ≥4 years.
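An "adjusted 5-year survival rate" of the kind reported above is typically a product-limit (Kaplan-Meier) estimate that accounts for teeth censored before 5 years. A minimal hand-rolled sketch with invented follow-up data (event = 1 means the tooth was lost; event = 0 means observation ended with the tooth still in place):

```python
# Invented follow-up times in years and event indicators for ten teeth.
times  = [1.0, 2.5, 3.0, 4.0, 4.5, 5.0, 5.5, 6.0, 6.5, 7.0]
events = [0,   1,   0,   1,   0,   0,   1,   0,   0,   0]

def km_survival(times, events, t):
    """Product-limit estimate of S(t): multiply (1 - d_i/n_i) over event times <= t."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    for i in order:
        if times[i] > t:
            break
        if events[i]:
            s *= 1 - 1 / at_risk   # one failure among those still at risk
        at_risk -= 1               # this tooth leaves the risk set either way
    return s

five_year = km_survival(times, events, 5.0)
```

With the two events before year 5 occurring at risk-set sizes 9 and 7, the estimate is (8/9)·(6/7) ≈ 0.762, i.e. a 76.2% adjusted 5-year survival for this toy cohort.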


Objectives: The aim of this study was to examine the effect of pre-warming composite on microhardness and marginal adaptation. Methods: Ninety-six identical Class II cavities were prepared in extracted human molars and filled/cured in three 2-mm increments using a metal matrix. Two composites (Tetric EvoCeram (Ivoclar Vivadent) and ELS (Saremco)) were cured with an LED curing unit (Bluephase (Ivoclar Vivadent)) using curing cycles of 20 and 40 seconds. The composite was used at room temperature or pre-warmed to 54.5°C (Calset (AdDent)). Twelve teeth were filled for every composite-curing time-composite temperature combination. The teeth were thermocycled (1000 cycles at 5°C and 55°C) and then stored at 37°C for seven days. Dye penetration (basic fuchsine 5% for 8 hours) was measured using a score scale. Knoop microhardness was determined 100, 200, 500, 1000, 1500, 2500, 3500, 4500 and 5500 µm from the occlusal surface at distances of 150 and 1000 µm from the metal matrix. The total degree of polymerization of a composite specimen was determined by calculating the area under the hardness curve. Results: Statistical analyses showed no difference in marginal adaptation (p > 0.05). Hardness values at 150 µm from the matrix were lower than those at 1000 µm. Microhardness increased at the top of each increment and decreased towards the bottom of each increment. Longer curing times resulted in harder composite samples. Multiple linear regression showed that only curing time (p < 0.001) and composite material (p < 0.001) had a significant association with the degree of polymerization. The degree of polymerization was not influenced by pre-warming the composite to 54.5°C (p = 0.486). Conclusion: Polymerization time cannot be reduced by pre-warming the composite to 54.5°C. Marginal adaptation is not compromised by pre-warming the composite.
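The multiple linear regression described above can be sketched with ordinary least squares. The data below are entirely synthetic, generated so that curing time and material matter while pre-warming does not, matching the qualitative finding; the coefficients and noise level are invented:

```python
import numpy as np

# Synthetic stand-in for the study's design: 2 curing times x 2 materials
# x 2 temperatures. Degree of polymerization (dop) values are invented.
rng = np.random.default_rng(1)
n = 48
cure_s   = rng.choice([20.0, 40.0], n)   # curing time in seconds
material = rng.choice([0.0, 1.0], n)     # 0 = Tetric EvoCeram, 1 = ELS
warmed   = rng.choice([0.0, 1.0], n)     # 1 = pre-warmed composite
# Invented generating model: warming has a true coefficient of zero.
dop = 100 + 1.5 * cure_s + 20.0 * material + 0.0 * warmed + rng.normal(0, 5, n)

# OLS fit: dop ~ intercept + cure_s + material + warmed.
X = np.column_stack([np.ones(n), cure_s, material, warmed])
coef, *_ = np.linalg.lstsq(X, dop, rcond=None)
```

In a real analysis the p-values for each coefficient would then be computed from the residual variance; here the fitted warming coefficient simply comes out near zero while time and material are recovered.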


Echicetin, a heterodimeric protein from the venom of Echis carinatus, binds to platelet glycoprotein Ib (GPIb) and thereby inhibits platelet aggregation or agglutination induced by various platelet agonists acting via GPIb. The amino acid sequence of the beta subunit of echicetin has been reported and belongs to the recently identified snake venom subclass of the C-type lectin protein family. Echicetin alpha and beta subunits were purified, and N-terminal sequence analysis provided direct evidence that the purified protein was echicetin. The paper presents the complete amino acid sequence of the alpha subunit and computer models of the alpha and beta subunits. The sequence of alpha echicetin is highly similar to the alpha and beta chains of various heterodimeric and homodimeric C-type lectins. Neither the fully reduced and alkylated alpha subunit nor the beta subunit of echicetin inhibited platelet agglutination induced by von Willebrand factor-ristocetin or alpha-thrombin. Earlier reports of inhibitory activity of the reduced and alkylated echicetin beta subunit might have been due to partial reduction of the protein.


Background: The Swiss government decided to freeze new accreditations for physicians in private practice in Switzerland, based on the assumption that demand-induced health care spending may be cut by limiting the supply of care. This legislation initiated an ongoing controversial public debate in Switzerland. The aim of this study was therefore to determine socio-demographic and health system-related factors of per capita consultation rates with primary care physicians in the multicultural population of Switzerland. Methods: The data were derived from the complete claims data of Swiss health insurers for 2004 and included 21.4 million consultations provided by 6564 Swiss primary care physicians on a fee-for-service basis. Socio-demographic data were obtained from the Swiss Federal Statistical Office. Utilisation-based health service areas were created and used as observational units for statistical procedures. Multivariate and hierarchical models were applied to analyse the data. Results: The models allowed the definition of 1018 primary care service areas with a median population of 3754 and an average per capita consultation rate of 2.95 per year. Statistical models yielded significant effects for various geographical, socio-demographic and cultural factors. The regional density of physicians in independent practice was also significantly associated with annual consultation rates, indicating an increase of 0.10 consultations per capita for each additional primary care physician per 10,000 inhabitants. Considerable differences across Swiss language regions were observed in the supply of ambulatory health resources provided by primary care physicians, specialists, or hospital-based ambulatory care. Conclusion: The study documents large small-area variation in the utilisation and provision of health care resources in Switzerland. Effects of physician density appeared to be strongly related to Swiss language regions and may be rooted in the different cultural backgrounds of the served populations.
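The reported density effect (about 0.10 additional consultations per capita per extra physician per 10,000 inhabitants) is the slope of a regression across service areas. A toy single-predictor version, with the study's area count and mean rate but otherwise synthetic data (the real analysis was multivariate and hierarchical):

```python
import numpy as np

# Synthetic service-area data: 1018 areas, mean rate 2.95/year as in the
# study; densities, noise level, and the linear form are all invented.
rng = np.random.default_rng(2)
n_areas = 1018
density = rng.uniform(2, 12, n_areas)   # physicians per 10,000 inhabitants
rate = 2.95 + 0.10 * (density - density.mean()) + rng.normal(0, 0.2, n_areas)

# OLS of consultation rate on physician density.
X = np.column_stack([np.ones(n_areas), density])
(b0, b1), *_ = np.linalg.lstsq(X, rate, rcond=None)
# b1 estimates the change in annual consultations per capita for one
# additional primary care physician per 10,000 inhabitants.
```

By construction `b1` recovers a value close to 0.10; in the study this slope survives adjustment for geographical, socio-demographic and cultural covariates.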


BACKGROUND: Several epidemiological studies show that inhalation of particulate matter may cause increased pulmonary morbidity and mortality. Of particular interest are ultrafine particles, which are especially toxic. In addition, more and more nanoparticles are being released into the environment, yet the potential health effects of these nanoparticles are still unknown. OBJECTIVES: To avoid particle toxicity studies in animals, many cell culture models have been developed over the past years. METHODS: This review focuses on the most commonly used in vitro epithelial airway and alveolar models for studying particle-cell interactions and particle toxicity, and highlights the advantages and disadvantages of the different models. RESULTS/CONCLUSION: There are many lung cell culture models, but none of them is perfect. They can nevertheless be valuable tools for basic research and toxicity testing. The focus here is on 3D and co-culture models, which seem to be more realistic than monocultures.


The effects of climate change are expected to be very severe in arid regions. The Sonora River Basin, in the northwestern Mexican state of Sonora, is likely to be severely affected. Some of the anticipated effects include precipitation variability, intense storm events, higher overall temperatures, and less available water. In addition, the population of Sonora, specifically the capital city of Hermosillo, is growing at a rate of 1.5% per year and currently stands near 700,000. With the reduction in water availability and an increase in population, Sonora is expected to experience severe water resource issues in the near future. In anticipation of these changes, research is being conducted to improve water management in the Sonora River Basin. This research involves participatory modeling techniques designed to increase water managers' awareness of hydrological models and their use as integrative tools for water resource management. The present study was conducted as preliminary research for the participatory modeling grant in order to gather useful information on the population being studied. This thesis presents research from thirty-four in-depth interviews with water managers, citizens, and agricultural producers in Sonora, Mexico. Data were collected on perceptions of water quantity and quality in the basin, thoughts on current water management practices, perceptions of climate change and its management, and experience with, knowledge of, and trust in hydrological models as water management tools. Results showed that the majority of interviewees thought there was not enough water to satisfy their daily needs. Most respondents also agreed that the available water was of good quality but that current management of water resources was ineffective. Nearly all interviewees were aware of climate change and thought it to be anthropogenic. Many reported experiencing higher temperatures, precipitation changes, and greater water scarcity, and attributed those fluctuations to climate change. 65% of interviewees were at least somewhat familiar with hydrological models, though only 28% had ever used them or their output. Even though model usage was low, 100% of respondents believed hydrological models to be very useful water management tools. Understanding how water, climate change, and hydrological models are perceived by this population is essential to improving water management practices in the face of climate change.


To mitigate greenhouse gas (GHG) emissions and reduce U.S. dependence on imported oil, the United States (U.S.) is pursuing several options to create biofuels from renewable woody biomass (hereafter referred to as "biomass"). Because of the distributed nature of biomass feedstock, the cost and complexity of biomass recovery operations pose significant challenges that hinder increased biomass utilization for energy production. To facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization and to tap unused forest residues, it is proposed to develop biofuel supply chain models based on optimization and simulation approaches. The biofuel supply chain is structured around four components: biofuel facility locations and sizes, biomass harvesting/forwarding, transportation, and storage. A Geographic Information System (GIS) based approach is proposed as a first step for selecting potential facility locations for biofuel production from forest biomass based on a set of evaluation criteria, such as accessibility to biomass, the railway/road transportation network, water bodies, and workforce availability. The development of optimization and simulation models is also proposed. The results of the models will be used to determine (1) the number, location, and size of the biofuel facilities, and (2) the amounts of biomass to be transported between the harvesting areas and the biofuel facilities over a 20-year timeframe. The multi-criteria objective is to minimize the weighted sum of the delivered feedstock cost, energy consumption, and GHG emissions simultaneously. Finally, a series of sensitivity analyses will be conducted to identify the sensitivity of the decisions, such as the optimal site selected for the biofuel facility, to changes in influential parameters, such as biomass availability and transportation fuel price.
Intellectual Merit: The proposed research will facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization in the renewable biofuel industry. The GIS-based facility location analysis considers a series of factors that have not been considered simultaneously in previous research. Location analysis is critical to the financial success of biofuel production. The modeling of woody biomass supply chains using both optimization and simulation, combined with the GIS-based approach as a precursor, has not been done to date. The optimization and simulation models can help ensure the economic and environmental viability and sustainability of the entire biofuel supply chain at both the strategic design level and the operational planning level. Broader Impacts: The proposed models for biorefineries can be applied to other types of manufacturing or processing operations using biomass, because the biomass feedstock supply chain is similar, if not identical, for biorefineries, biomass-fired or co-fired power plants, and torrefaction/pelletization operations. Additionally, the results of this research will continue to be disseminated internationally through publications in journals such as Biomass and Bioenergy and Renewable Energy, and through presentations at conferences such as the 2011 Industrial Engineering Research Conference. For example, part of the research work related to biofuel facility identification has been published: Zhang, Johnson and Sutherland [2011] (see Appendix A). There will also be opportunities for the Michigan Tech campus community to learn about the research through the Sustainable Future Institute.
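The weighted-sum multi-criteria objective described above can be illustrated at toy scale by exhaustive enumeration. Everything below is invented (site names, per-site cost/energy/GHG scores, and the weights); the actual work proposes MILP optimization and simulation rather than brute force:

```python
from itertools import combinations

# Invented candidate sites with per-site delivered-feedstock cost,
# energy consumption, and GHG emission scores (arbitrary units).
sites = {
    "A": {"cost": 10.0, "energy": 4.0, "ghg": 2.0},
    "B": {"cost":  8.0, "energy": 6.0, "ghg": 3.0},
    "C": {"cost": 12.0, "energy": 3.0, "ghg": 1.5},
    "D": {"cost":  9.0, "energy": 5.0, "ghg": 2.5},
}
weights = {"cost": 0.5, "energy": 0.3, "ghg": 0.2}  # assumed decision weights

def objective(chosen):
    """Weighted sum of all criteria over the chosen facilities."""
    return sum(weights[k] * sites[s][k] for s in chosen for k in weights)

# Choose the best pair of facility locations by enumeration.
best = min(combinations(sites, 2), key=objective)
```

Changing the weights and re-running is the toy analogue of the proposed sensitivity analyses: the optimal sites shift as the relative importance of cost, energy, and emissions changes.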


Four papers, written in collaboration with the author’s graduate school advisor, are presented. In the first paper, uniform and non-uniform Berry-Esseen (BE) bounds on the convergence to normality of a general class of nonlinear statistics are provided; novel applications to specific statistics, including the non-central Student’s, Pearson’s, and the non-central Hotelling’s, are also stated. In the second paper, a BE bound on the rate of convergence of the F-statistic used in testing hypotheses from a general linear model is given. The third paper considers the asymptotic relative efficiency (ARE) between the Pearson, Spearman, and Kendall correlation statistics; conditions sufficient to ensure that the Spearman and Kendall statistics are equally (asymptotically) efficient are provided, and several models are considered which illustrate the use of such conditions. Lastly, the fourth paper proves that, in the bivariate normal model, the ARE between any of these correlation statistics possesses certain monotonicity properties; quadratic lower and upper bounds on the ARE are stated as direct applications of such monotonicity patterns.
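For context, the classical i.i.d. Berry-Esseen bound that the first two papers generalize can be stated as follows (this is the standard textbook form, not the paper's sharper bounds for nonlinear statistics):

```latex
% Classical Berry-Esseen bound: X_1, ..., X_n i.i.d. with
% E X_i = 0,  E X_i^2 = \sigma^2,  \rho = E|X_i|^3 < \infty.
\sup_{x \in \mathbb{R}}
\left| \Pr\!\left( \frac{X_1 + \cdots + X_n}{\sigma\sqrt{n}} \le x \right)
       - \Phi(x) \right|
\le \frac{C\,\rho}{\sigma^{3}\sqrt{n}}
```

Here \(\Phi\) is the standard normal distribution function and \(C\) is an absolute constant; the papers' contribution is extending bounds of this \(n^{-1/2}\) type to general classes of nonlinear statistics such as the F-statistic and correlation coefficients.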


The Pacaya volcanic complex is part of the Central American volcanic arc, which is associated with the subduction of the Cocos tectonic plate under the Caribbean plate. Located 30 km south of Guatemala City, Pacaya sits on the southern rim of the Amatitlan Caldera. It is the largest post-caldera volcano and has been one of Central America's most active volcanoes over the last 500 years. Between 400 and 2000 years B.P., Pacaya experienced a major collapse, which resulted in the formation of a horseshoe-shaped scarp that is still visible. In recent years, several smaller collapses (in 1961 and 2010) have been associated with the activity of the volcano, affecting its northwestern flanks, and are likely to have been induced by local and regional stress changes. The similar orientation of dry and volcanic fissures and the distribution of new vents likely reflect reactivation of the pre-existing stress configuration responsible for the old collapse. This paper presents the first stability analysis of the Pacaya volcanic flank. The inputs for the geological and geotechnical models were defined based on stratigraphical, lithological and structural data and on material properties obtained from field surveys and laboratory tests. According to their mechanical characteristics, three lithotechnical units were defined: Lava, Lava-Breccia and Breccia-Lava. The Hoek-Brown failure criterion was applied to each lithotechnical unit, and the rock mass friction angle, apparent cohesion, and strength and deformation characteristics were computed over a specified stress range. The stability of the volcano was then evaluated by two-dimensional analyses performed with the Limit Equilibrium Method (LEM, Rocscience) and the Finite Element Method (FEM, Phase2 7.0). The stability analysis focused mainly on the modern Pacaya volcano built inside the collapse amphitheatre of "Old Pacaya". The volcanic instability was assessed based on the variability of the safety factor using deterministic, sensitivity and probabilistic analyses, considering gravitational instability and the effects of external forces such as magma pressure and seismicity as potential triggering mechanisms of lateral collapse. The preliminary results provide two insights: first, the least stable sector is the south-western flank of the volcano; second, the lowest safety factor value suggests that the edifice is stable under gravity alone, and that an external triggering mechanism represents the likely destabilizing factor.
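The notion of a safety factor used above can be illustrated with the simplest limit-equilibrium case: a dry planar failure surface with Mohr-Coulomb strength. This is a drastic simplification of the paper's 2-D LEM/FEM models, and all input values below are invented rather than Pacaya's actual rock-mass properties:

```python
import math

def factor_of_safety(cohesion_kpa, friction_deg, unit_weight_knm3,
                     depth_m, slope_deg):
    """Planar, dry Mohr-Coulomb slope: FS = resisting / driving shear stress."""
    beta = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    # Normal and shear stresses on a plane at the given depth (kPa).
    normal = unit_weight_knm3 * depth_m * math.cos(beta) ** 2
    shear = unit_weight_knm3 * depth_m * math.sin(beta) * math.cos(beta)
    return (cohesion_kpa + normal * math.tan(phi)) / shear

# Illustrative values only (not measured Pacaya properties).
fs = factor_of_safety(cohesion_kpa=150.0, friction_deg=35.0,
                      unit_weight_knm3=25.0, depth_m=50.0, slope_deg=30.0)
# fs > 1 means the slope is stable under gravity alone; external loads
# (magma pressure, seismic acceleration) enter as extra driving terms
# that reduce fs toward 1.
```

The study's conclusion maps onto this picture: gravity alone leaves FS above unity, so collapse would require an external trigger that adds driving stress.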


Satellite measurement validations, climate models, atmospheric radiative transfer models and cloud models all depend on accurate measurements of cloud particle size distributions, number densities, spatial distributions, and other parameters relevant to cloud microphysical processes. Many airborne instruments designed to measure size distributions and concentrations of cloud particles have large uncertainties in measuring number densities and size distributions of small ice crystals. HOLODEC (Holographic Detector for Clouds) is a new instrument that avoids many of these uncertainties and makes possible measurements that other probes have never made. These advantages are inherent to the holographic method. In this dissertation, I describe HOLODEC, its in-situ measurements of cloud particles, and the results of its test flights. I present a hologram reconstruction algorithm whose sample spacing does not vary with reconstruction distance. This algorithm accurately reconstructs the field at all distances inside a typical holographic measurement volume, as proven by comparison with analytical solutions to the Huygens-Fresnel diffraction integral; it is fast to compute and has diffraction-limited resolution. I further describe an algorithm that can find the position along the optical axis of small particles as well as of large, complex-shaped particles, and an implementation of these algorithms as an efficient, robust, automated program that allows holograms to be processed on a computer cluster in a reasonable time. I show size distributions and number densities of cloud particles and show that they are within the uncertainty of independent measurements made with another method. This proves the feasibility of a cloud particle instrument with advantages over current standard instruments: a unique ability to detect shattered particles using three-dimensional positions, a sample volume that does not vary with particle size or airspeed, and the ability to yield two-dimensional particle profiles from the same measurements.
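One well-known reconstruction approach with the property mentioned above (output sample spacing equal to the input pitch at every distance) is angular-spectrum propagation. The sketch below is a generic minimal version of that technique, not the dissertation's algorithm; the wavelength, pixel pitch, and aperture are illustrative:

```python
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a complex 2-D field a distance z (all lengths in metres).

    The lateral sample spacing of the output equals `pitch`, independent
    of z, because propagation is a pure filter in the frequency domain.
    """
    n, m = field.shape
    fx = np.fft.fftfreq(m, d=pitch)
    fy = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative square aperture on a 128x128 grid, 3 um pitch, 532 nm light.
aperture = np.zeros((128, 128), dtype=complex)
aperture[56:72, 56:72] = 1.0
out = angular_spectrum(aperture, wavelength=532e-9, pitch=3e-6, z=1e-3)
```

Because the transfer function has unit modulus over the propagating band, energy is conserved and the grid never resamples, which is the practical reason this family of methods suits automated processing of many holograms at many depths.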


Gas sensors are widely used in important areas including industrial control, environmental monitoring, counter-terrorism and chemical production. Micro-fabrication offers a promising way to achieve sensitive and inexpensive gas sensors, and over the years various MEMS gas sensors have been investigated and fabricated. One significant type of MEMS gas sensor is based on mass-change detection and integration with a specific polymer. This dissertation aims to contribute to the design and fabrication of MEMS resonant mass sensors with capacitive actuation and sensing that offer improved sensitivity. To accomplish this goal, the research has several objectives: (1) define an effective measure for evaluating the sensitivity of resonant mass devices; (2) model the effects of air damping on microcantilevers and validate the models using a laser measurement system; (3) develop design guidelines for improving sensitivity in the presence of air damping; and (4) characterize the degree of uncertainty in performance arising from fabrication variation for one or more process sequences, and establish design guidelines for improved robustness. Work has been completed toward each of these objectives. An evaluation measure has been developed and compared to an RMS-based measure. Analytic models of air damping for parallel plates that include holes have been compared with a COMSOL model, and the models have been used to identify cantilever design parameters that maximize sensitivity. Additional designs have been modeled with COMSOL, and an analytical model for fixed-free cantilever geometries with holes has been developed. Two process flows have been implemented and compared, a number of cantilever designs have been fabricated, and the resulting process variability has been evaluated and characterized.
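The basic sensing principle behind resonant mass detection is a first-order frequency shift: adding a small mass dm to a resonator of effective mass m lowers the resonance by roughly f0·dm/(2m). A minimal sketch with invented device parameters (not values from the dissertation's fabricated designs):

```python
import math

def freq_shift(f0_hz, m_eff_kg, dm_kg):
    """First-order resonance shift for a small added mass: df = -f0*dm/(2m)."""
    return -f0_hz * dm_kg / (2 * m_eff_kg)

# Illustrative cantilever: 250 kHz resonance, 2 ng effective mass (assumed).
f0 = 250e3
m_eff = 2e-12
dm = 1e-15          # 1 pg of analyte sorbed by the polymer coating
df = freq_shift(f0, m_eff, dm)

# Mass sensitivity in Hz per kg of added mass. Air damping does not change
# this slope much, but it lowers the quality factor Q and hence the
# smallest resolvable df -- which is why the damping models matter.
sensitivity = abs(df) / dm
```

For these assumed numbers the 1 pg load shifts the resonance by about 62.5 Hz, illustrating why minimizing damping (e.g. with perforation holes) directly improves the detectable mass.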


Simulations of forest stand dynamics in modelling frameworks such as the Forest Vegetation Simulator (FVS) are diameter driven, so the diameter or basal area increment model needs special attention. This dissertation critically evaluates diameter and basal area increment models and modelling approaches in the context of the Great Lakes region of the United States and Canada. A set of related studies critically evaluates the sub-model for change in individual tree basal diameter used in FVS, a dominant forestry model in the Great Lakes region. Various historical implementations of the STEMS (Stand and Tree Evaluation and Modeling System) family of diameter increment models, including the current public release of the Lake States variant of FVS (LS-FVS), were tested for the 30 most common tree species using data from the Michigan Forest Inventory and Analysis (FIA) program. The results showed that the current public release of the LS-FVS diameter increment model over-predicts 10-year diameter increment by 17% on average. The study also affirms that a simple adjustment factor as a function of a single predictor, dbh (diameter at breast height), as used in past versions, provides an inadequate correction of model prediction bias. In order to re-engineer the basal diameter increment model, the historical, conceptual and philosophical differences among the individual tree increment model families and their modelling approaches were analyzed and discussed. Two underlying conceptual approaches to diameter or basal area increment modelling have often been used: the potential-modifier (POTMOD) and composite (COMP) approaches, exemplified by the STEMS/TWIGS and Prognosis models, respectively. It is argued that both approaches essentially use a similar base function and that neither is conceptually different from a biological perspective, even though their model forms look different. No matter which modelling approach is used, the base function is the foundation of an increment model. Two base functions, gamma and Box-Lucas, were identified as candidates for forestry applications. A comparative analysis of empirical fits showed that the quality of fit is essentially similar and that both are sufficiently detailed and flexible for forestry applications. The choice between the two base functions is a matter of personal preference; however, the gamma base function may be preferred over the Box-Lucas because it fits periodic increment data in both linear and nonlinear composite model forms. Finally, the utility of site index as a predictor variable is criticized: it has been widely used in models for complex, mixed-species forest stands even though it is not well suited for this purpose. An alternative to site index in an increment model was explored, comparing site index with a combination of climate variables and Forest Ecosystem Classification (FEC) ecosites, using data from the Province of Ontario, Canada. The results showed that a combination of climate and FEC ecosite variables can replace site index in the diameter increment model.
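A gamma-type base function of the sort discussed above is commonly written as increment = a·d^b·exp(c·d), which becomes linear in its parameters after a log transform, one reason it fits in both linear and nonlinear model forms. The sketch below uses invented diameters and increments, not the FIA data:

```python
import numpy as np

# Invented tree diameters (cm) and 10-year diameter increments generated
# from a gamma-type base function with multiplicative lognormal noise.
d = np.array([5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
a_true, b_true, c_true = 0.8, 0.9, -0.05
rng = np.random.default_rng(3)
incr = a_true * d**b_true * np.exp(c_true * d) * rng.lognormal(0, 0.05, d.size)

# Log transform: ln(incr) = ln(a) + b*ln(d) + c*d, so OLS recovers all
# three parameters of the gamma base function.
X = np.column_stack([np.ones_like(d), np.log(d), d])
(ln_a, b, c), *_ = np.linalg.lstsq(X, np.log(incr), rcond=None)
a = np.exp(ln_a)
```

The same function can instead be fitted by nonlinear least squares inside a composite model with additional modifier terms, which is the flexibility the dissertation credits to the gamma form.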


A range of societal issues has been caused by fossil fuel consumption in the transportation sector in the United States (U.S.), including health-related air pollution, climate change, dependence on imported oil, and other oil-related national security concerns. Biofuels produced from various lignocellulosic biomass types, such as wood, forest residues, and agricultural residues, have the potential to replace a substantial portion of total fossil fuel consumption. This research focuses on locating biofuel facilities and designing the biofuel supply chain to minimize the overall cost. For this purpose, an integrated methodology was proposed that combines GIS technology with simulation and optimization modeling. As a precursor to the simulation and optimization modeling, the GIS-based methodology was used to preselect potential biofuel facility locations for biofuel production from forest biomass. Candidate locations were selected based on a set of evaluation criteria, including county boundaries, the railroad transportation network, the state/federal road transportation network, water bodies (rivers, lakes, etc.), city and village dispersion, population census data, biomass production, and no co-location with co-fired power plants. The simulation and optimization models were built around key supply activities, including biomass harvesting/forwarding, transportation and storage. Onsite storage was built to cover the spring breakup period, when road restrictions were in place and truck transportation on certain roads was limited. Both models were evaluated using multiple performance indicators, including cost (the delivered feedstock cost plus inventory holding cost), energy consumption, and GHG emissions. The impacts of energy consumption and GHG emissions were expressed in monetary terms for consistency with cost. Compared with the optimization model, the simulation model gives a more dynamic view of a 20-year operation by considering the impacts of building inventory at the biorefinery to address the limited availability of biomass feedstock during the spring breakup period. The number of trucks required per day was estimated, and the inventory level was tracked year-round. Through the exchange of information across the different procedures (harvesting, transportation, and biomass feedstock processing), a smooth flow of biomass from harvesting areas to a biofuel facility was implemented. The optimization model was developed to address issues related to locating multiple biofuel facilities simultaneously. The size of each potential biofuel facility was bounded between 30 and 50 MGY. The optimization model is a static, Mathematical Programming Language (MPL)-based application that allows for sensitivity analysis by changing inputs to evaluate different scenarios. It was found that annual biofuel demand and biomass availability affect the optimal biofuel facility locations and sizes.
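The spring-breakup inventory problem described above can be illustrated with a toy weekly simulation: trucking stops for several weeks, so the biorefinery must build onsite inventory beforehand. All quantities below (weekly demand, delivery rates, breakup window) are invented, and the real simulation model tracks trucks, costs, and emissions as well:

```python
# Toy weekly feedstock-inventory simulation around a spring breakup window.
WEEKS = 52
DEMAND = 1000.0            # green tons of feedstock consumed per week (assumed)
BREAKUP = range(12, 20)    # weeks 12-19: road restrictions, no trucking (assumed)

def min_inventory(delivery_per_week):
    """Lowest onsite inventory level seen over the year for a delivery rate."""
    inventory, lowest = 0.0, float("inf")
    for week in range(WEEKS):
        if week not in BREAKUP:
            inventory += delivery_per_week   # trucks run on normal weeks only
        inventory -= DEMAND                  # the plant consumes every week
        lowest = min(lowest, inventory)
    return lowest

# 1250 t/week starves the plant during breakup, while 1700 t/week builds
# enough inventory in the first 12 weeks to ride out the stoppage.
shortfall = min_inventory(1250.0) < 0
covered = min_inventory(1700.0) >= 0
```

Sizing the pre-breakup delivery rate so the minimum inventory stays non-negative is exactly the trade-off the simulation model quantifies, with inventory holding cost pushing against stockout risk.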