971 results for Mandibular advancement
Abstract:
In the present study, an analytical model is presented to describe the transient temperature distribution and the advancement of the thermal front generated by the reinjection of heat-depleted water into a heterogeneous geothermal reservoir. The one-dimensional heat transport equation in porous media, with advection and longitudinal heat conduction, is solved analytically using the Laplace transform technique in a semi-infinite medium. The heterogeneity of the porous medium is expressed by the spatial variation of the flow velocity and the longitudinal effective thermal conductivity of the medium. A simpler solution neglecting longitudinal conduction is also derived for situations in which its contribution to transient heat transport in the porous medium is negligible. A solution for a homogeneous aquifer with constant values of the rock and fluid parameters is also derived in order to compare its results with those of the heterogeneous case. The effect of some of the parameters involved on the transient heat transport phenomenon is assessed by observing the variation of the results with different magnitudes of those parameters. Results show that the heterogeneity of the medium, the flow velocity, and the longitudinal conductivity have a strong influence on the transient temperature distribution, whereas porosity has a negligible effect. (C) 2013 Elsevier Inc. All rights reserved.
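For the homogeneous case with constant parameters, the governing equation reduces to the classical one-dimensional advection-conduction problem, whose Laplace-transform solution takes the well-known Ogata-Banks form. The sketch below evaluates that closed form; the parameter values and boundary conditions (constant injection temperature at x = 0, uniform initial reservoir temperature) are illustrative assumptions, not values from the paper.

```python
from math import erfc, exp, sqrt

def temperature(x, t, v=1e-6, D=1e-6, T0=80.0, Tin=30.0):
    """Ogata-Banks-type solution of dT/dt + v dT/dx = D d2T/dx2
    with T(x, 0) = T0 (reservoir) and T(0, t) = Tin (reinjected water).
    v: thermal front velocity [m/s]; D: effective thermal diffusivity [m2/s].
    All default values are purely illustrative."""
    if t <= 0:
        return T0
    s = 2.0 * sqrt(D * t)
    term = erfc((x - v * t) / s)
    # exp(v*x/D) can overflow for large x even though the product is tiny;
    # guard the exponential and let the underflowing erfc zero the tail.
    arg = v * x / D
    tail = exp(arg) * erfc((x + v * t) / s) if arg < 700 else 0.0
    return T0 + 0.5 * (Tin - T0) * (term + tail)
```

At the injection face the solution returns Tin exactly, and far ahead of the thermal front it relaxes to the undisturbed reservoir temperature T0.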
Abstract:
We present observations on the diurnal and seasonal variation of the mixing ratio and δ13C of air CO2 from an urban station, Bangalore (BLR), India, monitored between October 2008 and December 2011. On a diurnal scale, a higher mixing ratio with depleted δ13C of air CO2 was found for samples collected during early morning compared to samples collected during late afternoon. On a seasonal scale, the mixing ratio was found to be higher during the dry summer months (April-May) and lower during the southwest monsoon months (June-July). The maximum enrichment in δ13C of air CO2 (-8.04 ± 0.02‰) was seen in October; δ13C then started depleting, and the maximum depletion (-9.31 ± 0.07‰) was observed during the dry summer months. Immediately afterwards, an increasing trend in δ13C was observed, coinciding with the advancement of the southwest monsoon months, and the maximum enrichment was seen again in October. Although a similar pattern of seasonal variation was observed for the three consecutive years, the dry summer months of 2011 showed a distinctly lower amplitude in both the mixing ratio and δ13C of air CO2 compared to the dry summer months of 2009 and 2010. This was explained by reduced biomass burning and increased productivity associated with a prominent La Niña condition. When compared with observations from the nearest coastal and open-ocean stations, Cabo de Rama (CRI) and Seychelles (SEY), BLR, being located within an urban region, captured a higher amplitude of seasonal variation. The average δ13C value of the end-member source CO2 was identified based on both diurnal- and seasonal-scale variation. The δ13C value of source CO2 (-24.9 ± 3‰) determined from the diurnal variation was found to differ drastically from the source value (-14.6 ± 0.7‰) identified from the seasonal-scale variation.
The source CO2 identified from the diurnal variation incorporated both early morning and late afternoon samples, whereas the source CO2 identified from the seasonal variation included only afternoon samples. Thus, it is evident from the study that sampling time is one of the important factors in characterizing the composition of end-member source CO2 for a particular station. The difference in the δ13C value of source CO2 obtained from the diurnal and seasonal variation might be due to a possible contribution from the cement industry, along with fossil fuel and biomass burning as the predominant sources for the station, together with differences in the prevailing meteorological conditions.
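The abstract does not name the method used to identify the end-member source δ13C, but such values are conventionally obtained from a Keeling plot: regressing observed δ13C against the inverse CO2 mixing ratio and reading the source signature off the intercept. A minimal sketch of that regression, with purely synthetic numbers:

```python
def keeling_intercept(co2_ppm, delta13c):
    """Ordinary least-squares fit of delta13C against 1/[CO2].
    On a two-component mixing line, the intercept at 1/[CO2] -> 0
    estimates the delta13C of the end-member source CO2."""
    x = [1.0 / c for c in co2_ppm]
    n = len(x)
    mx = sum(x) / n
    my = sum(delta13c) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, delta13c))
    slope = sxy / sxx
    return my - slope * mx

# Synthetic mixing line: background air (400 ppm at -8.0 per mil)
# diluted by source CO2 at -25 per mil (illustrative numbers only).
```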
Abstract:
This work sets forth a `hybrid' discretization scheme utilizing bivariate simplex splines as kernels in a polynomial reproducing scheme constructed over a conventional Finite Element Method (FEM)-like domain discretization based on Delaunay triangulation. Careful construction of the simplex spline knotset ensures the success of the polynomial reproduction procedure at all points in the domain of interest, a significant advancement over its precursor, the DMS-FEM. The shape functions in the proposed method inherit the global continuity (C^(p-1)) and local supports of the simplex splines of degree p. In the proposed scheme, the triangles comprising the domain discretization also serve as background cells for numerical integration; these are nearly aligned with the supports of the shape functions (and their intersections), considerably ameliorating an oft-cited source of inaccuracy in the numerical integration of mesh-free (MF) schemes. Numerical experiments show that the proposed method requires lower-order quadrature rules for accurate evaluation of the integrals in the Galerkin weak form. Numerical demonstrations of optimal convergence rates for a few test cases are given, and the method is also applied to compute crack-tip fields in a gradient-enhanced elasticity model.
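The simplex-spline construction itself is involved, but the polynomial reproduction property the scheme guarantees can be illustrated in its simplest (degree-1) form: barycentric shape functions on a single triangle form a partition of unity and reproduce linear polynomials exactly. This toy analogue is for illustration only and is not the DMS-FEM or the proposed method:

```python
def barycentric_shape_functions(tri, p):
    """Linear shape functions N_i(p) on a triangle tri = [(x, y)] * 3.
    They satisfy sum_i N_i = 1 (partition of unity) and
    sum_i N_i * v_i = p (exact reproduction of linear polynomials)."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    x, y = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    l1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
    l2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    return (l1, l2, 1.0 - l1 - l2)
```

Higher-degree simplex splines extend this reproduction to degree-p polynomials while adding C^(p-1) global continuity.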
Abstract:
Background: Computational protein design is a rapidly maturing field within structural biology, with the goal of designing proteins with custom structures and functions. Such proteins could find widespread medical and industrial applications. Here, we have adapted algorithms from the Rosetta software suite to design much larger proteins, based on ideal geometric and topological criteria. Furthermore, we have developed techniques to incorporate symmetry into designed structures. For our first design attempt, we targeted the (α/β)8 TIM barrel scaffold. We gained novel insights into TIM barrel folding mechanisms from studying natural TIM barrel structures and from analyzing previous TIM barrel design attempts. Methods: Computational protein design and analysis were performed using the Rosetta software suite and custom scripts. Genes encoding all designed proteins were synthesized and cloned into the pET20-b vector. Standard circular dichroism and gel chromatographic experiments were performed to determine protein biophysical characteristics. 1D NMR and 2D HSQC experiments were performed to determine protein structural characteristics. Results: Extensive protein design simulations coupled with ab initio modeling yielded several all-atom models of ideal, 4-fold symmetric TIM barrels. Four such models were experimentally characterized. The best designed structure (Symmetrin-1) contained a polar, histidine-rich pore forming an extensive hydrogen-bonding network. Symmetrin-1 was easily expressed and readily soluble. It showed circular dichroism spectra characteristic of well-folded α/β proteins. Temperature melting experiments revealed cooperative and reversible unfolding, with a Tm of 44 °C and a Gibbs free energy of unfolding (ΔG°) of 8.0 kJ/mol. Urea denaturation experiments confirmed these observations, revealing a Cm of 1.6 M and a ΔG° of 8.3 kJ/mol.
Symmetrin-1 adopted a monomeric conformation, with an apparent molecular weight of 32.12 kDa, and displayed well-resolved 1D NMR spectra. However, the HSQC spectrum revealed somewhat molten characteristics. Conclusions: Despite the detection of molten characteristics, the creation of a soluble, cooperatively folding protein represents an advancement over previous attempts at TIM barrel design. Strategies to further improve Symmetrin-1 are elaborated. Our techniques may be used to create other large, internally symmetric proteins.
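For a two-state folder, the reported unfolding free energies translate directly into an equilibrium folded fraction. The sketch below assumes the quoted ΔG° refers to roughly 298 K (the abstract does not state the reference temperature):

```python
from math import exp

R = 8.314  # molar gas constant, J / (mol K)

def fraction_folded(dG_unfold_J, T=298.15):
    """Two-state folding model: K_unfold = exp(-dG_unfold / RT),
    so the folded fraction at equilibrium is 1 / (1 + K_unfold).
    dG_unfold_J: Gibbs free energy of unfolding in J/mol."""
    K_unfold = exp(-dG_unfold_J / (R * T))
    return 1.0 / (1.0 + K_unfold)
```

With ΔG° around 8 kJ/mol, roughly 96% of the molecules would be folded at room temperature, consistent with a marginally stable but well-populated folded state.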
Abstract:
Clinical microscopy is a versatile and robust tool used in the diagnosis of a plethora of diseases. However, for various reasons it remains inaccessible in resource-limited settings. In this paper, we present an automated and cost-effective alternative to microscopy for use in clinical diagnostics. Using custom optics and microfluidics, we demonstrate a field-portable imaging flow cytometry system capable of imaging 586 cells per second. We demonstrate the clinical relevance of the proposed system by differentiating between suspensions of healthy and sphered RBCs based on high-throughput morphometric analysis. The instrument presented here is a major advancement in the domain of field-portable diagnostics, as it enables fast and robust quantitative diagnostic testing at the point of care.
Abstract:
A new finite difference method for the discretization of the incompressible Navier-Stokes equations is presented. The scheme is constructed on a staggered-mesh grid system. The convection terms are discretized with a fifth-order-accurate upwind compact difference approximation; the viscous terms with a sixth-order symmetric compact difference approximation; and the continuity equation and the pressure gradient in the momentum equations with a fourth-order difference approximation on a cell-centered mesh. Time advancement uses a three-stage Runge-Kutta method. The Poisson equation for the pressure is solved with preconditioning. Accuracy analysis shows that the new method has high resolving efficiency. Validation of the method by computation of Taylor's vortex array is presented.
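The abstract does not give the Runge-Kutta coefficients; a common three-stage choice in compact-difference flow solvers is the Shu-Osher SSP scheme, sketched here for a generic ODE right-hand side (an assumption for illustration, not necessarily the authors' scheme):

```python
def ssprk3_step(f, t, y, dt):
    """One step of the three-stage Shu-Osher SSP Runge-Kutta scheme,
    third-order accurate, written as convex combinations of Euler steps.
    f(t, y) is the semi-discrete right-hand side dy/dt."""
    k1 = y + dt * f(t, y)                                   # Euler predictor
    k2 = 0.75 * y + 0.25 * (k1 + dt * f(t + dt, k1))        # second stage
    return y / 3.0 + 2.0 / 3.0 * (k2 + dt * f(t + 0.5 * dt, k2))
```

Applied to dy/dt = -y, a thousand steps of size 0.001 reproduce exp(-1) to within the scheme's third-order global error.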
Abstract:
Gene microarray technology is highly effective in screening for differential gene expression and has hence become a popular tool in the molecular investigation of cancer. When applied to tumours, molecular characteristics may be correlated with clinical features such as response to chemotherapy. Exploitation of the huge amount of data generated by microarrays is difficult, however, and constitutes a major challenge in the advancement of this methodology. Independent component analysis (ICA), a modern statistical method, allows us to better understand data in such complex and noisy measurement environments. The technique has the potential to significantly increase the quality of the resulting data and improve the biological validity of subsequent analysis. We performed microarray experiments on 31 postmenopausal endometrial biopsies, comprising 11 benign and 20 malignant samples. We compared ICA to the established methods of principal component analysis (PCA), Cyber-T, and SAM. We show that ICA generated patterns that clearly characterized the malignant samples studied, in contrast to PCA. Moreover, ICA improved the biological validity of the genes identified as differentially expressed in endometrial carcinoma, compared to those found by Cyber-T and SAM. In particular, several genes involved in lipid metabolism that are differentially expressed in endometrial carcinoma were only found using this method. This report highlights the potential of ICA in the analysis of microarray data.
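A full ICA decomposition is beyond a short sketch, but the PCA baseline against which ICA is compared can be illustrated: power iteration on the sample covariance recovers the leading principal component, the direction of maximal variance. A minimal two-dimensional sketch (illustrative only, not the authors' pipeline):

```python
def first_principal_component(data, iters=100):
    """Leading eigenvector of the sample covariance via power iteration.
    data: list of (x, y) samples; kept two-dimensional for brevity."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    # entries of the 2x2 covariance matrix
    cxx = sum((p[0] - mx) ** 2 for p in data) / n
    cyy = sum((p[1] - my) ** 2 for p in data) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in data) / n
    vx, vy = 1.0, 1.0
    for _ in range(iters):
        wx = cxx * vx + cxy * vy
        wy = cxy * vx + cyy * vy
        norm = (wx * wx + wy * wy) ** 0.5
        vx, vy = wx / norm, wy / norm
    return vx, vy
```

PCA finds orthogonal directions of maximal variance; ICA instead seeks statistically independent components, which is why it can separate overlapping expression signatures that PCA mixes.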
Abstract:
This paper examines the performance of university students on formal reasoning tasks with and without social interaction. The sample (N = 83) was made up of students from two universities (UNMDP and UAA), who were assigned to two conditions: experimental with interaction (n = 40) and control without interaction (n = 43). The instrument consisted of a series of items from the Progressive Matrices Test, general and advanced scales. All participants had to solve these tasks over several weeks: first individually (pretest), then collectively, face-to-face and via chat (dyads, experimental condition) or individually (control group), and finally individually (posttest). The participants' interactions were recorded through video and chat logs. Performance was evaluated according to the assigned condition, individual or group, and for the latter the quality of the interactions was established by inter-rater assessment. The results corroborate the superiority of dyads over individual performance; repeating in the posttest phase those items solved incorrectly in the pretest, together with the addition of new exercises, revealed two zones of advance, or internalization of progress. As for the quality of the interactions, they are consistent with these results. These findings raise new questions about sociocognitive conflict and its role in group and individual advancement.
Abstract:
The Society for the Advancement of Material and Process Engineering (SAMPE) held two international conferences in Toronto, Canada, on October 20-22, 1992: the 24th International SAMPE Technical Conference (24th Int. SAMPE Techn. Conf.) and the 3rd International SAMPE Metals and Metals Processing Conference (3rd Int. SAMPE Metals and Metals Processing Conf.). The two conferences were held concurrently, with the plenary sessions taking place in a single hall. The theme of the former was "Advanced materials meet the economic challenge"; the theme of the latter was "New advances in synthesis and processing". Although the two conferences were held at a time when the economies of the developed Western countries were in a slump
Resumo:
Scientific research revolves around the production, analysis, storage, management, and re-use of data. Data sharing offers important benefits for scientific progress and the advancement of knowledge. However, several limitations and barriers to the general adoption of data sharing are still in place. Probably the most important challenge is that data sharing is not yet common among scholars and is not yet seen as a regular activity among scientists, although important efforts are being invested in promoting it. In addition, there is a relatively low commitment among scholars to citing data. The most important problems and challenges regarding data metrics are closely tied to these more general problems of data sharing. The development of data metrics depends on the growth of data sharing practices; after all, it is nothing more than the registration of researchers' behaviour. At the same time, the availability of proper metrics can help researchers make their data work more visible. This may subsequently act as an incentive for more data sharing, setting a virtuous circle in motion. This report seeks to further explore the possibilities of metrics for datasets (i.e. the creation of reliable data metrics) and an effective reward system that aligns the interests of the main stakeholders involved in the process. The report reviews the current literature on data sharing and data metrics. It presents interviews with the main stakeholders on data sharing and data metrics. It also analyses the existing repositories and tools in the field of data sharing that have special relevance for the promotion and development of data metrics. On the basis of these three pillars, the report presents a number of solutions and necessary developments, as well as a set of recommendations regarding data metrics.
The most important recommendations include the general adoption of data sharing and data publication among scholars; the development of a reward system for scientists that includes data metrics; reducing the costs of data publication; reducing existing negative cultural perceptions of researchers regarding data publication; developing standards for preservation, publication, identification and citation of datasets; more coordination of data repository initiatives; and further development of interoperability protocols across different actors.
Abstract:
Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulation. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions to solve problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from current transistor-based technology to a near-future trapped-ion-based technology.
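The classical reference computation such a quantum simulator targets, the ground-state energy of an electronic Hamiltonian, can be sketched for the smallest nontrivial case: a symmetric 2x2 Hamiltonian in some two-state basis. The matrix below is a hypothetical toy example, not one from the paper:

```python
from math import sqrt

def ground_state_energy_2x2(h11, h22, h12):
    """Lowest eigenvalue of the symmetric 2x2 Hamiltonian
    [[h11, h12], [h12, h22]], in closed form:
    E_0 = (h11 + h22)/2 - sqrt(((h11 - h22)/2)^2 + h12^2)."""
    mean = 0.5 * (h11 + h22)
    gap = sqrt((0.5 * (h11 - h22)) ** 2 + h12 ** 2)
    return mean - gap
```

Classical cost for this step grows exponentially with system size (the matrix dimension doubles with each added qubit-like degree of freedom), which is precisely the regime where a trapped-ion simulator is expected to help.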
Abstract:
The Alliance for Coastal Technologies (ACT) convened a workshop on "Wave Sensor Technologies" in St. Petersburg, Florida on March 7-9, 2007, hosted by the University of South Florida (USF) College of Marine Science, an ACT partner institution. The primary objectives of this workshop were to: 1) define the present state of wave measurement technologies, 2) identify the major impediments to their advancement, and 3) make strategic recommendations for future development and for the steps necessary to integrate wave measurement sensors into operational coastal ocean observing systems. The participants came from various sectors, including research scientists, technology developers and industry providers, and technology users such as operational coastal managers and coastal decision makers. Waves are consistently ranked as a critical variable for numerous coastal issues, from maritime transportation to beach erosion to habitat restoration. For the purposes of this workshop, the participants focused on measuring "wind waves" (i.e., waves on the water surface, generated by the wind, restored by gravity, and having periods between approximately 3 and 30 seconds), although it was recognized that a wide range of both forced and free waves exist on and in the oceans. Also, whereas the workshop emphasized the nearshore coastal component of wave measurements, the participants also stressed the importance of open-ocean surface wave measurement. Wave sensor technologies presently available for both environments include bottom-mounted pressure gauges, surface-following buoys, wave staffs, acoustic Doppler current profilers, and shore-based remote sensing radar instruments. One of the recurring themes of the workshop discussions was the dichotomous nature of wave data users: open-ocean wave data users and nearshore/coastal wave data users have different requirements.
Generally, user requirements increase in both spatial/temporal resolution and precision as one moves closer to shore. Most ocean-going mariners are adequately served by measurements of wave period, wave height, and a general wave direction. However, most coastal and nearshore users require at least the first five Fourier parameters (the "First 5"): wave energy and the first four directional Fourier coefficients. Furthermore, wave research scientists would like sensors capable of providing measurements beyond the first four Fourier coefficients. It was debated whether high-precision wave observations at one location can take the place of less precise measurements at a different location; this could be accomplished by advancing wave models and using them to extend data to nearby areas. However, the consensus was that models are no substitute for in situ wave data.
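The "First 5" parameters mentioned above, total wave energy plus the first four directional Fourier coefficients a1, b1, a2, b2, can be computed from a discretized directional spreading function. A sketch using a synthetic cos²-type spreading (illustrative data, not workshop material):

```python
from math import cos, sin, atan2

def first_five(theta, D, energy):
    """"First 5" wave parameters: total energy plus the directional
    Fourier coefficients a1, b1, a2, b2 of a discretized spreading
    function D(theta). theta: uniformly spaced angles [rad]; D is
    normalized so that sum(D) * dtheta == 1."""
    dtheta = theta[1] - theta[0]
    a1 = sum(d * cos(t) for t, d in zip(theta, D)) * dtheta
    b1 = sum(d * sin(t) for t, d in zip(theta, D)) * dtheta
    a2 = sum(d * cos(2 * t) for t, d in zip(theta, D)) * dtheta
    b2 = sum(d * sin(2 * t) for t, d in zip(theta, D)) * dtheta
    return energy, a1, b1, a2, b2

def mean_direction(a1, b1):
    """Mean wave direction [rad] from the first-order coefficients."""
    return atan2(b1, a1)
```

For a cos² spreading centered on a direction θ0, the recovered mean direction equals θ0 and a1 reduces analytically to (8 / 3π) cos θ0, which makes a convenient sanity check.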
Abstract:
182 pages : illustrated.