935 results for Simple overlap model


Relevance:

30.00%

Publisher:

Abstract:

Light-induced lipophilic porphyrin/aqueous acceptor charge separation across a single lipid-water interface can pump protons across the lipid bilayer when the hydrophobic weak acids carbonyl cyanide m-chlorophenylhydrazone and its p-trifluoromethoxyphenyl analogue are present. These compounds act as proton carriers across lipid bilayers. When they are present symmetrically on both sides of the bilayer, the positive currents and voltages produced by the photogeneration of porphyrin cations are replaced by larger negative currents and voltages. The maximum negative current and voltage occur at the pH of maximum dark conductance. The reversed, larger current and voltage indicate positive ionic charge transported in the same direction as the electron transfer. This transport can form an ion concentration gradient. The movement of protons is verified by an unusual D2O isotope effect that increases the negative ionic current 2- to 3-fold. These effects suggest that an interfacial pK shift of the weak acid, caused by the local electric field of the photoformed porphyrin cations/acceptor anions, functions as the driving force. The estimated pumping efficiency is 10-30%. Time-resolved results show that proton pumping across the bilayer occurs on the millisecond time scale, similar to that of biological pumps. This light-driven, proteinless pump offers a simple model for a prebiological energy transducer.

Relevance:

30.00%

Publisher:

Abstract:

Theoretical advantages of nonparametric logarithm of odds to map polygenic diseases are supported by tests of the beta model that depends on a single logistic parameter and is the only model under which paternal and maternal transmissions to sibs of specified phenotypes are independent. Although it does not precisely describe recurrence risks in monozygous twins, the beta model has greater power to detect family resemblance or linkage than the more general delta model which describes the probability of 0, 1, or 2 alleles identical by descent (ibd) with two parameters. Available data on ibd in sibs are consistent with the beta model, but not with the equally parsimonious but less powerful gamma model that assumes a fixed probability of 1/2 for 1 allele ibd. Additivity of loci on the liability scale is not disproven. A simple equivalence extends the beta model to multipoint analysis.
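
For orientation, the three parameterizations contrasted above can be written out. This is a sketch based only on the abstract's description; the symbols (z_i for ibd probabilities, p and beta for the logistic parameter) are chosen here for illustration and are not necessarily the paper's own notation.

```latex
% Sib-pair probabilities z_i of sharing i = 0, 1, 2 alleles ibd.
\[
  \text{delta model:}\quad (z_0, z_1, z_2),\qquad z_0 + z_1 + z_2 = 1
  \quad\text{(two free parameters)}
\]
\[
  \text{gamma model:}\quad z_1 = \tfrac12 \ \text{fixed}
  \quad\text{(equally parsimonious, less powerful)}
\]
% If paternal and maternal transmissions are independent, each sharing
% ibd with probability p, the z_i become binomial in p, one parameter:
\[
  z_0 = (1-p)^2,\qquad z_1 = 2p(1-p),\qquad z_2 = p^2,
  \qquad p = \frac{e^{\beta}}{1+e^{\beta}}.
\]
```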

Relevance:

30.00%

Publisher:

Abstract:

Chlorarachniophyte algae contain a complex, multi-membraned chloroplast derived from the endosymbiosis of a eukaryotic alga. The vestigial nucleus of the endosymbiont, called the nucleomorph, contains only three small linear chromosomes and has a haploid genome size of 380 kb, the smallest known eukaryotic genome. Nucleotide sequence data from a subtelomeric fragment of chromosome III were analyzed as a preliminary investigation of the coding capacity of this vestigial genome. Several housekeeping genes were identified, including U6 small nuclear RNA (snRNA), ribosomal proteins S4 and S13, a core protein of the spliceosome [small nuclear ribonucleoprotein (snRNP) E], and a clp-like protease (clpP). Expression of these genes was confirmed by combinations of Northern blot analysis, in situ hybridization, immunocytochemistry, and cDNA analysis. The protein-encoding genes are typically eukaryotic in overall structure, and their messenger RNAs are polyadenylated. A novel feature is the abundance of 18-, 19-, or 20-nucleotide introns, the smallest spliceosomal introns known. Two of the genes, U6 and S13, overlap, while another two genes, snRNP E and clpP, are cotranscribed in a single mRNA. The overall gene organization is extraordinarily compact, making the nucleomorph a unique model for eukaryotic genomics.

Relevance:

30.00%

Publisher:

Abstract:

Recently, individual two-headed kinesin molecules have been studied in in vitro motility assays, revealing a number of their peculiar transport properties. In this paper we propose a simple and robust model for the kinesin stepping process, with elastically coupled Brownian heads, that reproduces all of these properties. The analytic and numerical treatment of our model yields a very good fit to the experimental data and has practically no free parameters. Changing the values of the parameters within the restricted range allowed by the related experimental estimates has almost no effect on the shape of the curves and results mainly in a variation of the zero-load velocity, which can be fitted directly to the measured data. In addition, the model is consistent with the measured pathway of the kinesin ATPase.
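
For a feel of what "elastically coupled Brownian heads" can mean numerically, here is a minimal sketch: two overdamped heads joined by a spring, diffusing on a periodic track potential under load, integrated with the Euler-Maruyama scheme. The potential shape, all parameter values, and the units are illustrative assumptions, not the paper's model; without the ATP-driven transitions of the actual model, the sketch shows only the mechanical ingredients, not directed stepping.

```python
import numpy as np

# Two elastically coupled Brownian heads on a periodic track potential,
# integrated with Euler-Maruyama. Potential shape, spring constant,
# load and units are illustrative, not the paper's values.
rng = np.random.default_rng(0)

L = 8.0        # track period (the microtubule lattice spacing is ~8 nm)
U0 = 5.0       # potential amplitude, in units of kT
k = 1.0        # spring constant coupling the heads, kT/nm^2
rest = 8.0     # rest length of the elastic coupling, nm
F = 0.5        # opposing external load, kT/nm
dt = 1e-4      # time step (drag coefficient set to 1, fixing the time unit)
steps = 200_000

def track_force(x):
    """Force -dU/dx from the periodic potential U(x) = U0*sin(2*pi*x/L)."""
    return -U0 * (2 * np.pi / L) * np.cos(2 * np.pi * x / L)

x = np.array([0.0, rest])                      # initial head positions
start = x.mean()
for _ in range(steps):
    spring = k * (x[1] - x[0] - rest)          # elastic coupling force
    f = np.array([track_force(x[0]) + spring - F,
                  track_force(x[1]) - spring - F])
    x += f * dt + np.sqrt(2 * dt) * rng.standard_normal(2)

v = (x.mean() - start) / (steps * dt)
print(f"mean drift under load F={F}: {v:.2f} nm per time unit")
```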

Relevance:

30.00%

Publisher:

Abstract:

Most models of tumorigenesis assume that the tumor grows by increased cell division. In these models, it is generally supposed that daughter cells behave as do their parents, and cell numbers have clear potential for exponential growth. We have constructed simple mathematical models of tumorigenesis through failure of programmed cell death (PCD) or differentiation. These models do not assume that descendant cells behave as their parents do. The models predict that exponential growth in cell numbers does sometimes occur, usually when stem cells fail to die or differentiate. At other times, exponential growth does not occur: instead, the number of cells in the population reaches a new, higher equilibrium. This behavior is predicted when fully differentiated cells fail to undergo PCD. When cells of intermediate differentiation fail to die or to differentiate further, the values of growth parameters determine whether growth is exponential or leads to a new equilibrium. The predictions of the model are sensitive to small differences in growth parameters. Failure of PCD and differentiation, leading to a new equilibrium number of cells, may explain many aspects of tumor behavior: for example, early premalignant lesions such as cervical intraepithelial neoplasia; the fact that some tumors very rarely become malignant; the observation of plateaux in the growth of some solid tumors; and, finally, long lag phases of growth until mutations arise that eventually result in exponential growth.
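
A minimal sketch of the flavor of model described here, under invented rates: a homeostatic stem pool feeds an intermediate compartment that matures into fully differentiated cells, which PCD clears. Reducing the PCD rate of differentiated cells moves the population to a new, higher equilibrium rather than exponential growth, matching the abstract's prediction for that failure mode.

```python
# Illustrative three-compartment sketch: a homeostatic stem pool (S)
# feeds intermediate cells (I), which mature into fully differentiated
# cells (D); D is cleared by programmed cell death (PCD). All rate
# constants are invented for illustration only.

def total_cells(pcd_rate, t_max=400.0, dt=0.01):
    a = 0.2              # stem -> intermediate flux (stem pool held constant)
    b = 0.3              # intermediate -> differentiated maturation rate
    S = 1.0              # homeostatic stem pool
    I = D = 0.0
    for _ in range(int(t_max / dt)):
        dI = a * S - b * I
        dD = b * I - pcd_rate * D        # PCD clears differentiated cells
        I += dI * dt
        D += dD * dt
    return S + I + D

# Normal PCD: the population settles at an equilibrium set by the rates.
print("normal PCD :", round(total_cells(pcd_rate=0.05), 2))   # ~5.67
# Reduced PCD of differentiated cells: a new, higher equilibrium rather
# than exponential growth, the abstract's prediction for this failure.
print("reduced PCD:", round(total_cells(pcd_rate=0.02), 2))   # ~11.67
```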

Relevance:

30.00%

Publisher:

Abstract:

We introduce a model of a nonlinear double-barrier structure that describes, in a simple way, the effects of electron-electron scattering while remaining analytically tractable. The model is based on a generalized effective-mass equation in which a nonlinear local field interaction is introduced to account for these inelastic scattering phenomena. Resonance peaks seen in the transmission coefficient spectra for the linear case appear shifted to higher energies, depending on the magnitude of the nonlinear coupling. Our results are in good agreement with self-consistent solutions of the Schrödinger and Poisson equations. The calculation procedure is very fast, which makes our technique a good candidate for rapid approximate analysis of these structures.
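
For context, a sketch of the linear baseline whose resonances the nonlinear coupling shifts: transmission through a symmetric double barrier computed with piecewise-constant transfer matrices, in units where hbar = 2m = 1. The geometry and barrier height are invented, and the paper's nonlinear local-field term is deliberately not included.

```python
import numpy as np

# Linear double-barrier transmission via piecewise-constant transfer
# matrices, in units hbar = 2m = 1 (local wavevector k = sqrt(E - V)).
# Geometry and height are invented; the paper's nonlinear local-field
# term is NOT included, only the baseline whose resonances it shifts.

def transmission(E, V=1.0, b=1.5, w=5.0):
    """T(E) for two barriers (height V, width b) around a well of width w."""
    Vs = [0.0, V, 0.0, V, 0.0]                 # lead/barrier/well/barrier/lead
    xs = [0.0, b, b + w, 2 * b + w]            # interface positions
    ks = [np.sqrt(complex(E - v)) for v in Vs]
    M = np.eye(2, dtype=complex)
    for j in range(4):                          # match psi, psi' at interfaces
        k1, k2, x = ks[j], ks[j + 1], xs[j]
        m = 0.5 * np.array(
            [[(1 + k1 / k2) * np.exp(1j * (k1 - k2) * x),
              (1 - k1 / k2) * np.exp(-1j * (k1 + k2) * x)],
             [(1 - k1 / k2) * np.exp(1j * (k1 + k2) * x),
              (1 + k1 / k2) * np.exp(-1j * (k1 - k2) * x)]])
        M = m @ M
    t = np.linalg.det(M) / M[1, 1]              # transmitted amplitude
    return (abs(t) ** 2 * (ks[-1] / ks[0])).real

for E in np.linspace(0.05, 1.5, 8):
    print(f"E = {E:4.2f}   T = {transmission(E):.4f}")
```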

Relevance:

30.00%

Publisher:

Abstract:

The ECHAM-1 T21/LSG coupled ocean-atmosphere general circulation model (GCM) is used to simulate climatic conditions at the last interglacial maximum (Eemian, 125 kyr BP). The results reflect the expected surface temperature changes (with respect to the control run) due to the amplification (reduction) of the seasonal cycle of insolation in the Northern (Southern) Hemisphere. A number of simulated features agree with previous results from atmospheric GCM simulations (e.g., intensified summer southwest monsoons), except in the Northern Hemisphere poleward of 30 degrees N, where dynamical feedbacks in the North Atlantic and North Pacific increase zonal temperatures about 1 degree C above what would be predicted from simple energy balance considerations. As this is the same area where most of the terrestrial geological data originate, this result suggests that previous estimates of Eemian global average temperature might have been biased by sample distribution. This conclusion is supported by the fact that the estimated global temperature increase of only 0.3 degrees C over the control run has previously been shown to be consistent with CLIMAP sea surface temperature estimates. Although the Northern Hemisphere summer monsoon is intensified, globally averaged precipitation over land is within about 1% of the present, contravening some geological inferences but not the deep-sea delta(13)C estimates of terrestrial carbon storage changes. Winter circulation changes in the northern Arabian Sea, driven by strong cooling on land, are as large as the summer circulation changes that are the usual focus of interest, suggesting that interpreting variations in the Arabian Sea sedimentary record solely in terms of the summer monsoon response could sometimes lead to errors. A small monsoonal response over northern South America suggests that interglacial paleotrends in this region were not due solely to El Niño variations.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we describe Fénix, a data model for exchanging information between Natural Language Processing applications. The proposed format is intended to be flexible enough to cover both current and future data structures employed in the field of Computational Linguistics. The Fénix architecture is divided into four separate layers: conceptual, logical, persistence and physical. This division provides a simple interface that abstracts users from low-level implementation details, such as the programming languages and data storage employed, allowing them to focus on the concepts and processes being modelled. The Fénix architecture is accompanied by a set of programming libraries that facilitate access to and manipulation of the structures created in this framework. We also show how this architecture has already been successfully applied in different research projects.
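
As a purely hypothetical illustration of such a four-layer separation (every class and method name below is invented for this sketch and is not taken from the actual Fénix libraries), the layering might be organized like this:

```python
# Hypothetical sketch of a conceptual/logical/persistence/physical split;
# all names are invented, not taken from Fénix itself.
import json
from abc import ABC, abstractmethod

class PhysicalLayer(ABC):
    """Physical layer: where the bytes actually live."""
    @abstractmethod
    def read(self, key: str) -> bytes: ...
    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...

class InMemoryPhysical(PhysicalLayer):
    """Trivial physical backend for the example."""
    def __init__(self):
        self._blobs: dict[str, bytes] = {}
    def read(self, key: str) -> bytes:
        return self._blobs[key]
    def write(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

class PersistenceLayer:
    """Persistence layer: serializes records onto a physical backend."""
    def __init__(self, backend: PhysicalLayer):
        self.backend = backend
    def save(self, key: str, record: dict) -> None:
        self.backend.write(key, json.dumps(record).encode())

class LogicalLayer:
    """Logical layer: concrete data structures (documents, annotations)."""
    def __init__(self, store: PersistenceLayer):
        self.store = store
    def add_record(self, key: str, record: dict) -> None:
        self.store.save(key, record)

class ConceptualLayer:
    """Conceptual layer: what users work with, free of storage details."""
    def __init__(self, logic: LogicalLayer):
        self.logic = logic
    def add_annotation(self, doc_id: str, annotation: dict) -> None:
        self.logic.add_record(f"{doc_id}/annotation", annotation)

api = ConceptualLayer(LogicalLayer(PersistenceLayer(InMemoryPhysical())))
api.add_annotation("doc1", {"type": "token", "text": "Fénix"})
```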

Relevance:

30.00%

Publisher:

Abstract:

The continuous improvement of management and assessment processes for curricular external internships has led a group of university teachers specialised in this area to develop a mixed measurement model that combines the verification of skill acquisition by those students choosing external internships with the satisfaction of the parties involved in the process: academics, educational tutors at companies and organisations, and administration and services personnel. The experience, developed at the University of Alicante, has been carried out in the degrees of Business Administration and Management, Business Studies, Economics, Advertising and Public Relations, Sociology, and Social Work, all part of the Faculty of Economics and Business. By designing and managing closed standardised interviews and other research tools, validated outside the centre, a system of continuous improvement and quality assurance has been created, clearly contributing to the gradual increase in the number of students with internships in this Faculty, as well as to the improvement in satisfaction, efficiency and efficacy indicators at a global level. As this experience of educational innovation has shown, the acquisition of curricular knowledge, skills, abilities and competences by the students is directly correlated with the satisfaction of the parties involved in a process that takes the student beyond the physical borders of a university campus. Ensuring the latter is made easier by the implementation of a mixed assessment method, combining continuous and final assessment, characterised by its rigour and simple management. This report presents that model, itself subject to persistent and continuous control, in which all parties involved in the external internships take part. Its short-term results include an increase, estimated at 15% for the last academic year, in the number of students choosing curricular internships and, in the medium and long term, a closer interweaving between the academic world and its social and productive environment, in both the business and institutional spheres. The potential of this assessment model lies not only in the quality of its measurement tools, but also in its effects on the various groups and in the actions carried out as a result of its implementation, which, as shown below, are the real guarantee of continuous improvement.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a simple algorithm for assessing the validity of the RVoG model for PolInSAR-based inversion techniques. The approach makes use of two important features characterizing a homogeneous random volume over a ground surface: the independence of wave propagation through the volume from the polarization state, and the structure of the polarimetric interferometric coherency matrix. These two features have led to two different methods proposed in the literature for retrieving the topographic phase within natural covers: the well-known line fitting procedure, and the observation of the (1, 2) element of the polarimetric interferometric coherency matrix. We show that differences between the outputs of the two approaches can be interpreted in terms of PolInSAR modeling based on the Freeman-Durden concept, and this leads to the definition of an RVoG/non-RVoG test. The algorithm is tested with both indoor and airborne data over agricultural and tropical forest areas.
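
A hedged numpy sketch of the comparison such a test rests on: estimate the topographic phase once by a total-least-squares line fit to the complex coherences, once from a given (1, 2)-element phase, and flag non-RVoG behaviour when they disagree. The line-fit details, the intersection choice, the threshold, and the data are all invented for illustration; the paper's actual algorithm is more involved.

```python
import numpy as np

# Two topographic-phase estimates compared, as the abstract describes.
# Fit details, threshold, and data are illustrative only.

def ground_phase_line_fit(coherences):
    """Fit a line to complex coherences (total least squares) and return
    the phase of its unit-circle intersection nearest the data."""
    pts = np.column_stack([coherences.real, coherences.imag])
    center = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - center)   # principal direction = line
    d = vt[0]
    # Solve |center + t*d|^2 = 1 for t (assumes the line reaches the circle).
    t = np.roots([d @ d, 2 * center @ d, center @ center - 1.0])
    cands = [center + ti * d for ti in np.real(t)]
    # Illustrative choice: the intersection closest to the coherence cluster.
    best = min(cands, key=lambda p: np.min(np.hypot(*(pts - p).T)))
    return np.angle(best[0] + 1j * best[1])

# Invented test data: coherences roughly on a line, plus an omega_12 phase.
gamma = np.array([0.8 * np.exp(0.3j), 0.6 * np.exp(0.6j), 0.45 * np.exp(0.8j)])
phi_line = ground_phase_line_fit(gamma)
phi_12 = 0.25   # stand-in for the phase of the (1, 2) matrix element
diff = abs(np.angle(np.exp(1j * (phi_line - phi_12))))
print(f"line fit {phi_line:.2f} rad, omega12 {phi_12:.2f} rad,"
      f" RVoG consistent: {diff < 0.2}")      # 0.2 rad: illustrative threshold
```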

Relevance:

30.00%

Publisher:

Abstract:

In this work, we propose the use of the neural gas (NG), a neural network that uses an unsupervised Competitive Hebbian Learning (CHL) rule, to develop a reverse engineering process. This is a simple and accurate method to reconstruct objects from point clouds obtained from multiple overlapping views using low-cost sensors. In contrast to other methods that may need several stages, including downsampling, noise filtering and many other tasks, the NG automatically obtains the 3D model of the scanned objects. To demonstrate the validity of our proposal we tested our method with several models and performed a study of the neural network parameterization, computing the quality of representation and comparing results with other neural methods like the growing neural gas and Kohonen maps, as well as classical methods like Voxel Grid. We also reconstructed models acquired by low-cost sensors that can be used in virtual and augmented reality environments for redesign or manipulation purposes. Since the NG algorithm has a high computational cost, we propose its acceleration: we have redesigned and implemented the NG learning algorithm to fit it onto Graphics Processing Units using CUDA. A speed-up of 180× is obtained compared to the sequential CPU version.
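
The core neural gas learning rule is compact: every codebook unit moves toward each sample, weighted by a decaying function of its distance rank. A minimal numpy sketch follows, with a synthetic point cloud standing in for a scan and all hyperparameters invented; the paper couples this rule to mesh reconstruction and a CUDA implementation, neither of which is reproduced here.

```python
import numpy as np

# Minimal neural gas: rank-based adaptation of a codebook toward a 3D
# point cloud. Schedules and the synthetic "scan" are illustrative.
rng = np.random.default_rng(1)
cloud = rng.normal(size=(5000, 3))                      # stand-in scan
cloud /= np.linalg.norm(cloud, axis=1, keepdims=True)   # points on a sphere

n_units, t_max = 100, 20_000
W = cloud[rng.choice(len(cloud), n_units)]   # init codebook from the data

eps_i, eps_f = 0.5, 0.01              # learning-rate schedule
lam_i, lam_f = n_units / 2, 0.1       # neighborhood-range schedule

for t in range(t_max):
    frac = t / t_max
    eps = eps_i * (eps_f / eps_i) ** frac
    lam = lam_i * (lam_f / lam_i) ** frac
    x = cloud[rng.integers(len(cloud))]
    # Rank every unit by distance to the sample (the "gas" ordering).
    ranks = np.argsort(np.argsort(np.linalg.norm(W - x, axis=1)))
    # Move all units toward x, weighted by exp(-rank / lambda).
    W += (eps * np.exp(-ranks / lam))[:, None] * (x - W)

err = np.mean(np.min(np.linalg.norm(cloud[:, None, :] - W[None], axis=2), axis=1))
print(f"mean distance from cloud to nearest unit: {err:.3f}")
```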

Relevance:

30.00%

Publisher:

Abstract:

Wood is a natural and traditional building material, as popular today as ever, and it presents clear advantages. Physically, wood is strong and stiff, yet compared with materials like steel it is light and flexible. Wood absorbs sound very effectively and is a relatively good heat insulator. But dry wood burns quite easily and produces a great deal of heat energy. The main disadvantage is its combustibility when exposed to fire: the point at which it catches fire lies between 200 and 400°C. After fire exposure, it is necessary to determine whether the charred wooden structures are safe for future use. Design methods require the use of computer modelling to predict the fire exposure and the capacity of structures to resist those actions. Large- or small-scale experimental tests are also necessary to calibrate and verify the numerical models. The thermal model is essential for wood structures exposed to fire, because it predicts the charring rate as a function of fire exposure. For most structural wood elements the charring rate allows simple calculations, but the situation is more complicated where the fire exposure is non-standard and where wood elements are protected with other materials. In this work, the authors present different case studies using numerical models that will help professionals analyse wood elements, and discuss the type of information needed to decide whether charred structures are adequate for continued use. Different thermal models representing wooden cellular slabs, used in building construction for ceiling or flooring compartments, will be analysed and subjected to different fire scenarios (with the standard fire curve exposure). The same numerical models, considering insulation material inside the wooden cellular slabs, will be tested to compare and determine the fire resistance time and the charring rate.
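
For scale, a small sketch of the simplest design calculation alluded to here: one-dimensional char depth under standard fire exposure as a constant charring rate times time. The 0.65 mm/min figure is of the order used for softwood in Eurocode 5 and is taken here only as an assumed input; real designs use notional rates, corner rounding, and protected phases.

```python
# Simplest charring calculation: 1-D char depth = rate * exposure time.
# The 0.65 mm/min softwood rate is an assumed input of the Eurocode 5
# order of magnitude; real design adds notional rates and protection.

def char_depth_mm(minutes: float, beta_mm_per_min: float = 0.65) -> float:
    """Char depth after `minutes` of standard fire exposure."""
    return beta_mm_per_min * minutes

def residual_section(b0_mm: float, h0_mm: float, minutes: float):
    """Residual width/height of a beam charred on three sides (top protected)."""
    d = char_depth_mm(minutes)
    return b0_mm - 2 * d, h0_mm - d

for t in (30, 60, 90):
    b, h = residual_section(200.0, 400.0, t)
    print(f"R{t}: char depth {char_depth_mm(t):.1f} mm"
          f" -> residual section {b:.0f} x {h:.0f} mm")
```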

Relevance:

30.00%

Publisher:

Abstract:

nlcheck is a simple diagnostic tool that can be used after fitting a model to quickly check the linearity assumption for a given predictor. nlcheck categorizes the predictor into bins, refits the model including dummy variables for the bins, and then performs a joint Wald test for the added parameters. Alternatively, nlcheck can use linear splines for the adaptive model. Support for discrete variables is also provided. Optionally, nlcheck also displays a graph of the adjusted linear predictions from the original model and the adaptive model.
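
nlcheck itself is a Stata module; to keep this section's examples in one language, here is a hedged Python analogue of the binning idea on synthetic data: refit with dummies for quantile bins of the predictor and jointly test the added parameters (via an F test on the nested models, standing in for nlcheck's Wald test).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Python analogue of the nlcheck idea (nlcheck itself is Stata): bin the
# predictor, refit with bin dummies, jointly test the added parameters.
# Data are synthetic and mildly nonlinear, so the test should reject.
rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, n)
y = 0.5 * x + 0.08 * x**2 + rng.normal(scale=1.0, size=n)
df = pd.DataFrame({"y": y, "x": x, "bin": pd.qcut(x, 4, labels=False)})

base = smf.ols("y ~ x", data=df).fit()            # linearity assumed
aug = smf.ols("y ~ x + C(bin)", data=df).fit()    # adaptive model with dummies

# Joint test of the bin dummies; a small p-value flags nonlinearity.
print(anova_lm(base, aug))
```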

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-03