Abstract:
PURPOSE: All kinds of blood manipulations aim to increase the total hemoglobin mass (tHb-mass). To establish tHb-mass as an effective screening parameter for detecting blood doping, knowledge of its normal variation over time is necessary. The aim of the present study, therefore, was to determine the intraindividual variance of tHb-mass in elite athletes during a training year, covering the off, training, and race seasons at sea level. METHODS: tHb-mass and hemoglobin concentration ([Hb]) were determined in 24 endurance athletes five times during a year and compared with a control group (n = 6). An analysis of covariance was used to test the effects of training phases, age, gender, competition level, body mass, and training volume. Three error models were tested, based on 1) a total percentage error of measurement, 2) the combination of a typical percentage error (TE) of analytical origin with an absolute SD of biological origin, and 3) between-subject and within-subject variance components as obtained by an analysis of variance. RESULTS: In addition to the expected influence of performance status, the main results were that the effects of training volume (P = 0.20) and training phases (P = 0.81) on tHb-mass were not significant. Within-subject variation was mainly of analytical origin (TE approximately 1.4%), with a very small SD (7.5 g) of biological origin. CONCLUSION: tHb-mass shows very low individual oscillations during a training year (<6%), and these oscillations are below the expected changes in tHb-mass due to erythropoietin (EPO) application or blood infusion (approximately 10%). The high stability of tHb-mass over a period of 1 year suggests that it should be included in an athlete's biological passport and analyzed by recently developed probabilistic inference techniques that define subject-based reference ranges.
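The second error model above combines an analytical typical error, expressed as a percentage of the mean, with an absolute biological SD. A minimal sketch of that combination, assuming the two error sources are independent and using the values reported in the abstract (TE ≈ 1.4%, biological SD ≈ 7.5 g); the function name and the 900 g example value are illustrative, not from the study:

```python
import math

def within_subject_sd(mean_thb_g, te_percent=1.4, sd_bio_g=7.5):
    """Combine an analytical typical error (a percentage of the mean) with an
    absolute biological SD, assuming the two error sources are independent."""
    sd_analytical = mean_thb_g * te_percent / 100.0
    return math.sqrt(sd_analytical ** 2 + sd_bio_g ** 2)

# For an athlete with a tHb-mass around 900 g:
sd = within_subject_sd(900.0)  # roughly 14.7 g of within-subject variation
```

A within-subject variation of this magnitude is well below the roughly 10% shift expected from EPO application or blood infusion, which is the abstract's argument for longitudinal screening.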
Using 3D surface datasets to understand landslide evolution: From analogue models to real case study
Abstract:
Early detection of landslide surface deformation with 3D remote sensing techniques, such as terrestrial laser scanning (TLS), has become a great challenge during the last decade. To improve our understanding of landslide deformation, a series of analogue simulations was carried out on non-rigid bodies coupled with a 3D digitizer. All experiments were carried out under controlled conditions, such as water level and slope inclination. We were able to follow the 3D surface deformation of complex landslide bodies from precursory deformation through to larger failures. These experiments were the basis for the development of a new algorithm for the quantification of surface deformation using an automatic tracking method on discrete points of the slope surface. To validate the algorithm, manually obtained measurements were compared with the surface displacements it computed. The outputs will help in understanding 3D deformation during pre-failure stages and failure mechanisms, which are fundamental aspects for the future implementation of 3D remote sensing techniques in early warning systems.
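The tracking algorithm above quantifies displacement at discrete surface points between successive scans. A minimal sketch of that displacement step, assuming point correspondences between epochs are already established; the matching itself, and all coordinates below, are illustrative:

```python
import numpy as np

def point_displacements(points_t0, points_t1):
    """Per-point 3D displacement vectors and their magnitudes between two
    epochs, assuming markers appear in the same order in both point sets."""
    p0 = np.asarray(points_t0, dtype=float)
    p1 = np.asarray(points_t1, dtype=float)
    vectors = p1 - p0                             # displacement vectors
    magnitudes = np.linalg.norm(vectors, axis=1)  # displacement per point
    return vectors, magnitudes

# Two tracked markers on a model slope (synthetic coordinates, in mm):
v, m = point_displacements([[0, 0, 100], [10, 0, 98]],
                           [[0.5, 0, 99], [11, 0, 96]])
```

In practice the per-point magnitudes would be monitored over time, with accelerating displacement serving as the precursory signal the abstract describes.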
Abstract:
Members of the leucine-rich repeat protein family are involved in diverse functions including protein phosphatase 2 inhibition, cell cycle regulation, gene regulation and signalling pathways. A novel Schistosoma mansoni gene, called SmLANP, presenting homology to various genes coding for proteins of the leucine-rich repeat superfamily, is characterized here. SmLANP was 1184 bp in length, as determined from cDNA and genomic sequences, and encoded a 296 amino acid open reading frame spanning from 6 to 894 bp. The predicted amino acid sequence had a calculated molecular weight of 32 kDa. Analysis of the predicted sequence indicated the presence of three leucine-rich repeat (LRR) domains located in the N-terminal region and an aspartic acid-rich region at the C-terminal end. The SmLANP transcript is expressed in all stages of the S. mansoni life cycle analyzed, exhibiting the highest expression level in males. The SmLANP protein was expressed in a GST expression system and antibodies were raised in mice against the recombinant protein. An immunolocalization assay using adult worms showed that the protein is mainly present in cell nuclei throughout the whole body and strongly expressed in the tegument cell body nuclei of adult worms. As members of this family are usually involved in protein-protein interactions, a yeast two-hybrid assay was conducted to identify putative binding partners for SmLANP. Thirty-six possible partners were identified, and ATP synthase subunit alpha was confirmed by pull-down assays as a binding partner of the SmLANP protein.
Abstract:
Geographical Information Systems (GIS) facilitate access to epidemiological data through visualization and may be consulted for the development of mathematical models and analysis by spatial statistics. Variables such as land cover, land use, elevation, surface temperature, rainfall etc. emanating from earth-observing satellites complement GIS, as this information allows the analysis of disease distribution based on environmental characteristics. The strength of this approach stems from the specific environmental requirements of those causative infectious agents that depend on intermediate hosts for their transmission. The distribution of these diseases is restricted both by the environmental requirements of their intermediate hosts/vectors and by the ambient temperature inside these hosts, which effectively governs the speed of maturation of the parasite. This paper discusses current capabilities with regard to satellite data collection in terms of the resolution (spatial, temporal and spectral) of the sensor instruments on board, drawing attention to the utility of computer-based models of the Earth for epidemiological research. Virtual globes, available from Google and other commercial firms, are superior to conventional maps as they not only show geographical and man-made features, but also allow instant import of datasets of specific interest, e.g. environmental parameters, demographic information etc., from the Internet.
Abstract:
Trichomonas vaginalis and Tritrichomonas foetus are parasitic, flagellated protists that inhabit the urogenital tract of humans and bovines, respectively. T. vaginalis causes the most prevalent non-viral sexually transmitted disease worldwide and has been associated with an increased risk for human immunodeficiency virus-1 infection in humans. Infections by T. foetus cause significant losses to the beef industry worldwide due to infertility and spontaneous abortion in cows. Several studies have shown a close association between trichomonads and the epithelium of the urogenital tract. However, little is known concerning the interaction of trichomonads with cells from deeper tissues, such as fibroblasts and muscle cells. Published parasite-host cell interaction studies have reported contradictory results regarding the ability of T. foetus and T. vaginalis to interact with and damage cells of different tissues. In this study, parasite-host cell interactions were examined by culturing trichomonads with primary human fibroblasts obtained from abdominal biopsies performed during plastic surgeries. In addition, mouse 3T3 fibroblasts, primary chick embryo myogenic cells and L6 muscle cells were also used as models of target cells. The parasite-host cell cultures were processed for scanning and transmission electron microscopy and were tested for cell viability and cell death. JC-1 staining, which measures mitochondrial membrane potential, was used to determine whether the parasites induced target cell damage. Terminal deoxynucleotidyltransferase-mediated dUTP nick end labelling staining was used as an indicator of chromatin damage. The colorimetric crystal violet assay was performed to analyse the cytotoxicity induced by the parasite. The results showed that T. foetus and T. vaginalis adhered to and were cytotoxic to both fibroblasts and muscle cells, indicating that trichomonad infection of the connective and muscle tissues is likely to occur; such infections could pose serious risks to the infected host.
Abstract:
Malaria remains a major world health problem following the emergence and spread of Plasmodium falciparum strains that are resistant to the majority of antimalarial drugs. This problem has since been aggravated by a decreased sensitivity of Plasmodium vivax to chloroquine. This review discusses strategies for evaluating the antimalarial activity of new compounds in vitro and in animal models, ranging from conventional tests to the latest high-throughput screening technologies. Antimalarial discovery approaches include the following: the discovery of antimalarials from natural sources, chemical modification of existing antimalarials, the development of hybrid compounds, the testing of commercially available drugs that have been approved for human use for other diseases, and molecular modelling using virtual screening technology and docking. Using these approaches, thousands of new drugs with known molecular specificity that are active against P. falciparum have been selected. The inhibition of haemozoin formation in vitro, an indirect test that does not require P. falciparum cultures, has been described, and this test is believed to improve antimalarial drug discovery. Clinical trials conducted with new funds from international agencies and the participation of several industries committed to the eradication of malaria should accelerate the discovery of drugs that are as effective as artemisinin derivatives, thus providing new hope for the control of malaria.
Abstract:
A variety of host immunogenetic factors appear to influence both an individual's susceptibility to infection with Mycobacterium leprae and the pathologic course of the disease. Animal models can contribute to a better understanding of the role of immunogenetics in leprosy through comparative studies, helping to confirm the significance of various identified traits and to decipher the underlying mechanisms that may be involved in the expression of different disease-related phenotypes. Genetically engineered mice, with specific immune or biochemical pathway defects, are particularly useful for investigating granuloma formation and resistance to infection and are shedding new light on borderline areas of the leprosy spectrum, which are clinically unstable and have a tendency toward immunological complications. Although research tools for armadillos are less developed, these animals are the only other natural hosts of M. leprae and they present a unique opportunity for comparative study of genetic markers and mechanisms associated with disease susceptibility or resistance, especially the neurological aspects of leprosy. In this paper, we review the recent contributions of genetically engineered mice and armadillos toward our understanding of the immunogenetics of leprosy.
Abstract:
Trans-apical aortic valve replacement (AVR) is a new and rapidly growing therapy. However, there are only a few training opportunities. The objective of our work is to build an appropriate artificial model of the heart that can replace the use of animals for surgical training in trans-apical AVR procedures. To reduce the need for fluoroscopy, we pursued the goal of building a translucent model of the heart with lifelike dimensions. A simplified 3D model of a human heart with its aortic root was created in silico using the SolidWorks Computer-Aided Design (CAD) program. This heart model was printed using a rapid prototyping system developed by the Fab@Home project and dip-coated twice with dispersion silicone. The translucency of the heart model allows the deployment area of the valved stent to be seen without heavy imaging support. The final model was then placed in a human manikin for surgical training on trans-apical AVR procedures. Trans-apical AVR with all the necessary steps (puncture, wiring, catheterization, ballooning etc.) can be performed repeatedly in this setting.
Abstract:
X-ray is a technology used for numerous applications in the medical field. X-ray projection produces a 2-dimensional (2D) grey-level texture from a 3-dimensional (3D) object. Until now, no clear demonstration or correlation has positioned 2D texture analysis as a valid indirect evaluation of the 3D microarchitecture. TBS is a new texture parameter based on the measurement of the experimental variogram: it evaluates the variation between grey levels in the 2D image. The aim of this study was to evaluate correlations between 3D bone microarchitecture parameters, evaluated from μCT reconstructions, and the TBS value calculated on 2D projected images. 30 dried human cadaveric vertebrae were acquired on a micro-scanner (eXplorer Locus, GE) at an isotropic resolution of 93 μm. 3D vertebral body models were used, with the following 3D microarchitecture parameters: bone volume fraction (BV/TV), trabecular thickness (TbTh), trabecular spacing (TbSp), trabecular number (TbN) and connectivity density (ConnD). 3D-to-2D projections were computed taking into account the Beer-Lambert law at X-ray energies of 50, 100 and 150 keV. TBS was assessed on the 2D projected images. Correlations between TBS and the 3D microarchitecture parameters were evaluated using linear regression analysis. A paired t-test was used to assess the effect of X-ray energy on TBS. Multiple linear regressions (backward) were used to evaluate relationships between TBS and 3D microarchitecture parameters using a bootstrap process. BV/TV of the sample ranged from 18.5 to 37.6%, with an average value of 28.8%. Correlation analysis showed that TBS was strongly correlated with ConnD (0.856 ≤ r ≤ 0.862; p < 0.001) and with TbN (0.805 ≤ r ≤ 0.810; p < 0.001), and negatively correlated with TbSp (−0.714 ≤ r ≤ −0.726; p < 0.001), regardless of X-ray energy. The results show that lower TBS values are related to "degraded" microarchitecture, with low ConnD, low TbN and high TbSp; the opposite is also true. X-ray energy had no effect on TBS or on the correlations between TBS and the 3D microarchitecture parameters. In this study, we demonstrated that TBS was significantly correlated with the 3D microarchitecture parameters ConnD and TbN, and negatively with TbSp, no matter what X-ray energy was used. This article is part of a Special Issue entitled ECTS 2011. Disclosure of interest: None declared.
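TBS is computed from the experimental variogram of the projected grey-level image. A minimal sketch of a one-directional experimental variogram along image rows; the TBS value is then derived from the initial slope of this curve on log-log axes, a fitting step whose details are not given in the abstract, so this function is an illustrative simplification:

```python
import numpy as np

def experimental_variogram(image, max_lag):
    """gamma(h): half the mean squared grey-level difference between pixels
    separated by a horizontal lag of h pixels."""
    img = np.asarray(image, dtype=float)
    gammas = []
    for h in range(1, max_lag + 1):
        diffs = img[:, h:] - img[:, :-h]
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(gammas)

# An alternating row: maximal contrast at lag 1, identical pixels at lag 2.
g = experimental_variogram([[0, 1, 0, 1]], max_lag=2)  # -> [0.5, 0.0]
```

A fast-rising variogram at short lags corresponds to fine, well-connected texture (high TBS), while a flat one corresponds to the "degraded" microarchitecture described above.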
Abstract:
Cloud computing and its three facets (Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)) are terms that denote new developments in the software industry. In particular, PaaS solutions, also referred to as cloud platforms, are changing the way software is being produced, distributed, consumed, and priced. Software vendors have started considering cloud platforms as a strategic option but are battling to redefine their offerings to embrace PaaS. In contrast to SaaS and IaaS, PaaS allows for value co-creation with partners to develop complementary components and applications. It thus requires multisided business models that bring together two or more distinct customer segments. Understanding how to design PaaS business models to establish a flourishing ecosystem is crucial for software vendors. This doctoral thesis aims to address this issue in three interrelated research parts. First, based on case study research, the thesis provides a deeper understanding of current PaaS business models and their evolution. Second, it analyses and simulates consumers' preferences regarding PaaS business models, using a conjoint approach to find out what determines the choice of cloud platforms. Finally, building on the previous research outcomes, the third part introduces a design theory for the emerging class of PaaS business models, which is grounded on an extensive action design research study with a large European software vendor. Understanding PaaS business models from a market as well as a consumer perspective will, together with the design theory, inform and guide decision makers in their business model innovation plans. It also closes gaps in the research related to PaaS business model design and more generally related to platform business models.
Abstract:
Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the nature of each gene's interactions is believed to play a key role in the stability of the structure. With advances in biology, some effort has been deployed to develop update functions in Boolean models that include recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as the topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. The results of this validation hint at the increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos into the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows the regimes to be discriminated in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research.
The update function confers additional realism to the model, while reducing the complexity and solution space, thus making it easier to investigate.
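A threshold-based Boolean update of the kind described above weighs the promoting and repressing inputs of each gene. A minimal sketch of one synchronous step; the tie-breaking rule (keep the previous state when the signed sum equals the threshold) is a common convention and an assumption here, not taken from the text, as is the toy network:

```python
import numpy as np

def threshold_update(state, weights, thresholds=None):
    """One synchronous step: gene i switches on when the signed sum of its
    active regulators (+1 promoting, -1 repressing) exceeds its threshold,
    switches off below it, and keeps its previous state on a tie."""
    state = np.asarray(state)
    if thresholds is None:
        thresholds = np.zeros(len(state))
    total = weights @ state
    return np.where(total > thresholds, 1,
                    np.where(total < thresholds, 0, state))

# Toy 3-gene network: gene 0 is promoted by gene 1 and repressed by gene 2,
# gene 1 is promoted by gene 0, and gene 2 is promoted by gene 1.
W = np.array([[0, 1, -1],
              [1, 0, 0],
              [0, 1, 0]])
s1 = threshold_update([1, 0, 1], W)  # -> [0, 1, 1]
```

Iterating this map from perturbed initial states, and measuring how fast nearby trajectories diverge, is what a Derrida plot summarizes.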
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly grown to account for up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data at minutely sampling frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been dismissed by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal generated by them passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to models involving stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck process and its variations.
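The Markov-time requirement above simply says a trading signal must be decidable from past and present prices alone. A minimal sketch with a moving-average crossover, one of the classic technical rules; the window lengths are illustrative. At every step the rule inspects only prices up to t, so whether the signal has fired is F_t-measurable:

```python
def crossover_signal(prices, short=3, long=5):
    """Trend-following crossover: +1 when the short moving average is above
    the long one, -1 when below, 0 on a tie or while data are insufficient.
    Only prices[:t+1] are used at step t, making the signal a Markov time."""
    signals = []
    for t in range(len(prices)):
        if t + 1 < long:
            signals.append(0)
            continue
        window = prices[:t + 1]
        sma_short = sum(window[-short:]) / short
        sma_long = sum(window[-long:]) / long
        signals.append(1 if sma_short > sma_long else
                       -1 if sma_short < sma_long else 0)
    return signals

# In a steadily rising series the short average leads the long one:
sig = crossover_signal([1, 2, 3, 4, 5, 6])  # -> [0, 0, 0, 0, 1, 1]
```

A rule that peeked ahead, e.g. "buy at the lowest price of the month", would fail this test: the event cannot be decided until the month has ended.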
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to adapt them to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
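The distance-based variant of pairs trading mentioned above reduces to a z-score rule on the spread between the two assets. A minimal sketch; the entry/exit thresholds are illustrative, and using the full-sample mean and SD introduces look-ahead, so a realistic backtest would estimate them on a rolling window, in line with the calibration discussion above:

```python
import statistics

def zscore_positions(spread, entry=2.0, exit=0.5):
    """Market-neutral rule on a stationary spread: short it above +entry
    z-scores, long it below -entry, flatten once it reverts inside +/-exit."""
    mu = statistics.fmean(spread)
    sigma = statistics.pstdev(spread)
    position, positions = 0, []
    for s in spread:
        z = (s - mu) / sigma
        if z > entry:
            position = -1   # spread unusually wide: short it
        elif z < -entry:
            position = 1    # spread unusually narrow: long it
        elif abs(z) < exit:
            position = 0    # reverted toward the mean: close out
        positions.append(position)
    return positions

# A flat spread with one blow-out: enter short at the spike, exit on reversion.
pos = zscore_positions([0, 0, 0, 0, 10, 0, 0, 0, 0, 0])
```

The co-integration and Ornstein-Uhlenbeck formulations replace this ad hoc z-score with an estimated equilibrium and mean-reversion speed, but the trading logic is the same.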
Abstract:
Prevention of Trypanosoma cruzi infection in mammals likely depends on either preventing the invading trypomastigotes from infecting host cells or the rapid recognition and killing of newly infected cells by T. cruzi-specific T cells. We show here that multiple rounds of infection and cure (by drug therapy) fail to protect mice from reinfection, despite the generation of potent T cell responses. This disappointing result is similar to that obtained with many other vaccine protocols used in attempts to protect animals from T. cruzi infection. We have previously shown that immune recognition of T. cruzi infection is significantly delayed both at the systemic level and at the level of the infected host cell. The systemic delay appears to be the result of a stealth infection process that fails to trigger substantial innate recognition mechanisms, while the delay at the cellular level is related to the immunodominance of highly variable gene family proteins, in particular those of the trans-sialidase family. Here we discuss how these previous studies and the new findings herein impact our thoughts on the potential of prophylactic vaccination to serve a productive role in the prevention of T. cruzi infection and Chagas disease.
Abstract:
BACKGROUND & AIMS: Hy's Law, which states that hepatocellular drug-induced liver injury (DILI) with jaundice indicates a serious reaction, is used widely to determine risk for acute liver failure (ALF). We aimed to optimize the definition of Hy's Law and to develop a model for predicting ALF in patients with DILI. METHODS: We collected data from 771 patients with DILI (805 episodes) from the Spanish DILI registry, from April 1994 through August 2012. We analyzed data collected at DILI recognition and at the time of peak levels of alanine aminotransferase (ALT) and total bilirubin (TBL). RESULTS: Of the 771 patients with DILI, 32 developed ALF. Hepatocellular injury, female sex, high levels of TBL, and a high ratio of aspartate aminotransferase (AST):ALT were independent risk factors for ALF. We compared 3 ways to use Hy's Law to predict which patients would develop ALF; all included TBL greater than 2-fold the upper limit of normal (×ULN) and either an ALT level greater than 3 ×ULN, a ratio (R) value (ALT ×ULN/alkaline phosphatase ×ULN) of 5 or greater, or a new ratio (nR) value (ALT or AST, whichever produced the highest ×ULN value, divided by alkaline phosphatase ×ULN) of 5 or greater. At recognition of DILI, the R- and nR-based models identified patients who developed ALF with 67% and 63% specificity, respectively, whereas use of the ALT level alone identified them with 44% specificity. However, the ALT level and the nR model each identified patients who developed ALF with 90% sensitivity, whereas the R criteria identified them with 83% sensitivity. Equal numbers of patients who did and did not develop ALF had alkaline phosphatase levels greater than 2 ×ULN. An algorithm based on an AST level greater than 17.3 ×ULN, TBL greater than 6.6 ×ULN, and AST:ALT greater than 1.5 identified patients who developed ALF with 82% specificity and 80% sensitivity.
CONCLUSIONS: When applied at DILI recognition, the nR criteria for Hy's Law provide the best balance of sensitivity and specificity, whereas our new composite algorithm provides additional specificity in predicting the ultimate development of ALF.
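The criteria above reduce to simple threshold rules on laboratory values expressed in multiples of ULN. A hedged sketch of the nR-based Hy's Law check and the composite algorithm, with the thresholds taken from the abstract; the function names, argument conventions and example values are ours:

```python
def nR(alt_xuln, ast_xuln, alp_xuln):
    """New ratio: the higher of ALT and AST (in multiples of ULN),
    divided by alkaline phosphatase (in multiples of ULN)."""
    return max(alt_xuln, ast_xuln) / alp_xuln

def nr_hys_law(alt_xuln, ast_xuln, alp_xuln, tbl_xuln):
    """nR-based Hy's Law: TBL > 2 x ULN together with an nR of 5 or greater."""
    return tbl_xuln > 2 and nR(alt_xuln, ast_xuln, alp_xuln) >= 5

def composite_algorithm(ast_xuln, tbl_xuln, ast_alt_ratio):
    """Composite rule reported to reach 82% specificity / 80% sensitivity."""
    return ast_xuln > 17.3 and tbl_xuln > 6.6 and ast_alt_ratio > 1.5

# A hypothetical hepatocellular case: AST 20 x ULN, ALT 12 x ULN,
# ALP 2 x ULN, TBL 7 x ULN -> flagged by both rules.
flagged = nr_hys_law(12, 20, 2, 7) and composite_algorithm(20, 7, 20 / 12)
```

Because nR substitutes AST when it is the more elevated enzyme, it keeps the 90% sensitivity of the ALT-only rule while improving its specificity, which is the trade-off the conclusion highlights.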
Abstract:
Despite advances in personalized medicine and targeted therapies, therapeutic resistance remains a persistent dilemma encountered by clinicians, scientists and patients. In this article we summarize the highlights of the third Quebec Conference on Therapeutic Resistance in Cancer. This unique meeting provided researchers and clinicians with insights into: intrinsic and acquired resistance; tumor heterogeneity; complexities of biomarker-driven trials; challenges of 'omics data analysis; and models of clinical applications of personalized medicine. Emphasized throughout the conference was the importance of collaborations - between industry and academia, and between basic researchers and clinicians - so that therapeutic resistance can be studied where it matters most, in patients.