916 results for Make to Stock
Abstract:
The geometry and connectivity of fractures exert a strong influence on the flow and transport properties of fracture networks. We present a novel approach to stochastically generate three-dimensional discrete networks of connected fractures that are conditioned to hydrological and geophysical data. A hierarchical rejection sampling algorithm is used to draw realizations from the posterior probability density function at different conditioning levels. The method is applied to a well-studied granitic formation using data acquired within two boreholes located 6 m apart. The prior models include 27 fractures with their geometry (position and orientation) bounded by information derived from single-hole ground-penetrating radar (GPR) data acquired during saline tracer tests and optical televiewer logs. Eleven cross-hole hydraulic connections between fractures in neighboring boreholes and the order in which the tracer arrives at different fractures are used for conditioning. Furthermore, the networks are conditioned to the observed relative hydraulic importance of the different hydraulic connections by numerically simulating the flow response. Among the conditioning data considered, constraints on the relative flow contributions were the most effective in determining the variability among the network realizations. Nevertheless, we find that the posterior model space is strongly determined by the imposed prior bounds. Strong prior bounds were derived from GPR measurements and helped to make the approach computationally feasible. We analyze a set of 230 posterior realizations that reproduce all data given their uncertainties assuming the same uniform transmissivity in all fractures. The posterior models provide valuable statistics on length scales and density of connected fractures, as well as their connectivity. In an additional analysis, effective transmissivity estimates of the posterior realizations indicate a strong influence of the DFN structure, in that it induces large variations of equivalent transmissivities between realizations. The transmissivity estimates agree well with previous estimates at the site based on pumping, flowmeter and temperature data.
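Although the authors' implementation is not given in the abstract, the hierarchical rejection step can be pictured as a loop that draws candidate networks from the prior and screens them through increasingly expensive conditioning tests, rejecting as early as possible. The Python sketch below is a toy illustration under that assumption; the prior bounds, check functions and acceptance rates are placeholders, not the authors' code.

```python
# Minimal sketch of a hierarchical rejection sampler (illustrative only).
import numpy as np

rng = np.random.default_rng(42)

def sample_prior(n_fractures=27):
    """Draw one candidate network: position (x, y, z) and orientation (dip, azimuth)
    for each fracture, uniformly within placeholder prior bounds."""
    positions = rng.uniform(low=[0.0, 0.0, 0.0], high=[6.0, 6.0, 80.0], size=(n_fractures, 3))
    orientations = rng.uniform(low=[0.0, 0.0], high=[90.0, 360.0], size=(n_fractures, 2))
    return {"pos": positions, "ori": orientations}

def honors_connections(network):
    """Level 1: cheap geometric test that the required cross-hole connections exist
    (placeholder: random acceptance)."""
    return rng.random() < 0.3

def honors_arrival_order(network):
    """Level 2: tracer arrival order at the observation fractures (placeholder)."""
    return rng.random() < 0.5

def honors_flow_response(network):
    """Level 3: most expensive test, comparing simulated relative flow
    contributions with the observed ones (placeholder)."""
    return rng.random() < 0.2

posterior = []
n_trials = 0
while len(posterior) < 10:          # 230 realizations in the study; 10 here
    n_trials += 1
    net = sample_prior()
    # Hierarchical rejection: cheap tests first, expensive flow simulation last.
    if not honors_connections(net):
        continue
    if not honors_arrival_order(net):
        continue
    if honors_flow_response(net):
        posterior.append(net)

print(f"accepted {len(posterior)} networks out of {n_trials} prior draws")
```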
Abstract:
In 2008, we celebrated the centenary of the discovery of Toxoplasma gondii. Although this ubiquitous protozoan can cause devastating damage in foetuses and newborns, its treatment is the one field in which, despite a huge body of research, we have made little progress and no regimen has yet been validated. Pregnant women who seroconvert are generally given spiramycin in order to reduce the risk of vertical transmission. However, to date, we have no evidence of the efficacy of this treatment because no randomized controlled trials have yet been conducted. When foetal contamination is demonstrated, pyrimethamine, in association with sulfadoxine or sulfadiazine, is normally prescribed, but the effectiveness of this treatment also remains to be shown. With regard to postnatal treatment, opinions vary considerably in terms of drugs, regimens and length of therapy. Similarly, we do not have clear evidence to support routine antibiotic treatment of acute ocular toxoplasmosis. We must be aware that pregnant women and newborns are currently being given, empirically, potentially toxic drugs that have no proven benefit. We must make progress in this field through well-designed collaborative studies and by drawing the attention of policy makers to this disastrous and unsustainable situation.
Abstract:
Time series regression models are especially suitable in epidemiology for evaluating short-term effects of time-varying exposures on health. The problem is that the potential for confounding in time series regression is very high; thus, it is important that trend and seasonality are properly accounted for. Our paper reviews the statistical models commonly used in time-series regression methods, especially those allowing for serial correlation, which make them potentially useful for selected epidemiological purposes. In particular, we discuss the use of time-series regression for counts using a wide range of Generalised Linear Models as well as Generalised Additive Models. In addition, critical points recently raised about the use of statistical software for GAMs are discussed, and reanalyses of time series data on air pollution and health were performed in order to update results already published. Applications are illustrated through an example of the relationship between asthma emergency admissions and photochemical air pollutants.
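As an illustration of the class of models discussed, the following Python sketch fits a Poisson generalised linear model to simulated daily counts with a linear trend, annual harmonics for seasonality and a pollutant term; the data and coefficient values are invented for demonstration and are not taken from the paper.

```python
# Poisson time-series regression for daily counts with trend and seasonal
# controls (illustrative data, not from the paper).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days = 3 * 365
t = np.arange(n_days)

# Simulated exposure (e.g. a photochemical pollutant) and daily admission counts.
pollutant = 20 + 5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, n_days)
log_mu = 1.5 + 0.0005 * t + 0.3 * np.cos(2 * np.pi * t / 365) + 0.01 * pollutant
admissions = rng.poisson(np.exp(log_mu))

# Design matrix: intercept, linear trend, annual harmonics (seasonality), exposure.
X = np.column_stack([
    np.ones(n_days),
    t,
    np.sin(2 * np.pi * t / 365),
    np.cos(2 * np.pi * t / 365),
    pollutant,
])

model = sm.GLM(admissions, X, family=sm.families.Poisson())
fit = model.fit()
# The exposure coefficient (last column) approximates the log relative rate per
# unit increase in the pollutant, adjusted for trend and seasonality.
print(fit.params[-1], np.exp(fit.params[-1]))
```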
Abstract:
Background: In children, video game experience improves spatial performance, a predictor of surgical performance. This study aims to compare the laparoscopic virtual reality (VR) task performance of children with different levels of video game experience with that of residents. Participants and methods: A total of 32 children (8.4 to 12.1 years), 20 residents, and 14 board-certified surgeons (total n = 66) performed several VR tasks and 2 conventional tasks (cube/spatial and pegboard/fine motor). Performance was compared between the groups (primary outcome), and VR performance was correlated with conventional task performance (secondary outcome). Results: The lowest VR performance was found in children with low video game experience, followed by children with high video game experience, residents, and board-certified surgeons. VR performance correlated well with the spatial test and moderately with the fine motor test. Conclusions: The use of computer games can be considered not only as pure entertainment but may also contribute to the development of skills relevant for adequate performance in VR laparoscopic tasks. Spatial skills are relevant for VR laparoscopic task performance.
Abstract:
This paper presents a theoretical and empirical analysis of strategic competition in retail banking when some of the financial firms are non-profit organisations that invest in social activities. The banking literature on competition is fairly large, but the strategic interaction between profit-maximizing and non-profit-maximizing institutions has not been extensively analysed, except by Purroy and Salas (1999). In this paper, a completely different approach is taken. An adaptation of Hotelling's two-stage model of spatial competition is developed to take into account consumer perceptions with respect to the two different types of financial institution. The empirical analysis confirms that consumers take into account features other than price, such as social contribution or proximity of service, when making a deposit or mortgage decision. These conclusions are of interest in the debate about a firm's social or ethical activities. It is shown that if consumers value social activities, firms can improve
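The abstract does not report the model's functional form; one minimal way such an adaptation could be written, assuming banks located at the ends of a unit line, travel cost t, and an illustrative weight gamma on perceived social contribution s_i (notation not taken from the paper), is:

```latex
% Illustrative sketch only: banks 1 and 2 at the ends of a unit line, price p_i,
% travel cost t, and perceived social contribution s_i entering utility with weight gamma.
\[
  U_i(x) \;=\; v \;-\; p_i \;-\; t\,\lvert x - x_i\rvert \;+\; \gamma\, s_i,
  \qquad i \in \{1,2\},\quad x_1 = 0,\; x_2 = 1,\; x \in [0,1].
\]
% The consumer indifferent between the two institutions is located at
\[
  \hat{x} \;=\; \frac{1}{2} \;+\; \frac{(p_2 - p_1) + \gamma\,(s_1 - s_2)}{2t},
\]
% so demand shifts toward the institution with the higher perceived social
% contribution, even at equal prices.
```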
Abstract:
We take stock of the present position of compositional data analysis, of what has been achieved in the last 20 years, and then make suggestions as to what may be sensible avenues of future research. We take an uncompromisingly applied mathematical view, that the challenge of solving practical problems should motivate our theoretical research; and that any new theory should be thoroughly investigated to see if it may provide answers to previously abandoned practical considerations. Indeed a main theme of this lecture will be to demonstrate this applied mathematical approach by a number of challenging examples.
Abstract:
INTRODUCTION Radiotherapy outcomes might be further improved by a greater understanding of the individual variations in normal tissue reactions that determine tolerance. Most published studies on radiation toxicity have been performed retrospectively. Our prospective study was launched in 1996 to measure the in vitro radiosensitivity of peripheral blood lymphocytes before treatment with radical radiotherapy in patients with breast cancer, and to assess the early and the late radiation skin side effects in the same group of patients. We prospectively recruited consecutive breast cancer patients receiving radiation therapy after breast surgery. To evaluate whether early and late side effects of radiotherapy can be predicted by the assay, a study was conducted of the association between the results of in vitro radiosensitivity tests and acute and late adverse radiation effects. METHODS Intrinsic molecular radiosensitivity was measured by using an initial radiation-induced DNA damage assay on lymphocytes obtained from breast cancer patients before radiotherapy. Acute reactions were assessed in 108 of these patients on the last treatment day. Late morbidity was assessed after 7 years of follow-up in some of these patients. The Radiation Therapy Oncology Group (RTOG) morbidity score system was used for both assessments. RESULTS Radiosensitivity values obtained using the in vitro test showed no relation with the acute or late adverse skin reactions observed. There was no evidence of a relation between acute and late normal tissue reactions assessed in the same patients. A positive relation was found between the treatment volume and both early and late side effects. CONCLUSION After radiation treatment, a number of cells containing major changes can have a long survival and disappear very slowly, becoming a chronic focus of immunological system stimulation. This stimulation can produce, in a stochastic manner, late radiation-related adverse effects of varying severity. Further research is warranted to identify the major determinants of normal tissue radiation response to make it possible to individualize treatments and improve the outcome of radiotherapy in cancer patients.
Abstract:
Despite the wealth of information generated by trans-disciplinary research in Chagas disease, knowledge about its multifaceted pathogenesis is still fragmented. Here we review the body of experimental studies in animal models supporting the concept that persistent infection by Trypanosoma cruzi is crucial for the development of chronic myocarditis. Complementing this review, we will make an effort to reconcile seemingly contradictory results concerning the immune profiles of chronic patients from Argentina and Brazil. Finally, we will review the results of molecular studies suggesting that parasite-induced inflammation and tissue damage is, at least in part, mediated by the activities of trans-sialidase, mucin-linked lipid anchors (TLR2 ligand) and cruzipain (a kinin-releasing cysteine protease). One hundred years after the discovery of Chagas disease, it is reassuring that basic and clinical research tends to converge, raising new perspectives for the treatment of chronic Chagas disease.
Abstract:
Purpose A cadaveric study at our institution demonstrated that optimal baseplate fixation of a reversed shoulder arthroplasty (RSA) could be achieved with screws in three major columns. Our aim was to review our early rate of aseptic glenoid loosening in a series of baseplates fixed according to this principle. Material and Methods Between 2005 and 2008, 48 RSAs (Aequalis Reversed) were implanted in 48 patients with an average age of 74.4 years (range, 56 to 86 years). There were 37 women and 11 men. Twenty-seven primary RSAs were performed for cuff tear arthropathy, 3 after failed rotator cuff surgery, 6 for failed arthroplasties, 7 for acute fractures and 5 after failed ORIF. All baseplate fixations were done using a nonlocking posterior screw in the spine, a nonlocking anterior screw in the glenoid body, a locking superior screw in the coracoid and a locking inferior screw in the pillar. All patients were reviewed with standardized radiographs. The number of screws was recorded. We measured the position of the screws in relation to the scapular spine and the coracoid process in two different views. We defined screw positions as totally in, partially in, or out of the target. Finally, we recorded glenoid aseptic loosening, which was defined as implant subsidence. Results Four patients were lost to follow-up. Thus, 44 shoulders could be reviewed after a mean follow-up of 13 months (range, 6 to 32 months). All baseplates were fixed with 4 screws. Thirty-seven (84%) scapular spine screws were either partially or totally in the spine; thus, 7 (16%) scapular spine screws were out of the target. No coracoid screw was out of the target. Two (4.5%) patients had glenoid loosening. Both had a scapular spine screw and a coracoid screw partially in the bone. Conclusion Early aseptic glenoid loosening occurred before two years of follow-up and is most of the time related to technical problems and/or insufficient bone stock and bone quality. Our study demonstrates that baseplate fixation according to the three-column principle is a reproducible technique and a valuable way to prevent early glenoid loosening.
Abstract:
Imaging mass spectrometry (IMS) represents an innovative tool in the cancer research pipeline, which is increasingly being used in clinical and pharmaceutical applications. The unique properties of the technique, especially the amount of data generated, make the handling of data from multiple IMS acquisitions challenging. This work presents a histology-driven IMS approach aiming to identify discriminant lipid signatures from the simultaneous mining of IMS data sets from multiple samples. The feasibility of the developed workflow is evaluated on a set of three human colorectal cancer liver metastasis (CRCLM) tissue sections. Lipid IMS on tissue sections was performed using MALDI-TOF/TOF MS in both negative and positive ionization modes after 1,5-diaminonaphthalene matrix deposition by sublimation. The combination of both positive and negative acquisition results was performed during data mining to simplify the process and interrogate a larger lipidome in a single analysis. To reduce the complexity of the IMS data sets, a sub-data set was generated by randomly selecting a fixed number of spectra from a histologically defined region of interest, resulting in a 10-fold data reduction. Principal component analysis confirmed that the molecular selectivity of the regions of interest is maintained after data reduction. Partial least-squares and heat map analyses demonstrated a selective signature of the CRCLM, revealing lipids that are significantly up- and down-regulated in the tumor region. This comprehensive approach is thus of interest for defining disease signatures directly from IMS data sets by the use of combinatory data mining, opening novel routes of investigation for addressing the demands of the clinical setting.
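The data-reduction step described above can be illustrated with a short Python sketch: randomly retain one tenth of the spectra from a region of interest and verify with principal component analysis that the variance structure is preserved. Array sizes and the simulated intensities are placeholders, not the study's data.

```python
# Random subsampling of ROI spectra followed by a PCA consistency check
# (illustrative data only).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Simulated IMS data: 5000 spectra (pixels) x 800 m/z bins for one ROI.
roi_spectra = rng.gamma(shape=2.0, scale=1.0, size=(5000, 800))

# 10-fold reduction by random selection of spectra within the ROI.
n_keep = roi_spectra.shape[0] // 10
keep_idx = rng.choice(roi_spectra.shape[0], size=n_keep, replace=False)
subset = roi_spectra[keep_idx]

# Compare the principal-component structure of the full and reduced sets.
pca_full = PCA(n_components=5).fit(roi_spectra)
pca_sub = PCA(n_components=5).fit(subset)
print("explained variance (full):  ", np.round(pca_full.explained_variance_ratio_, 3))
print("explained variance (subset):", np.round(pca_sub.explained_variance_ratio_, 3))
```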
Abstract:
BACKGROUND Pollen is one of the main causes of allergic sensitization. It is not easy to make an etiological diagnosis of pollen-allergic patients because of the wide variety of sensitizing pollens, association with food allergy, and increasing incidence of polysensitization, which may result from the presence of allergens that are common to different species, as is the case of panallergens. OBJECTIVE To compare the results of skin prick tests (SPT) using whole pollen extract with specific immunoglobulin (Ig) E determination for several allergens (purified panallergens included) in the diagnosis of polysensitized pollen-allergic patients. METHODS The study sample comprised 179 pollen-sensitized patients who underwent SPT with pollen extract and allergen-specific IgE determination against different allergens. RESULTS The level of concordance between the traditional diagnostic test (SPT) and IgE determination was low, especially in patients sensitized to the panallergens profilin and polcalcin. In the case of SPT, the results demonstrated that patients who are sensitized to either of these panallergens present a significantly higher number of positive results than patients who are not. However, IgE determination revealed that while patients sensitized to polcalcins are sensitized to allergens from a higher number of pollens than the rest of the sample, this is not the case in patients sensitized to profilins. On the other hand, sensitization to profilin or lipid transfer proteins was clearly associated with food allergy. CONCLUSIONS Sensitization to panallergens could be a confounding factor in the diagnosis of polysensitized pollen-allergic patients as well as a marker for food allergy. However, more studies are required to further investigate the role of these molecules.
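For readers unfamiliar with concordance measures, Cohen's kappa is one common way to quantify agreement between two dichotomized tests such as SPT and specific IgE; the abstract does not state which statistic was used, and the short Python sketch below uses invented results purely for illustration.

```python
# Illustrative agreement calculation between two dichotomized tests
# (invented data, not the study's results).
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(7)
spt_positive = rng.integers(0, 2, size=179)                # 1 = positive SPT
# Simulate IgE results that only partially agree with the SPT results.
ige_positive = np.where(rng.random(179) < 0.7, spt_positive, 1 - spt_positive)

kappa = cohen_kappa_score(spt_positive, ige_positive)
print(f"Cohen's kappa = {kappa:.2f}")   # values near 0 indicate poor concordance
```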
Abstract:
With this final master's thesis we are going to contribute to the Asterisk open source project. Asterisk is an open source project that started with the main objective of developing an IP telephony platform completely based on software (and thus not hardware dependent), released under an open license, the GPL. The project was started in 1999 by the software engineer Mark Spencer at Digium. The main motivation behind the project was that the telecommunications sector lacks open solutions: most of the available solutions are based on proprietary standards, which are closed and incompatible with one another. Behind the Asterisk project there is a company, Digium, which has led the project since it originated in its laboratories. This company has some of its employees fully dedicated to contributing to Asterisk and also provides the whole infrastructure required by the open source project. However, Digium's business is not based on licensing products, due to the open source nature of Asterisk, but on offering services around Asterisk and on designing and selling hardware components to be used with it. The Asterisk project has grown considerably since its birth, offering in its latest versions advanced call-management functionalities and compatibility with hardware that was previously exclusive to proprietary solutions. Because of this, Asterisk is becoming a serious alternative to these proprietary solutions: it has reached a level of maturity that makes it very stable and, as it is open source, it can be fully customized to a given requirement, which could be impossible with proprietary solutions. Given the scale the project is reaching, every day more companies that develop value-added software for telephony platforms are seriously evaluating the option of making their software fully compatible with Asterisk platforms. All these factors make Asterisk a consolidated project that is nonetheless in constant evolution, striving to offer all the functionalities provided by proprietary solutions. This final master's thesis is divided into two complementary blocks. In the first block we analyze Asterisk as an open source project and as a telephony platform (PBX). As a result of this analysis we will produce a document, written in English because it is the Asterisk project's official language, which could be used by future contributors as a starting point for joining Asterisk. In the second block we will make a development contribution to the Asterisk project. There are several options for the form this contribution may take, such as solving bugs, developing new functionalities or starting an Asterisk satellite project. The type of contribution will depend on the needs of the project at that moment.
Abstract:
The aim of this study was to investigate the performance of a new and accurate method for the detection of isoniazid (INH) and rifampicin (RIF) resistance among Mycobacterium tuberculosis isolates using a crystal violet decolourisation assay (CVDA). Fifty-five M. tuberculosis isolates obtained from culture stocks stored at -80°C were tested. After bacterial inoculation, the samples were incubated at 37°C for seven days and 100 µL of CV (25 mg/L stock solution) was then added to the control and sample tubes. The tubes were incubated for an additional 24-48 h. CV (blue/purple) was decolourised in the presence of bacterial growth; thus, if CV lost its colour in a sample containing a drug, the tested isolate was reported as resistant. The sensitivity, specificity, positive predictive value, negative predictive value and agreement for INH were 92.5%, 96.4%, 96.1%, 93.1% and 94.5%, respectively, and 88.8%, 100%, 100%, 94.8% and 96.3%, respectively, for RIF. The results were obtained within eight to nine days. This study shows that CVDA is an effective method to detect M. tuberculosis resistance to INH and RIF in developing countries. This method is rapid, simple and inexpensive. Nonetheless, further studies are necessary before routine laboratory implementation.
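For reference, the reported performance measures follow directly from a 2x2 comparison against the reference susceptibility method; the Python sketch below shows the standard formulas, with placeholder counts chosen only so that the INH and RIF percentages land near the values reported above (the study's actual 2x2 tables are not given in the abstract).

```python
# Standard diagnostic-accuracy formulas from a 2x2 table (placeholder counts).

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV and overall agreement from a 2x2 table."""
    sensitivity = tp / (tp + fn)          # resistant isolates correctly called resistant
    specificity = tn / (tn + fp)          # susceptible isolates correctly called susceptible
    ppv = tp / (tp + fp)                  # probability a "resistant" result is truly resistant
    npv = tn / (tn + fn)                  # probability a "susceptible" result is truly susceptible
    agreement = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, ppv, npv, agreement

# Example with placeholder counts for 55 isolates (not the study's data).
for name, (sens, spec, ppv, npv, agree) in {
    "INH (placeholder counts)": diagnostic_metrics(tp=25, fp=1, fn=2, tn=27),
    "RIF (placeholder counts)": diagnostic_metrics(tp=16, fp=0, fn=2, tn=37),
}.items():
    print(f"{name}: sens={sens:.1%} spec={spec:.1%} ppv={ppv:.1%} npv={npv:.1%} agreement={agree:.1%}")
```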
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying fixed or dynamic sets of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, which are necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped into the so-called technical models and the latter into so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time; that is, if we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting process and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is the reason why we emphasize the calibration of the strategies' parameters to adapt them to the given market conditions. We find that the parameters of the technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as a part of this thesis. No other mathematical or statistical software was used.
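As a flavour of the mean-reversion framework mentioned above, the following Python sketch (the thesis itself used MATLAB) simulates an Ornstein-Uhlenbeck spread, recovers its parameters from the equivalent discrete AR(1) regression and applies a simple two-standard-deviation entry rule; all parameter values are illustrative, not fitted to the thesis data.

```python
# Ornstein-Uhlenbeck view of a pairs-trading spread:
# dX_t = theta*(mu - X_t) dt + sigma dW_t, recovered from an AR(1) regression.
import numpy as np

rng = np.random.default_rng(3)
theta, mu, sigma = 50.0, 0.0, 0.3
dt, n = 1.0 / (252 * 390), 50_000        # minutely steps, ~127 trading days

# Euler simulation of the OU spread.
x = np.empty(n)
x[0] = 0.1
for t in range(1, n):
    x[t] = x[t - 1] + theta * (mu - x[t - 1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Exact discretization: X_{t+1} = a + b*X_t + eps, with b = exp(-theta*dt).
b, a = np.polyfit(x[:-1], x[1:], 1)
resid = x[1:] - (a + b * x[:-1])
theta_hat = -np.log(b) / dt
mu_hat = a / (1 - b)
sigma_hat = resid.std() * np.sqrt(2 * theta_hat / (1 - b**2))
print(f"theta={theta_hat:.1f}  mu={mu_hat:.3f}  sigma={sigma_hat:.3f}")

# Simple market-neutral rule: enter when the spread deviates by 2 standard
# deviations of its stationary distribution, exit when it reverts to the mean.
stat_sd = sigma_hat / np.sqrt(2 * theta_hat)
signal = np.where(x > mu_hat + 2 * stat_sd, -1, np.where(x < mu_hat - 2 * stat_sd, 1, 0))
print("fraction of time in a position:", np.mean(signal != 0))
```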
Abstract:
The overwhelming predominance of sexual reproduction in nature is surprising given that sex is expected to confer profound costs in terms of production of males and the breakup of beneficial allele combinations. Recognition of these theoretical costs was the inspiration for a large body of empirical research, typically focused on comparing sexual and asexual organisms, lineages, or genomes, dedicated to identifying the advantages and maintenance of sex in natural populations. Despite these efforts, why sex is so common remains unclear. Here, we argue that we can generate general insights into the advantages of sex by taking advantage of parthenogenetic taxa that differ in such characteristics as meiotic versus mitotic offspring production, ploidy level, and single versus multiple and hybrid versus non-hybrid origin. We begin by evaluating benefits that sex can confer via its effects on genetic linkage, diversity, and heterozygosity and outline how the three classes of benefits make different predictions for which type of parthenogenetic lineage would be favored over others. Next, we describe the type of parthenogenetic model system (if any) suitable for testing whether the hypothesized benefit might contribute to the maintenance of sex in natural populations, and suggest groups of organisms that fit the specifications. We conclude by discussing how empirical estimates of characteristics such as time since derivation and number of independent origins of asexual lineages from sexual ancestors, ploidy levels, and patterns of molecular evolution from representatives of these groups can be used to better understand which mechanisms maintain sex in natural populations.