25 results for Bayes-Laplace binomial intervals


Relevance:

10.00%

Publisher:

Abstract:

Objective: Glucocorticoid therapy is used worldwide to treat various inflammatory and immune conditions, including inflammatory bowel disease (IBD). In IBD, 80% of patients obtain a positive response to the therapy; however, the development of glucocorticoid-related side-effects is common. Our aim was therefore to study the possibility of optimizing glucocorticoid therapy in children and adolescents with IBD by measuring circulating glucocorticoid bioactivity (GBA) and serum glucocorticoid-responsive biomarkers in patients receiving steroid treatment for active disease. Methods: A total of sixty-nine paediatric IBD patients from the Paediatric Outpatient Clinics of the University Hospitals of Helsinki and Tampere participated in the studies. Control patients included 101 non-IBD patients and 41 disease controls in remission. In patients with active disease, blood samples were drawn before the glucocorticoid therapy was started, at 2-4 weeks after the initiation of the steroid, and at 1-month intervals thereafter. The clinical response to glucocorticoid treatment and the development of steroid adverse events were carefully recorded. GBA was analyzed with a COS-1 cell bioassay. The measured glucocorticoid therapy-responsive biomarkers included the adipocyte-derived adiponectin and leptin, the bone turnover-related collagen markers amino-terminal type I procollagen propeptide (PINP) and carboxyterminal telopeptide of type I collagen (ICTP), insulin-like growth factor 1 (IGF-1) and sex hormone-binding globulin (SHBG), and the inflammatory marker high-sensitivity C-reactive protein (hs-CRP). Results: The most promising marker for glucocorticoid sensitivity was serum adiponectin, which was associated with steroid therapy-related adverse events. Serum leptin showed a similar trend. In contrast, circulating GBA rose in all subjects receiving glucocorticoid treatment but was not associated with the clinical response to steroids or with glucocorticoid therapy-related side-effects. Of note, young patients (<10 years) showed GBA levels similar to those of older patients, despite receiving higher weight-adjusted doses of glucocorticoid. Markers of bone formation were lower in children with active IBD than in the control patients, probably reflecting the suppressive effect of the active inflammation. The onset of glucocorticoid therapy further suppressed bone turnover. The inflammatory marker hs-CRP decreased promptly after the initiation of the steroid; however, the decrease was not associated with the clinical response to glucocorticoids. Conclusions: This is the first study to show that adipocyte-derived adiponectin is associated with steroid therapy-induced side-effects. Further studies are needed, but it is possible that adiponectin measurement could aid the recognition of glucocorticoid-sensitive patients in the future. GBA and the other markers reflecting glucocorticoid activity in different tissues changed during the treatment; however, their changes did not correlate with the therapeutic response to steroids or with the development of glucocorticoid-related side-effects and therefore cannot guide the therapy in these patients. Studies such as the present one, which combine clinical data with newly developed biomolecular technology, are needed to build, step by step, a general picture of glucocorticoid actions in different tissues.

Relevance:

10.00%

Publisher:

Abstract:

An efficient and statistically robust solution for the identification of asteroids among numerous sets of astrometry is presented. In particular, numerical methods have been developed for the short-term identification of asteroids at discovery, and for the long-term identification of scarcely observed asteroids over apparitions, a task which has been lacking a robust method until now. The methods are based on the solid foundation of statistical orbital inversion properly taking into account the observational uncertainties, which allows for the detection of practically all correct identifications. Through the use of dimensionality-reduction techniques and efficient data structures, the exact methods have a log-linear, that is, O(n log n), computational complexity, where n is the number of included observation sets. The methods developed are thus suitable for future large-scale surveys which anticipate a substantial increase in the astrometric data rate. Due to the discontinuous nature of asteroid astrometry, separate sets of astrometry must be linked to a common asteroid from the very first discovery detections onwards. The reason for the discontinuity in the observed positions is the rotation of the observer with the Earth as well as the motion of the asteroid and the observer about the Sun. Therefore, the aim of identification is to find a set of orbital elements that reproduce the observed positions with residuals similar to the inevitable observational uncertainty. Unless the astrometric observation sets are linked, the corresponding asteroid is eventually lost as the uncertainty of the predicted positions grows too large to allow successful follow-up. Whereas the presented identification theory and the numerical comparison algorithm are generally applicable, that is, also in fields other than astronomy (e.g., in the identification of space debris), the numerical methods developed for asteroid identification can immediately be applied to all objects on heliocentric orbits with negligible effects due to non-gravitational forces in the time frame of the analysis. The methods developed have been successfully applied to various identification problems. Simulations have shown that the methods developed are able to find virtually all correct linkages despite challenges such as numerous scarce observation sets, astrometric uncertainty, numerous objects confined to a limited region on the celestial sphere, long linking intervals, and substantial parallaxes. Tens of previously unknown main-belt asteroids have been identified with the short-term method in a preliminary study to locate asteroids among numerous unidentified sets of single-night astrometry of moving objects, and scarce astrometry obtained nearly simultaneously with Earth-based and space-based telescopes has been successfully linked despite a substantial parallax. Using the long-term method, thousands of realistic 3-linkages typically spanning several apparitions have so far been found among designated observation sets each spanning less than 48 hours.
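A minimal sketch of the log-linear linking idea described above, under assumptions not taken from the thesis: each observation set is reduced to a low-dimensional "address" (here a hypothetical ephemeris-based coordinate vector), and a k-d tree query replaces the all-pairs comparison, giving the O(n log n) behaviour. The function name and the matching radius are illustrative.

```python
# Not the thesis code: candidate linkages found with a k-d tree instead of
# comparing all pairs of observation sets.
import numpy as np
from scipy.spatial import cKDTree

def find_candidate_linkages(addresses, radius):
    """addresses: (n, d) array of reduced coordinates, one row per observation set.
    radius: matching tolerance in the reduced space.
    Returns pairs (i, j) of observation sets worth testing with full orbital inversion."""
    tree = cKDTree(addresses)          # O(n log n) build
    pairs = tree.query_pairs(r=radius) # near-neighbour pairs without an all-pairs scan
    return sorted(pairs)

# Toy example with synthetic addresses (e.g., predicted position and motion), arbitrary units.
rng = np.random.default_rng(0)
addresses = rng.normal(size=(1000, 4))
addresses[10] = addresses[3] + 1e-3    # plant one near-duplicate "linkage"
print(find_candidate_linkages(addresses, radius=0.01))
```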

Relevance:

10.00%

Publisher:

Abstract:

To a large extent, lakes can be described with a one-dimensional approach, as their main features can be characterized by the vertical temperature profile of the water. The development of the profiles during the year follows the seasonal climate variations. Depending on conditions, lakes become stratified during the warm summer. In autumn the water cools, overturn occurs, and an ice cover forms. Typically, the water is inversely stratified under the ice, and another overturn occurs in spring after the ice has melted. Features of this circulation have been used in studies to distinguish between lakes in different areas, as a basis for observation systems, and even as climate indicators. Numerical models can be used to calculate the temperature in the lake on the basis of the meteorological input at the surface. The simplest form is to solve for the surface temperature. The depth of the lake affects heat transfer, together with other morphological features such as the shape and size of the lake. The surrounding landscape also affects the formation of the meteorological fields over the lake and the energy input. For small lakes, shading by the shores affects conditions both over the lake and inside the water body, which imposes limitations on the one-dimensional approach. A two-layer model gives an approximation of the basic stratification in the lake. A turbulence model can simulate the vertical temperature profile in a more detailed way. If the shape of the temperature profile is very abrupt, vertical transfer is hindered, which has many important consequences for lake biology. The one-dimensional modelling approach was studied successfully by comparing a one-layer model, a two-layer model, and a turbulence model. The turbulence model was applied to lakes with different sizes, shapes, and locations. Lake models need data from the lakes for model adjustment. The use of meteorological input data on different scales was analysed, ranging from momentary turbulent changes over the lake to the use of synoptic data at three-hour intervals. Data from about the past 100 years were used on the mesoscale, at a range of about 100 km, and climate change scenarios were used for future changes. Increasing air temperature typically increases the water temperature in the epilimnion and decreases the ice cover. Lake ice data were used for modelling different kinds of lakes and were also analyzed statistically in a global context. The results were also compared with the results of a hydrological watershed model and with data from very small lakes on seasonal development.
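A minimal sketch of the two-layer idea mentioned above, not the models used in the thesis: an epilimnion heated or cooled by the surface heat flux and a hypolimnion coupled to it through a fixed exchange coefficient. All parameter values and the function name are illustrative assumptions.

```python
# Not the thesis model: a two-layer lake temperature step driven by surface heat flux.
RHO_W = 1000.0      # water density, kg m^-3
CP_W = 4186.0       # specific heat of water, J kg^-1 K^-1

def step_two_layer(t_epi, t_hypo, q_surface, h_epi=5.0, h_hypo=15.0,
                   k_exchange=1e-6, dt=3600.0):
    """Advance epilimnion and hypolimnion temperatures (deg C) by one time step.
    q_surface : net surface heat flux into the lake, W m^-2
    h_epi, h_hypo : layer thicknesses, m
    k_exchange : bulk inter-layer exchange velocity, m s^-1
    dt : time step, s
    """
    q_mix = RHO_W * CP_W * k_exchange * (t_epi - t_hypo)   # inter-layer flux, W m^-2
    t_epi += dt * (q_surface - q_mix) / (RHO_W * CP_W * h_epi)
    t_hypo += dt * q_mix / (RHO_W * CP_W * h_hypo)
    return t_epi, t_hypo

# One week of hourly steps with a constant 150 W m^-2 surface heating.
t_epi, t_hypo = 10.0, 6.0
for _ in range(7 * 24):
    t_epi, t_hypo = step_two_layer(t_epi, t_hypo, q_surface=150.0)
print(round(t_epi, 2), round(t_hypo, 2))
```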

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this research is to draw up a clear construction of an anticipatory communicative decision-making process and a successful implementation of a Bayesian application that can be used as an anticipatory communicative decision-making support system. This study is a decision-oriented and constructive research project, and it includes examples of simulated situations. As a basis for further methodological discussion about different approaches to management research, a decision-oriented approach is used in this research; it is based on mathematics and logic and is intended to develop problem-solving methods. The approach is theoretical and characteristic of normative management science research. The approach of this study is also constructive. An essential part of the constructive approach is to tie the problem to its solution with theoretical knowledge. Firstly, the basic definitions and behaviours of anticipatory management and managerial communication are provided. These descriptions include discussions of the research environment and the management processes formed. These issues define and explain the background to the further research. Secondly, the study proceeds to managerial communication and anticipatory decision-making based on preparation, problem solving, and solution search, which are also related to risk management analysis. After that, a solution for the decision-making support application is formed using four different Bayesian methods: the Bayesian network, the influence diagram, the qualitative probabilistic network, and the time-critical dynamic network. The purpose of the discussion is not to compare different theories but to explain the theories that are being implemented. Finally, an application of Bayesian networks to the research problem is presented. The usefulness of the prepared model in examining the problem is shown, and the results of the research are presented. The theoretical contribution includes definitions and a model of anticipatory decision-making. The main theoretical contribution of this study has been to develop a process for anticipatory decision-making that includes management with communication, problem solving, and the improvement of knowledge. The practical contribution includes a Bayesian Decision Support Model, which is based on Bayesian influence diagrams. The main contributions of this research are two developed processes: one for anticipatory decision-making, and the other for producing a model of a Bayesian network for anticipatory decision-making. In summary, this research contributes to decision-making support by being one of the few publicly available academic descriptions of an anticipatory decision support system, by presenting a Bayesian model that is grounded in firm theoretical discussion, by publishing algorithms suitable for decision-making support, and by defining the idea of anticipatory decision-making for a parallel version. Finally, based on the results of the research, an analysis of anticipatory management for planned decision-making is presented, which builds on observation of the environment, analysis of weak signals, and alternatives for creative problem solving and communication.
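A minimal sketch of the kind of Bayesian-network reasoning the abstract describes, not the thesis model: a two-node network in which a latent threat generates an observable weak signal, with exact inference by enumeration and a simple expected-utility comparison for the anticipatory decision. Variables, probabilities, and utilities are illustrative assumptions.

```python
# Not the thesis model: exact inference by enumeration over Threat -> WeakSignal,
# followed by an expected-utility comparison of acting early vs. waiting.
def posterior_threat_given_signal(p_threat=0.10,
                                  p_signal_given_threat=0.80,
                                  p_signal_given_no_threat=0.20):
    """P(Threat | WeakSignal = yes) for the two-node network."""
    joint_signal_threat = p_threat * p_signal_given_threat
    joint_signal_no_threat = (1.0 - p_threat) * p_signal_given_no_threat
    evidence = joint_signal_threat + joint_signal_no_threat   # P(WeakSignal = yes)
    return joint_signal_threat / evidence

p = posterior_threat_given_signal()
eu_act = p * (-10.0) + (1 - p) * (-2.0)   # act early: small fixed cost, mitigated loss
eu_wait = p * (-50.0) + (1 - p) * 0.0     # wait: full loss if the threat materialises
print(f"P(threat | signal) = {p:.2f}, EU(act) = {eu_act:.1f}, EU(wait) = {eu_wait:.1f}")
```

With these assumed numbers the weak signal raises the threat probability from 0.10 to about 0.31, which is enough to make acting early the higher expected-utility choice; an influence diagram formalises exactly this kind of evidence-to-decision update.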

Relevance:

10.00%

Publisher:

Abstract:

A detailed study is presented of the expected performance of the ATLAS detector. The reconstruction of tracks, leptons, photons, missing energy and jets is investigated, together with the performance of b-tagging and the trigger. The physics potential for a variety of interesting physics processes, within the Standard Model and beyond, is examined. The study comprises a series of notes based on simulations of the detector and physics processes, with particular emphasis given to the data expected from the first years of operation of the LHC at CERN.

Relevance:

10.00%

Publisher:

Abstract:

The greatest problem in the drug treatment of cancer is the toxic side-effects it causes. Typically only about 1% of a drug dose administered to the body reaches the cancer cells that need treatment; the rest of the drug is left to damage the body's healthy cells. Toxic side-effects prevent the dose from being raised to a sufficient concentration in the body, which often leads to premature progression of the disease and to the possible development of drug resistance. Liposome-mediated drug targeting can be divided into two approaches: passive and active targeting. The purpose of passive targeting of liposomes is to increase the localization of the cytotoxic drug specifically in the tumour tissue. Passive targeting is based on the transport of liposomes in the blood circulation, whereby the liposomes accumulate in the abnormally formed tumour tissue. Active targeting of liposomes aims to improve the therapeutic efficacy of passively targeted liposomes by directing the drug's effect specifically at cancer cells. In active targeting, a ligand that specifically recognizes the target cell is attached to the surface of the liposome. The purpose of the literature part of this Master's thesis was to review the properties that tumour-targeted liposomes need in order to achieve efficient cellular uptake and cytotoxicity. In the experimental part, the cellular uptake and cytotoxic effect of targeted liposomes were studied in human ovarian adenocarcinoma cells (SKOV-3). The liposomes were targeted with the cetuximab antibody (C225, Erbitux®), a specific and selective inhibitor of the HER1 protein (ErbB-1, EGFR, epidermal growth factor receptor) of the epidermal growth factor receptor family, which is overexpressed in certain cancer types (including lung and colorectal cancers, head and neck cancers, and breast, kidney, prostate, pancreatic and ovarian cancers). The CV-1 cell line, derived from African green monkey kidney, was used as a control representing the body's normal cells. The cellular uptake of the targeted liposomes was studied in uptake experiments, with non-targeted pegylated liposomes as controls. The specific binding of the cetuximab antibody to the EGF receptor was confirmed with competition assays. The cytotoxicity of doxorubicin-loaded immunoliposomes was assessed with the Alamar Blue™ viability assay. In addition, the stability of the immunoliposomes was monitored by measuring the mean liposome diameter at approximately two-week intervals. The cellular uptake of the cetuximab-targeted liposomes was markedly increased in SKOV-3 cancer cells, and the doxorubicin-loaded targeted liposomes produced a stronger cytotoxic effect than the non-targeted liposomes. The cytotoxicity of the targeted doxorubicin liposomes appeared with a delay, however, which suggests slow release of the drug from the liposome. Increased uptake and cytotoxicity were not observed in the CV-1 cell line. The potential applications of targeted liposomes in medicine and cancer treatment are considerable. At present, the clinical use of liposomes is limited to passively targeted liposomes (Doxil® (US), Caelyx® (Europe)). Despite promising cell experiments, the future therapeutic use of targeted liposomes appears challenging.

Relevance:

10.00%

Publisher:

Abstract:

ABSTRACT The diocese as the agent and advocate of diaconial work. The development of diaconial work in the Mikkeli diocese 1945–1991. The roots of Finnish diacony lie in the individual devotional life of Pietism. An active faith had to be evident in acts of love. Following the model of German institutional diacony, diaconial institutions were established in Finland, until congregational diacony emerged alongside these institutions in the 1890s. Pastor Otto Aarnisalo acted as a pathfinder in this. He aimed to unite diacony with the Church and the life of the congregation. Diacony had been based on the idea of voluntary work, which separated it from statutory social work. In 1944 the church law was amended, making diacony the concern of every member of the congregation. In the years immediately following the Second World War, discussion took place in the Church of Finland about the direction that diacony should take. In the ensuing debate, caritative service prevailed over social diacony. The diocese administration moved to Mikkeli in 1945, when the major part of the Vyborg diocese became part of the USSR under the armistice. In its diaconial work, the Mikkeli diocese pursued the same objectives as the diaconial policy of the whole church. The guiding principle of the diocese's diacony became a form of helping that emphasised assistance to the individual. Especially from the 1960s onwards, the country's industrialisation and the decline of agriculture had an effect on the Mikkeli diocese. The diocese administration, specifically Bishop Martti Simojoki and his successor Osmo Alaja, aimed to open up connections to the political left and to people working in industry. At least indirectly this helped the diaconial work in industrial localities. In the Mikkeli diocese, a diaconial committee was established in 1971, and its work was overseen by the diocesan chapter. This enabled the work of the diocese to be organised for the different areas of diacony. Previously, the diaconial work of the Finnish church had primarily been in nursing. The Health Insurance Law of 1972 brought a change to this when the responsibility for health services was transferred to the municipalities. Diacony began to move towards a psychological and spiritual emphasis. Beginning in the 1970s, the diocese started holding diaconial theme days at regular intervals. Although these did not result in great realignments, they did help clarify the direction that diacony would take. Large international collections were also carried out, especially in the 1980s. At the same time, socio-ethical activity revitalised and diversified Christian service. The idea that every member of the congregation should practise diacony was a strong factor in the Mikkeli diocese as well. The diocese's vision for diacony was holistic; Christian service was the responsibility of every member of the congregation. During the period of study (1945–1991), the theology of diacony was rather tenuous. Bishop Kalevi Toiviainen, however, brought forth the viewpoint of church doctrine and officially sanctioned theology. Diacony was part of the complete faith of the Church.

Relevance:

10.00%

Publisher:

Abstract:

The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to Value-at-Risk. The figure is calculated in two steps: first, the value of a portfolio of options is estimated for a number of different market scenarios; second, the information content of the estimated scenarios is summarized into a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problems of non-normality in the return distribution of the underlying asset. Here the hyperbolic distribution is used as one alternative for dealing with heavy tails. Results indicate that the information content of implied volatility is useful when predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns, enabling a more accurate definition of statistical intervals and extreme events.
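A minimal sketch of the two-step calculation described above, not the paper's implementation: the option position is revalued under simulated scenarios for the underlying, and the revaluations are summarized into a VaR-like loss quantile. Normal scenario returns and Black-Scholes revaluation are used here only for brevity; the paper argues for a heavy-tailed (hyperbolic) alternative. All parameter values are illustrative.

```python
# Not the paper's model: scenario revaluation of a call position and a loss quantile.
import numpy as np
from scipy.stats import norm

def bs_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call (vectorised over s)."""
    d1 = (np.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
    d2 = d1 - sigma * np.sqrt(t)
    return s * norm.cdf(d1) - k * np.exp(-r * t) * norm.cdf(d2)

def scenario_var(s0=100.0, k=100.0, t=30 / 365, r=0.02, sigma=0.25,
                 horizon=1 / 252, n_scenarios=100_000, level=0.99, seed=0):
    rng = np.random.default_rng(seed)
    value_now = bs_call(s0, k, t, r, sigma)
    # Step 1: portfolio value under each simulated one-day scenario.
    s_scen = s0 * np.exp(sigma * np.sqrt(horizon) * rng.standard_normal(n_scenarios))
    value_scen = bs_call(s_scen, k, t - horizon, r, sigma)
    # Step 2: single-sum risk figure = loss quantile of the revaluation distribution.
    losses = value_now - value_scen
    return np.quantile(losses, level)

print(f"1-day 99% scenario risk per call: {scenario_var():.2f}")
```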

Relevance:

10.00%

Publisher:

Abstract:

This study evaluates three different time units in option pricing: trading time, calendar time, and continuous time using discrete approximations (CTDA). The CTDA-time model partitions the trading day into 30-minute intervals, where each interval is given a weight corresponding to the historical volatility in that interval. Furthermore, the non-trading volatility, both overnight and weekend volatility, is included in the first interval of the trading day in the CTDA model. The three models are tested on market prices. The results indicate that the trading-time model gives the best fit to market prices, in line with the results of previous studies but contrary to expectations under non-arbitrage option pricing. Under non-arbitrage pricing, the option premium should reflect the cost of hedging the expected volatility during the option's remaining life. The study concludes that the historical patterns in volatility are not fully accounted for by the market; rather, the market prices options closer to trading time.
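A minimal sketch of how a CTDA-style time measure could be constructed, under assumptions not taken from the study: each of thirteen 30-minute intervals receives a weight proportional to its historical variance, non-trading variance is folded into the first interval, and the remaining effective time is the sum of the weights still to elapse. The function names and numbers are illustrative.

```python
# Not the study's code: variance-weighted interval weights and an effective time to expiry.
import numpy as np

def ctda_day_weights(intraday_var, non_trading_var):
    """intraday_var: historical variance per 30-minute interval (length 13).
    non_trading_var: overnight (or weekend) variance, added to the first interval.
    Returns weights that sum to one trading day."""
    var = np.asarray(intraday_var, dtype=float).copy()
    var[0] += non_trading_var
    return var / var.sum()

def ctda_time_to_expiry(weights, intervals_left_today, whole_days_left):
    """Effective time in trading days: the remaining part of today plus whole days."""
    return weights[-intervals_left_today:].sum() + whole_days_left

# Illustrative U-shaped intraday variance and an overnight variance of half a day.
intraday_var = np.array([4, 3, 2, 1.5, 1.2, 1, 1, 1, 1.2, 1.5, 2, 3, 4])
w = ctda_day_weights(intraday_var, non_trading_var=0.5 * intraday_var.sum())
print(ctda_time_to_expiry(w, intervals_left_today=6, whole_days_left=9))
```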

Relevance:

10.00%

Publisher:

Abstract:

This study examines the intraday and weekend volatility on the German DAX. The intraday volatility is partitioned into smaller intervals and compared to a whole day's volatility. The estimated intraday variance is U-shaped, and the weekend variance is estimated at 19% of that of a normal trading day. The patterns in the intraday and weekend volatility are used to develop an extension to the Black and Scholes formula that forms a new time basis. Calendar or trading days are commonly used for measuring time in option pricing. The Continuous Time using Discrete Approximations (CTDA) model developed in this study uses a measure of time with smaller intervals, approaching continuous time. The model presented accounts for the lapse of time during trading only. Arbitrage pricing suggests that the option price equals the expected cost of hedging volatility during the option's remaining life. In this model, time is allowed to lapse as volatility occurs on an intraday basis. The measure of time is modified in CTDA to correct for the non-constant volatility and to account for the patterns in volatility.
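A minimal sketch of how the modified time basis could enter the pricing formula, not the CTDA model itself: the standard Black-Scholes call evaluated with calendar time replaced by an effective time measured in trading days, so that a weekend contributes 0.19 days of effective time rather than two calendar days. The function name and parameter values are illustrative assumptions.

```python
# Not the study's model: Black-Scholes with time measured in effective trading days.
import math
from statistics import NormalDist

def bs_call_effective_time(s, k, effective_days, r, sigma_per_trading_day,
                           trading_days_per_year=252):
    """Black-Scholes call where time is measured in (possibly fractional) trading days."""
    t = effective_days / trading_days_per_year
    sigma = sigma_per_trading_day * math.sqrt(trading_days_per_year)
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    n = NormalDist().cdf
    return s * n(d1) - k * math.exp(-r * t) * n(d2)

# Ten trading days plus one weekend counted at 19% of a trading day.
print(bs_call_effective_time(s=100, k=100, effective_days=10 + 0.19,
                             r=0.02, sigma_per_trading_day=0.012))
```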