934 results for on-time-delivery
Abstract:
The formation of a market price for an asset can be understood as the superposition of the individual actions of market participants, which cumulatively generate supply and demand. This is comparable to the emergence of macroscopic properties in statistical physics, which are brought about by microscopic interactions between the system components involved. The distribution of price changes in financial markets differs markedly from a Gaussian distribution. This leads to empirical peculiarities of the price process, which include, besides scaling behavior, non-trivial correlation functions and temporally clustered volatility. The present work focuses on the analysis of financial market time series and the correlations they contain. A new method for quantifying pattern-based complex correlations of a time series is developed. With this methodology, significant evidence is found that typical behavioral patterns of financial market participants manifest themselves on short time scales; that is, the reaction to a given price history is not purely random, but rather similar price histories provoke similar reactions. Starting from the study of complex correlations in financial market time series, the question is addressed of which properties change at the transition from a positive trend to a negative trend. An empirical quantification by means of rescaling yields the result that, independent of the time scale considered, new price extrema are accompanied by an increase in transaction volume and a reduction of the time intervals between transactions. These dependencies exhibit characteristics that are also found in other complex systems in nature, and in physical systems in particular.
Over nine orders of magnitude in time, these properties are also independent of the analyzed market: trends that persist only for seconds show the same characteristics as trends on time scales of months. This opens up the possibility of learning more about financial market bubbles and their collapses, since trends on small time scales occur much more frequently. In addition, a Monte Carlo based simulation of the financial market is analyzed and extended in order to reproduce the empirical properties and to gain insight into their causes, which are to be sought partly in the financial market microstructure and partly in the risk aversion of the trading participants. For the computationally intensive procedures, a substantial reduction in computing time is achieved through parallelization on a graphics card architecture. To demonstrate the wide range of applications of graphics cards, a standard model of statistical physics, the Ising model, is also ported to the graphics card with significant runtime advantages. Partial results of this work have been published in [PGPS07, PPS08, Pre11, PVPS09b, PVPS09a, PS09, PS10a, SBF+10, BVP10, Pre10, PS10b, PSS10, SBF+11, PB10].
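The Ising model mentioned above lends itself to a compact illustration. The following Python sketch implements the standard serial single-spin-flip Metropolis update for a 2D lattice. All names and parameters are illustrative; the GPU port described in the work parallelizes updates of this kind (e.g. via a checkerboard decomposition), which is not shown here.

```python
import math
import random

def metropolis_sweep(spins, L, beta, rng):
    """One Metropolis sweep over an L x L Ising lattice (J = 1, periodic boundaries)."""
    for _ in range(L * L):
        i = rng.randrange(L)
        j = rng.randrange(L)
        # Sum of the four nearest-neighbour spins with periodic boundaries.
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * spins[i][j] * nn  # energy cost of flipping spin (i, j)
        # Accept the flip if it lowers the energy, else with Boltzmann probability.
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1
    return spins

def magnetization(spins):
    """Mean magnetization per spin."""
    return sum(sum(row) for row in spins) / (len(spins) * len(spins[0]))
```

At low temperature (large beta, i.e. below the critical point beta_c of about 0.44) an initially ordered lattice remains magnetized, which is the qualitative behavior a correct implementation must reproduce.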
Abstract:
Hypernuclear physics is currently attracting renewed interest due to the important role of hypernuclear spectroscopy (hyperon-hyperon and hyperon-nucleon interactions) as a unique tool to describe the baryon-baryon interactions in a unified way and to understand the origin of their short-range repulsion.

Hypernuclear research will be one of the main topics addressed by the PANDA experiment at the planned Facility for Antiproton and Ion Research (FAIR). Thanks to the use of stored $\overline{p}$ beams, copious production of double $\Lambda$ hypernuclei is expected at the PANDA experiment, which will enable high-precision $\gamma$ spectroscopy of such nuclei for the first time. At PANDA, excited states of $\Xi^-$ hypernuclei will be used as a basis for the formation of double $\Lambda$ hypernuclei. For their detection, a dedicated hypernuclear detector setup is planned. This setup consists of a primary nuclear target for the production of $\Xi^{-}\overline{\Xi}$ pairs, a secondary active target for hypernucleus formation and the identification of associated decay products, and a germanium array detector to perform $\gamma$ spectroscopy.

In the present work, the feasibility of performing high-precision $\gamma$ spectroscopy of double $\Lambda$ hypernuclei at the PANDA experiment has been studied by means of a Monte Carlo simulation. For this purpose, the design and simulation of the dedicated detector setup, as well as of the mechanism to produce double $\Lambda$ hypernuclei, have been optimized together with the performance of the whole system.
In addition, the production yields of double hypernuclei in excited particle-stable states have been evaluated within a statistical decay model.

A strategy for the unique assignment of various newly observed $\gamma$-transitions to specific double hypernuclei has been successfully implemented by combining the predicted energy spectra of each target with the measurement of the two pion momenta from the subsequent weak decays of a double hypernucleus.

For the background handling, a method based on time measurement has also been implemented. However, the percentage of tagged events related to the production of $\Xi^{-}\overline{\Xi}$ pairs varies between 20% and 30% of the total number of produced events of this type. As a consequence, further considerations have to be made to increase the tagging efficiency by a factor of 2.

The contribution of the background reactions to the radiation damage of the germanium detectors has also been studied within the simulation. Additionally, a test to check the degradation of the energy resolution of the germanium detectors in the presence of a magnetic field has been performed. No significant degradation of the energy resolution or of the electronics was observed. A correlation between rise time and pulse shape has been used to correct the measured energy.

Based on the present results, one can say that $\gamma$ spectroscopy of double $\Lambda$ hypernuclei at the PANDA experiment appears feasible. A further improvement of the statistics is needed for the background rejection studies.
Moreover, a more realistic layout of the hypernuclear detectors has been suggested, using the results of these studies to achieve a better balance between the physical and the technical requirements.
Abstract:
The generation of electron beams with high intensity (I ≥ 2 mA) and high spin polarization (P ≥ 85%) is indispensable for the experiments at the planned "linac-ring" electron-ion colliders (e.g. eRHIC at Brookhaven National Laboratory), but at the same time poses an enormous challenge. Photoemission from GaAs-based semiconductors, such as the GaAlAs/InGaAlAs quantum superlattices investigated in this work, is distinguished by high brilliance; however, the low quantum efficiency of only about 1% in the region of maximum polarization requires high laser intensities of several watts per cm², which causes considerable thermal problems.

In this work it was first shown that the lifetime of a photocathode decreases exponentially with increasing laser power and hence temperature. By inserting a DBR mirror between the active zone of the photocathode and its substrate, a large part of the unused laser light is reflected back out of the crystal and thus does not contribute to heating. At the same time, the mirror, together with the interface to the vacuum, forms a resonator structure enclosing the active zone. For certain wavelengths this leads to constructive interference, and the absorption in the active zone increases. Both effects were demonstrated by comparative measurements on cathodes with and without a DBR mirror. The results agree well with the prediction of a model based on the dielectric function of the individual semiconductor structures. Of particular practical importance is that, for a given photoemission current, the DBR cathode exhibits a temperature rise smaller by a factor of ≥ 3.5.
This holds over the entire wavelength range in which the cathode can produce a high beam polarization (P > 80%), including the region of the resonance. From time-resolved measurements of the charge distribution and polarization, conclusions can be drawn both about the transport mechanisms inside a cathode and about the condition of its surface. Within the scope of this dissertation, the measurement speed of the apparatus was decisively increased by installing a faster detector and by automating the measurement procedure, and the resulting time resolution, now 1.2 picoseconds, was nearly doubled.

The results obtained with these improvements show that electron transport in superlattice structures differs strongly from transport in the bulk crystals investigated so far. The character of the motion does not follow the diffusion model but points to localized states lying close to the bottom of the conduction band that can capture electrons for short times. As a result, the impulse response of a cathode exhibits, in addition to a fast decay of the signal, a larger time constant that still produces a signal on the order of about 5‰ of the maximum intensity even after 30 ps.
Abstract:
The upper troposphere / lower stratosphere (UTLS) is the transition region between the dynamically, chemically, and microphysically very different lowest layers of the atmosphere, the troposphere and the stratosphere. Radiatively active trace gases, such as water vapor (H2O), ozone (O3), or carbon dioxide (CO2), and clouds in the UTLS influence the radiation budget of the atmosphere and the global climate. Possible changes in the distributions and concentrations of these trace gases modify the radiative forcing of the atmosphere and can contribute to the observed climate change. The aim of this work is to better understand exchange and mixing processes within the UTLS and thus to predict changes in the trace gas composition of this region more accurately. The basis for this is provided by aircraft-borne in-situ trace gas measurements in the UTLS, carried out during the aircraft campaigns TACTS/ESMVal 2012 and AIRTOSS-ICE 2013. During the AIRTOSS-ICE 2013 measurements, the UMAQS (University of Mainz Airborne QCL-based Spectrometer) instrument, built within the scope of this work, was used to measure the tropospheric trace gases nitrous oxide (N2O) and carbon monoxide (CO). At a temporal resolution of 1 s it achieves a measurement uncertainty of 0.39 ppbv and 1.39 ppbv for the N2O and CO mixing ratios, respectively. The high time resolution and measurement accuracy of the N2O and CO data allow the investigation of small-scale exchange processes between troposphere and stratosphere in the tropopause region on spatial scales below 200 m. Based on the N2O data from AIRTOSS-ICE 2013, cirrus particles detected in situ in ice-supersaturated air above the N2O-based chemical tropopause can be identified.
With the help of the N2O-CO correlation, the analysis of ECMWF model data, and the calculation of backward trajectories, their existence can be attributed to the irreversible mixing of tropospheric and stratospheric air masses. With the in-situ measurements of N2O, CO, and CH4 (methane) from TACTS and ESMVal 2012, the large-scale trace gas distributions in the extratropical stratosphere are investigated up to a potential temperature of Theta = 410 K. A rejuvenation of the air masses in the extratropical stratosphere with Delta Theta > 30 K (relative to the dynamical tropopause) over the period of the campaign (28.08.2012 - 27.09.2012) can be demonstrated. The correlation of N2O with O3 shows that this rejuvenation is caused by an enhanced influx of air masses from the tropical lower stratosphere. These are transported into the extratropical stratosphere via the shallow branch of the Brewer-Dobson circulation on time scales of a few weeks. Based on the analysis of the CO-O3 correlation of a measurement flight on 30.08.2012, the irreversible mixing of air masses from the tropical stratosphere into the extratropics on isentropes with Theta > 380 K is identified. Backward trajectories show that the origin of the mixed-in tropical air masses lies in the region of the summertime anticyclone of the Asian monsoon.
Abstract:
Phosphatidylethanol (PEth) is an abnormal phospholipid carrying two fatty acid chains. It is only formed in the presence of ethanol via the action of phospholipase D (PLD). Its use as a biomarker for alcohol consumption is currently under investigation. Previous methods for the analysis of PEth included high-performance liquid chromatography (HPLC) coupled to an evaporative light scattering detector (ELSD), which is unspecific for the different homologues; improved methods are now based on time-of-flight mass spectrometry (TOF-MS) and tandem mass spectrometry (MS/MS). The intention of this work was to identify as many homologues of PEth as possible. A screening procedure using multiple-reaction monitoring (MRM) for the identified homologues has subsequently been established. For our investigations, autopsy blood samples collected from heavy drinkers were used. Phosphatidylpropanol 16:0/18:1 (internal standard) was added to the blood samples prior to liquid-liquid extraction using borate buffer (pH 9), 2-propanol, and n-hexane. After evaporation, the samples were redissolved in the mobile phase and injected into the LC-MS/MS system. Compounds were separated on a Luna Phenyl Hexyl column (50 mm x 2 mm, 3 µm) by gradient elution, using 2 mM ammonium acetate and methanol/acetone (95/5, v/v). A total of 48 homologues of PEth could be identified by using precursor ion and enhanced product ion (EPI) scans.
Abstract:
We study a homogeneously driven granular fluid of hard spheres at intermediate volume fractions and focus on time-delayed correlation functions in the stationary state. Inelastic collisions are modeled by incomplete normal restitution, allowing for efficient simulations with an event-driven algorithm. The incoherent scattering function F_incoh(q,t) is seen to follow time-density superposition with a relaxation time that increases significantly as the volume fraction increases. The statistics of particle displacements is approximately Gaussian. For the coherent scattering function S(q,ω), we compare our results to the predictions of generalized fluctuating hydrodynamics, which takes into account that temperature fluctuations decay either diffusively or with a finite relaxation rate, depending on wave number and inelasticity. For sufficiently small wave number q we observe sound waves in the coherent scattering function S(q,ω) and the longitudinal current correlation function C_l(q,ω). We determine the speed of sound and the transport coefficients and compare them to the results of kinetic theory.
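To make the central observable concrete: the incoherent (self) intermediate scattering function is the average of exp(iq·Δr) over single-particle displacements, which reduces to a cosine average for one Cartesian component of an isotropic system. The following minimal Python sketch uses illustrative names and is not the authors' event-driven code.

```python
import math

def f_incoh(positions_t0, positions_t, q):
    """Estimate the self (incoherent) intermediate scattering function at
    wave number q as the average of cos(q * dx) over particle displacements
    between two times (one Cartesian component, isotropic system assumed)."""
    n = len(positions_t0)
    acc = 0.0
    for x0, x in zip(positions_t0, positions_t):
        acc += math.cos(q * (x - x0))
    return acc / n
```

By construction the estimator is 1 at zero time lag and decays toward zero once typical displacements exceed the probed length scale 2π/q.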
Abstract:
We report molybdenum isotope compositions and concentrations in water samples from a variety of river catchment profiles in order to investigate the influence of anthropogenic contamination, catchment geology, within-river precipitation, and seasonal river flow variations on riverine molybdenum. Our results show that the observed variations in δ98/95Mo from 0‰ to 1.9‰ are primarily controlled by catchment lithology, particularly by weathering of sulfates and sulfides. Erosion in catchments dominated by wet-based glaciers leads to very high dissolved molybdenum concentrations. In contrast, anthropogenic inputs affect neither the concentration nor the isotopic composition of dissolved molybdenum in the rivers studied here. Seasonal variations are also quite muted. The finding that catchment geology exerts the primary control on the delivery of molybdenum to seawater indicates that the flux and isotope composition of molybdenum to seawater has likely varied in the geologic past.
Abstract:
Background Minor protease inhibitor (PI) mutations often exist as polymorphisms in HIV-1 sequences from treatment-naïve patients. Previous studies showed that their presence impairs the antiretroviral treatment (ART) response. Evaluating these findings in a larger cohort is essential. Methods To study the impact of minor PI mutations on time to viral suppression and time to virological failure, we included patients from the Swiss HIV Cohort Study infected with HIV-1 subtype B who started first-line ART with a PI and two nucleoside reverse transcriptase inhibitors. Cox regression models were performed to compare the outcomes among patients with 0 and ≥1 minor PI mutation. Models were adjusted for baseline HIV-1 RNA, CD4 cell count, sex, transmission category, age, ethnicity, year of ART start, and the presence of nucleoside reverse transcriptase inhibitor mutations, and were stratified for the administered PIs. Results We included 1199 patients, of whom 944 (78.7%) received a boosted PI. Minor PI mutations associated with the administered PI were common: 41.7%, 16.1%, 4.7% and 1.9% had 1, 2, 3 or ≥4 mutations, respectively. The time to viral suppression was similar between patients with 0 (reference) and ≥1 minor PI mutation (multivariable hazard ratio (HR): 1.1 [95% confidence interval (CI): 1.0-1.3], P = .196). The time to virological failure was also similar (multivariable HR: 0.9 [95% CI: 0.5-1.6], P = .765). In addition, the impact of each single minor PI mutation was analyzed separately: none was significantly associated with the treatment outcome. Conclusions The presence of minor PI mutations at baseline has no effect on the therapy outcome in HIV-infected individuals.
Abstract:
Background This study addressed the temporal properties of personality disorders and their treatment by schema-centered group psychotherapy. It investigated the change mechanisms of psychotherapy using a novel method by which psychotherapy can be modeled explicitly in the temporal domain. Methodology and Findings 69 patients were assigned to a specific schema-centered behavioral group psychotherapy, 26 to social skills training as a control condition. The largest diagnostic subgroups were narcissistic and borderline personality disorder. Both treatments offered 30 group sessions of 100 min duration each, at a frequency of two sessions per week. Therapy process was described by components resulting from principal component analysis of patients' session reports, which were obtained after each session. These patient-assessed components were Clarification, Bond, Rejection, and Emotional Activation. The statistical approach focused on time-lagged associations of components using time-series panel analysis. This method provided a detailed quantitative representation of the therapy process. It was found that Clarification played a core role in schema-centered psychotherapy, reducing Rejection and regulating patients' emotions. This was also a change mechanism linked to therapy outcome. Conclusions/Significance The introduced process-oriented methodology allowed us to highlight the mechanisms by which psychotherapeutic treatment became effective. Additionally, process models depicted the actual patterns that differentiated specific diagnostic subgroups. Time-series analysis explores Granger causality, a non-experimental approximation of causality based on temporal sequences. This methodology, resting upon naturalistic data, can explicate mechanisms of action in psychotherapy research and illustrate the temporal patterns underlying personality disorders.
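The Granger idea used in the time-series analysis, namely that a series x "Granger-causes" y when the past of x improves the prediction of y beyond what y's own past achieves, can be sketched with a lag-1 least-squares comparison. This toy Python version (illustrative names; the study used full time-series panel analysis) returns the reduction in residual variance:

```python
import numpy as np

def granger_gain(x, y, lag=1):
    """Reduction in residual variance when predicting y(t) from its own past
    plus the past of x, versus y's past alone (lag-1 least squares).
    A clearly positive gain is the naive signature of 'x Granger-causes y'."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    yt = y[lag:]
    y_past = y[:-lag]
    x_past = x[:-lag]
    # Restricted model: y(t) ~ a + b * y(t-1)
    A_r = np.column_stack([np.ones_like(y_past), y_past])
    res_r = yt - A_r @ np.linalg.lstsq(A_r, yt, rcond=None)[0]
    # Full model: y(t) ~ a + b * y(t-1) + c * x(t-1)
    A_f = np.column_stack([np.ones_like(y_past), y_past, x_past])
    res_f = yt - A_f @ np.linalg.lstsq(A_f, yt, rcond=None)[0]
    return float(np.var(res_r) - np.var(res_f))
```

In practice one would test the gain for significance (e.g. with an F-test) rather than eyeball it; the sketch only conveys the nested-model comparison at the heart of the method.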
Abstract:
Since September 2000, when world leaders agreed on time-bound, measurable goals to reduce extreme poverty, hunger, illiteracy, and disease while fostering gender equality and ensuring environmental sustainability, the Millennium Development Goals (MDGs) have increasingly come to dominate the policy objectives of many states and development agencies. The concern has been raised that the tight timeframe and financial restrictions might force governments to invest in the more productive sectors, thus compromising the quality and sustainability of development efforts. In the long term, this may lead to even greater inequality, especially between geographical regions and social strata. Hence people living in marginal areas, for example in remote mountain regions, and minority peoples risk being disadvantaged by this internationally agreed agenda. Strategies to overcome hunger and poverty in their different dimensions in mountain areas need to focus on strengthening the economy of small-scale farmers, while also fostering the sustainable use of natural resources, taking into consideration their multifunctionality.
Abstract:
Measuring antibiotic-induced killing relies on time-consuming biological tests. The firefly luciferase gene (luc) was successfully used as a reporter gene to assess antibiotic efficacy rapidly in slow-growing Mycobacterium tuberculosis. We tested whether luc expression could also provide a rapid evaluation of bactericidal drugs in Streptococcus gordonii. The suicide vectors pFW5luc and a modified version of pJDC9 carrying a promoterless luc gene were used to construct transcriptional-fusion mutants. One mutant susceptible to penicillin-induced killing (LMI2) and three penicillin-tolerant derivatives (LMI103, LMI104, and LMI105) producing luciferase under independent streptococcal promoters were tested. The correlation between antibiotic-induced killing and luminescence was determined with mechanistically unrelated drugs. Chloramphenicol (20 times the MIC) inhibited bacterial growth. In parallel, luciferase stopped increasing and remained stable, as determined by luminescence and Western blots. Ciprofloxacin (200 times the MIC) rapidly killed 1.5 log10 CFU/ml in 2-4 hr. Luminescence decreased simultaneously by 10-fold. In contrast, penicillin (200 times the MIC) gave discordant results. Although killing was slow (≤ 0.5 log10 CFU/ml in 2 hr), luminescence dropped abruptly by 50- to 100-fold over the same period. Inactivating penicillin with penicillinase restored luminescence, irrespective of viable counts. This was not due to altered luciferase expression or stability, suggesting some kind of post-translational modification. Luciferase shares homology with aminoacyl-tRNA synthetases and acyl-CoA ligases, which might be regulated by macromolecule synthesis and hence affected in penicillin-inhibited cells. Because of this resemblance, luciferase might be down-regulated simultaneously. Luminescence cannot be universally used to predict antibiotic-induced killing.
Thus, introducing reporter enzymes sharing mechanistic similarities with normal metabolic reactions might reveal other effects than those expected.
Abstract:
Nasal septal hematoma with abscess (NSHA) is an uncommon complication of trauma, and studies on children are especially rare. We discuss the case of a 6-year-old girl who was initially evaluated independently by three doctors for minor nasal trauma but had to be re-hospitalized 6 days later with NSHA. Although septal hematoma had initially been excluded (5, 7, and 24 hours after trauma), a secondary accumulation of blood seems to have occurred. Delayed hematoma formation has been described in the orbit as a result of possible venous injuries after endoscopic sinus surgery. However, such an observation is new for septal hematoma in children. Thus, we recommend re-evaluation for septal hematoma 48 to 72 hours after paediatric nasal trauma. Such a scheduled re-examination offers a chance to treat a delayed subperichondrial hematoma in time, before almost inevitable superinfection leads to abscess formation and destruction of the nasal infrastructure. We suggest that parents should be vigilant for delayed nasal obstruction as a possible herald of hematoma accumulation within the first week.
Abstract:
Tracking, or target localization, is used in a wide range of important tasks, from knowing when your flight will arrive to ensuring your mail is received on time. Tracking provides the location of resources, enabling solutions to complex logistical problems. Wireless Sensor Networks (WSNs) create new opportunities when applied to tracking, such as more flexible deployment and real-time information. When radar is used as the sensing element in a tracking WSN, better results can be obtained, because radar has a comparatively larger range, in both distance and angle, than other sensors commonly used in WSNs. This allows fewer nodes to be deployed to cover larger areas, reducing cost. In this report I implement a tracking WSN platform similar to the one developed by Lim, Wang, and Terzis. It consists of several sensor nodes, each with a radar, a sink node connected to a host PC, and a MATLAB program to fuse sensor data. I have re-implemented their experiment with my WSN platform for tracking a non-cooperative target to verify their results, and have also run simulations for comparison. The results of these tests are discussed and some future improvements are proposed.
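A common building block of the kind of sensor-data fusion performed on the host PC is inverse-variance weighting of independent noisy estimates. The following scalar Python sketch is purely illustrative (all names are invented) and is not the report's MATLAB implementation:

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of scalar position estimates.

    estimates: list of (value, variance) pairs from independent sensors.
    Returns the fused estimate and its (reduced) variance."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var
```

Under the independence assumption the fused variance is always smaller than that of the best single sensor, which is the statistical motivation for combining multiple radar nodes.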
Abstract:
To evaluate strategies used to select cases and controls and how reported odds ratios are interpreted, the authors examined 150 case-control studies published in leading general medicine, epidemiology, and clinical specialist journals from 2001 to 2007. Most of the studies (125/150; 83%) were based on incident cases; among these, the source population was mostly dynamic (102/125; 82%). A minority (23/125; 18%) sampled from a fixed cohort. Among studies with incident cases, 105 (84%) could interpret the odds ratio as a rate ratio. Fifty-seven (46% of 125) required the source population to be stable for such interpretation, while the remaining 48 (38% of 125) did not need any assumptions because of matching on time or concurrent sampling. Another 17 (14% of 125) studies with incident cases could interpret the odds ratio as a risk ratio, with 16 of them requiring the rare disease assumption for this interpretation. The rare disease assumption was discussed in 4 studies but was not relevant to any of them. No investigators mentioned the need for a stable population. The authors conclude that in current case-control research, a stable exposure distribution is much more frequently needed to interpret odds ratios than the rare disease assumption. At present, investigators conducting case-control studies rarely discuss what their odds ratios estimate.
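The relationship the review examines can be made concrete with hypothetical numbers: in a case-control sample drawn from a cohort in which the disease is rare, the odds ratio closely approximates the risk ratio. The Python sketch below uses invented counts, not data from the review:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    return (a * d) / (b * c)

def risk_ratio(cases_exposed, cases_unexposed, n_exposed, n_unexposed):
    """Risk ratio from cohort counts: case proportions in each exposure group."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)
```

For example, with 20 cases among 10,000 exposed and 10 cases among 10,000 unexposed (so controls number 9,980 and 9,990), the risk ratio is 2.0 and the odds ratio is barely above 2, illustrating the rare disease assumption at work; when the disease is common, the two diverge.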