958 results for Open Data-bank
Abstract:
Osteoarticular allograft is one possible treatment for wide surgical resections with large defects. Selecting the best osteoarticular allograft is of great relevance for optimal exploitation of the bone databank, a good surgical outcome, and the patient's recovery. Current approaches, however, are very time consuming, which hinders these goals in practice. We present a validation study of software that performs automatic bone measurements, used to automatically assess distal femur sizes across a databank. 170 distal femur surfaces were reconstructed from CT data and measured manually using a sizing protocol that records the transepicondylar distance (A), the anterior-posterior distance at the medial condyle (B), and the anterior-posterior distance at the lateral condyle (C). Intra- and inter-observer studies were conducted and regarded as ground-truth measurements, and manual and automatic measures were compared. The correlation coefficients between observer one and the automatic method were 0.99 for measure A and 0.96 for measures B and C. The average time needed to perform the measurements was 16 h for each set of manual measurements and 3 min for the automatic method. The results demonstrate the high reliability and, most importantly, the high repeatability of the proposed approach, as well as a considerable speed-up in planning.
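As a rough illustration of the agreement analysis this abstract describes, the sketch below computes a Pearson correlation and mean absolute difference between manual and automatic measurements. The measurement values and the helper function are hypothetical stand-ins, not the authors' actual pipeline; the abstract reports correlations of 0.99 (A) and 0.96 (B, C).

```python
# Sketch of a manual-vs-automatic agreement analysis. The measurement arrays
# below are placeholders, not the study's data.
import numpy as np

def agreement(manual, automatic):
    """Pearson correlation and mean absolute difference between two raters."""
    manual, automatic = np.asarray(manual, float), np.asarray(automatic, float)
    r = np.corrcoef(manual, automatic)[0, 1]
    mad = np.mean(np.abs(manual - automatic))
    return r, mad

# Hypothetical transepicondylar distances (measure A, in mm) for a few femurs.
manual_A    = [82.1, 79.4, 88.0, 75.2, 91.3]
automatic_A = [82.3, 79.1, 87.8, 75.6, 91.0]

r, mad = agreement(manual_A, automatic_A)
print(f"correlation r = {r:.3f}, mean abs. difference = {mad:.2f} mm")
```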
Abstract:
In the middle of the twentieth century, banks changed from ‘closed’ designs signifying wealth, security, and safety to ‘open’ designs signifying hospitality, honesty, and transparency as the perception of money changed from a passive physical substance to be slowly accumulated to an active notational substance to be kept in motion. If money is saved, customers must trust that the bank is secure and their money will be there when they want it; if money is invested, customers must trust that it is being done openly and honestly and they are being well-advised. Architecture visually communicates that the institution can be trusted in the requisite way.
Abstract:
Development of novel implants in orthopaedic trauma surgery is based on limited datasets of cadaver trials or artificial bone models. A method has been developed whereby implants can be constructed in an evidence-based manner, founded on a large anatomical database of more than 2,000 bone datasets extracted from CT scans. The aim of this study was the development and clinical application of an anatomically pre-contoured plate for the treatment of distal fibular fractures based on this anatomical database. 48 Caucasian and Asian bone models (left and right) from the database were used for the preliminary optimization process and validation of the fibula plate. The implant was constructed to fit bilaterally in a lateral position on the fibula. A biomechanical comparison of the designed implant with the current gold standard in the treatment of distal fibular fractures (locking 1/3 tubular plate) was then conducted. Finally, a clinical surveillance study was performed to evaluate the grade of implant fit achieved. The results showed that, with a virtual anatomical database, it was possible to design a fibula plate with an optimized fit for a large proportion of the population. Biomechanical testing showed the novel fibula plate to be superior to 1/3 tubular plates in 4-point bending tests. The clinical application showed a very high degree of primary implant fit; only in a small minority of cases was further intra-operative implant bending necessary. The goal of developing an implant for the treatment of distal fibular fractures based on the evidence of a large anatomical database was therefore attained. Biomechanical testing showed good results regarding stability, and the clinical application confirmed the high grade of anatomical fit.
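The population-wide fit assessment can be pictured with a short sketch: score a candidate plate contour by its mean distance to each bone surface in the database, then count how many models fall under an acceptance threshold. The point clouds and the 2 mm threshold below are illustrative assumptions, not the study's actual method.

```python
# Sketch: score a candidate plate contour against a database of bone surfaces.
# Point clouds and the 2 mm "acceptable fit" threshold are illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def fit_error(plate_points, bone_points):
    """Mean distance from each plate point to its nearest bone-surface point."""
    tree = cKDTree(bone_points)
    distances, _ = tree.query(plate_points)
    return distances.mean()

rng = np.random.default_rng(0)
plate = rng.normal(size=(200, 3))                         # candidate contour
database = [plate + rng.normal(scale=0.5, size=(200, 3))  # synthetic "bones"
            for _ in range(48)]

errors = np.array([fit_error(plate, bone) for bone in database])
print(f"fit acceptable for {np.mean(errors < 2.0):.0%} of the 48 models")
```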
Abstract:
Increasing evidence suggests that the long "untranslated" region (UTR) between the matrix (M) and fusion (F) protein genes of morbilliviruses has a functional role. In canine distemper virus (CDV), the F 5' UTR was recently shown to code for a long F signal peptide (Fsp). Subsequently, it was reported that the M/F UTRs combined with the long Fsp synergistically regulate F mRNA and protein expression, thereby modulating virulence. Unique to CDV, a short putative open reading frame (ORF) has been identified within the wild-type CDV-M 3' UTR (termed M2). Here, we investigated whether M2 is expressed from the genome of the virulent and demyelinating A75/17-CDV strain. An expression plasmid encoding the M2 ORF, tagged at both its N-terminal (HA) and C-terminal (RFP) domains, was first constructed. A recombinant virus with its putative M2 ORF replaced by HA-M2-RFP was then successfully recovered from cDNA (termed recA75/17(green)-HA-M2-RFP). M2 expression in cells transfected or infected with these mutants was studied by immunoprecipitation, immunofluorescence, immunoblot, and flow cytometry analyses. Although fluorescence was readily detected in HA-M2-RFP-transfected cells, the absence of red fluorescence emission in several recA75/17(green)-HA-M2-RFP-infected cell types suggested a lack of M2 biosynthesis, which was confirmed by the other techniques. Consistent with these data, no functional role of the short polypeptide was revealed by infecting various cell types with HA-M2-RFP-overexpressing or M2-knockout recombinant viruses. Thus, in sharp contrast to the CDV-F 5' UTR, which is reported to translate a long Fsp, our data provide evidence that the CDV-M 3' UTR does not express any polypeptide.
Abstract:
OBJECTIVE: The study was conducted to determine the activation of coagulation in patients undergoing open or endovascular (EVAR) repair of infrarenal abdominal aortic aneurysms. METHODS: In a prospective, comparative study, 30 consecutive patients undergoing open repair (n = 15) or EVAR (n = 15) were investigated. Blood samples to determine fibrinopeptide A, fibrin monomer, thrombin-antithrombin complex, and D-dimer were taken up to 5 days postoperatively. Routine hematologic and hematochemical parameters as well as clinical data were collected. RESULTS: Both groups showed comparable demographic variables. Operating time was longer in open repair (249 ± 77 minutes vs 186 ± 69 minutes, P < .05). Perioperatively elevated markers of coagulation were measured in both groups. Fibrinopeptide A levels did not differ significantly between the groups (P = .55). The levels of fibrin monomer and thrombin-antithrombin complex were significantly higher in patients undergoing EVAR (P < .0001), reflecting increased thrombin activity and thrombin formation compared with open surgery. The D-dimer level did not differ significantly between the groups. These results remained valid after correction for hemodilution. CONCLUSION: These data suggest increased procoagulant activity in EVAR compared with open surgery. A procoagulant state may favor morbidity derived from micro- and macrovascular thrombosis, such as myocardial infarction, multiple organ dysfunction, venous thrombosis and thromboembolism, or disseminated intravascular coagulation.
Abstract:
BACKGROUND: Treatment of patients with attention deficit hyperactivity disorder (ADHD) with homeopathy is difficult. The Swiss randomised, placebo-controlled, cross-over trial in ADHD patients (Swiss ADHD trial) was designed with an open-label screening phase prior to the randomised controlled phase. During the screening phase, each child's response to successive homeopathic medications was observed until the optimal medication was identified. Only children who reached a predefined level of improvement participated in the randomised cross-over phase. Although the randomised phase revealed a significant beneficial effect of homeopathy, the cross-over design caused a strong carryover effect that diminished the apparent difference between placebo and verum treatment. METHODS: This retrospective analysis explores the screening-phase data with respect to the risk that a randomised controlled trial (RCT) with randomisation at the start of treatment would fail to demonstrate a specific effect. RESULTS: During the screening phase, 84% (70/83) of the children responded to treatment and reached eligibility for the randomised trial after a median time of 5 months (range 1-18) and a median of 3 different medications (range 1-9). Thirteen children (16%) did not reach eligibility. Five months after the start of treatment, the difference in Conners Global Index (CGI) rating between responders and non-responders became highly significant (p = 0.0006). Improvement in CGI was much greater following the identification of the optimal medication than in the preceding suboptimal treatment period (p < 0.0001). CONCLUSIONS: Because an optimal medication must be identified before a response to treatment can be expected, randomisation at the start of treatment in an RCT of homeopathy in children with ADHD carries a high risk of failing to demonstrate a specific treatment effect if the observation time is shorter than 12 months.
Abstract:
OBJECT: In this study, 1H magnetic resonance (MR) spectroscopy was prospectively tested as a reliable method for presurgical grading of neuroepithelial brain tumors. METHODS: Using a database of tumor spectra obtained in patients with histologically confirmed diagnoses, 94 consecutive untreated patients were studied using single-voxel 1H spectroscopy (point-resolved spectroscopy; TE 135 msec, TR 1500 msec). A total of 90 tumor spectra from patients with diagnostic 1H MR spectroscopy examinations were analyzed using commercially available software (MRUI/VARPRO) and classified by linear discriminant analysis as World Health Organization (WHO) Grade I/II, WHO Grade III, or WHO Grade IV lesions. In all cases, the classification results were matched with histopathological diagnoses made according to the WHO classification criteria after serial stereotactic biopsy procedures or open surgery. Histopathological studies revealed 30 Grade I/II tumors, 29 Grade III tumors, and 31 Grade IV tumors. The reliability of the histological diagnoses was validated over a minimum postsurgical follow-up period of 12 months (range 12-37 months). Classifications based on spectroscopic data yielded 31 tumors in Grade I/II, 32 in Grade III, and 27 in Grade IV. Incorrect classifications included two Grade II tumors, one identified as Grade III and one as Grade IV; two Grade III tumors identified as Grade II; two Grade III lesions identified as Grade IV; and six Grade IV tumors identified as Grade III. Furthermore, one glioblastoma (WHO Grade IV) was classified as WHO Grade I/II. This represents an overall success rate of 86% and a 95% success rate in differentiating low-grade from high-grade tumors. CONCLUSIONS: The authors conclude that in vivo 1H MR spectroscopy is a reliable technique for grading neuroepithelial brain tumors.
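A minimal sketch of the classification step, assuming each spectrum has been reduced to a small feature vector (e.g., metabolite peak ratios). scikit-learn's LinearDiscriminantAnalysis and the synthetic data below stand in for the MRUI/VARPRO-based analysis actually used in the study.

```python
# Sketch: grade tumors from spectral features with linear discriminant
# analysis. Features and labels are synthetic, not the study's data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_per_grade = 30
# Toy feature vectors (e.g., Cho/Cr, NAA/Cr, lactate/lipid ratios) per grade.
X = np.vstack([rng.normal(loc=mu, scale=0.4, size=(n_per_grade, 3))
               for mu in ([1.0, 2.0, 0.2], [1.8, 1.4, 0.6], [2.6, 0.8, 1.2])])
y = np.repeat(["I/II", "III", "IV"], n_per_grade)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.0%}")
```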
Abstract:
PURPOSE: Venlafaxine has shown benefit in the treatment of depression and pain. Worldwide, data investigating the outcome of chronic pain patients with depressive symptoms treated with venlafaxine in the primary care setting are largely lacking. This observational study aimed to elucidate the efficacy of venlafaxine and its prescription by Swiss primary care physicians and psychiatrists in patients with chronic pain and depressive symptomatology. SUBJECTS AND METHODS: We studied 505 patients with depressive symptoms suffering from chronic pain in a prospective, naturalistic, Swiss community-based observational trial of venlafaxine in primary care. These patients were treated with venlafaxine by 122 physicians, namely psychiatrists, general practitioners, and internists. RESULTS: On average, patients were treated with 143 ± 75 mg (range 0-450 mg) of venlafaxine daily over a follow-up of three months. Venlafaxine proved to be beneficial in the treatment of both depressive symptoms and chronic pain. DISCUSSION: Although side effects were absent in most patients, physicians may frequently have missed a satisfactory depression response by underdosing venlafaxine. Our results reflect the complexity of treating chronic pain in patients with depressive symptoms in primary care. CONCLUSION: Further randomized dose-finding studies are needed to learn more about the appropriate dosage for treating depression and comorbid pain with venlafaxine.
Abstract:
Metals price risk management is a key issue in metal markets because of uncertain commodity price fluctuations, exchange-rate and interest-rate changes, and the large price risk borne by metals producers and consumers alike. It is therefore a concern for all participants in metal markets, including producers, consumers, merchants, banks, investment funds, speculators, and traders. Managing price risk provides stable income for both producers and consumers, and so increases the chance that a firm will invest in attractive projects. The purpose of this research is to evaluate risk management strategies in the copper market. The main tools of price risk management are hedging and derivatives such as futures contracts, swaps, and options. Hedging is a transaction designed to reduce or eliminate price risk. Derivatives are financial instruments whose returns are derived from other financial instruments, and they are commonly used for managing financial risks. Although derivatives have existed in some form for centuries, their growth has accelerated rapidly during the last 20 years, and they are now widely used by financial institutions, corporations, professional investors, and individuals. This project focuses on the over-the-counter (OTC) market and its products, such as exotic options, particularly Asian options. The first part of the project describes basic derivatives and risk management strategies, and discusses basic concepts of spot and futures (forward) markets, the benefits and costs of risk management, and the risks and rewards of positions in derivative markets. The second part considers the valuation of commodity derivatives. Here, the option pricing model DerivaGem is applied to Asian call and put options on London Metal Exchange (LME) copper, because it is important to understand how Asian options are valued and to compare theoretical option values with observed market values. Predicting future trends in copper prices is essential to managing market price risk successfully, so the third part discusses econometric commodity models. Based on this literature review, the fourth part of the project reports the construction and testing of an econometric model designed to forecast the monthly average price of copper on the LME. More specifically, this part shows how LME copper prices can be explained by a simultaneous-equation structural model (two-stage least squares regression) connecting supply and demand variables. The following simultaneous econometric model of the copper industry is built:

$$
\begin{cases}
Q_t^D = e^{-5.0485}\, P_{t-1}^{-0.1868}\, GDP_t^{1.7151}\, e^{0.0158\, IP_t} \\
Q_t^S = e^{-3.0785}\, P_{t-1}^{0.5960}\, T_t^{0.1408}\, P_{OIL(t)}^{-0.1559}\, USDI_t^{1.2432}\, LIBOR_{t-6}^{-0.0561} \\
Q_t^D = Q_t^S
\end{cases}
$$

Solving the system at equilibrium yields the reduced-form price equation

$$
P_{t-1}^{CU} = e^{-2.5165}\, GDP_t^{2.1910}\, e^{0.0202\, IP_t}\, T_t^{-0.1799}\, P_{OIL(t)}^{0.1991}\, USDI_t^{-1.5881}\, LIBOR_{t-6}^{0.0717}
$$

where Q_t^D and Q_t^S are the world demand for and supply of copper at time t, respectively. P_{t-1} is the lagged price of copper, which is the focus of the analysis in this part. GDP_t is world gross domestic product at time t, representing aggregate economic activity; industrial production is also relevant, so global industrial production growth, denoted IP_t, is included in the model. T_t is the time variable, a useful proxy for technological change. The price of oil at time t, denoted P_OIL(t), is a proxy for the cost of energy in producing copper. USDI_t is the U.S. dollar index at time t, an important variable for explaining copper supply and prices. Finally, LIBOR_{t-6} is the 6-month-lagged one-year London Interbank Offered Rate. Although the model may be applicable to other base-metal industries, omitted exogenous variables, such as the price of a substitute or a combined substitute-price variable, have not been considered in this study. Based on this econometric model and a Monte Carlo simulation analysis, the probabilities that the monthly average copper prices in 2006 and 2007 will exceed specific option strike prices are estimated. The final part evaluates risk management strategies, including option strategies, metal swaps, and simple options, in relation to the simulation results. Basic option strategies created with both call and put options for 2006 and 2007, such as bull spreads, bear spreads, and butterfly spreads, are evaluated, and each strategy is analyzed against the data of the day and the price prediction model. Applications stemming from this project thus include valuing Asian options, developing a copper price prediction model, forecasting and planning, and decision making for price risk management in the copper market.
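As a hedged sketch of the simulation step described above, the code below draws Monte Carlo price paths to estimate the probability that the monthly average price exceeds a strike, together with the corresponding average-price (Asian) call payoff. The lognormal dynamics and every parameter value are assumptions for illustration only; the project's own forecasts come from the structural model above.

```python
# Sketch: Monte Carlo estimate of P(average monthly copper price > strike)
# and of an Asian (average-price) call payoff. The lognormal dynamics and
# all numbers below are illustrative assumptions, not the project's model.
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_months = 100_000, 12
spot, strike = 6000.0, 6500.0          # USD/tonne, hypothetical
mu, sigma = 0.05, 0.25                 # annual drift and volatility, assumed

dt = 1.0 / 12.0
shocks = rng.normal(size=(n_paths, n_months))
log_paths = np.log(spot) + np.cumsum(
    (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * shocks, axis=1)
monthly_prices = np.exp(log_paths)
avg_price = monthly_prices.mean(axis=1)

prob_above = np.mean(avg_price > strike)
asian_call = np.maximum(avg_price - strike, 0.0).mean()  # undiscounted payoff
print(f"P(avg > strike) = {prob_above:.1%}, Asian call value = {asian_call:.0f}")
```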
Abstract:
In 1998-2001 Finland suffered the most severe insect outbreak ever recorded in the country, covering over 500,000 hectares. The outbreak was caused by the common pine sawfly (Diprion pini L.) and has continued in the study area, Palokangas, ever since. To find a good method for monitoring this type of outbreak, this study examined the efficacy of multi-temporal ERS-2 and ENVISAT SAR imagery for estimating Scots pine (Pinus sylvestris L.) defoliation. Three methods were tested: unsupervised k-means clustering, supervised linear discriminant analysis (LDA) and logistic regression. In addition, I assessed whether harvested areas could be differentiated from defoliated forest using the same methods. Two different speckle filters were used to determine the effect of filtering on the SAR imagery and the subsequent results. Logistic regression performed best, producing a classification accuracy of 81.6% (kappa 0.62) with two classes (no defoliation, >20% defoliation). With two classes, LDA accuracy was at best 77.7% (kappa 0.54) and k-means 72.8% (kappa 0.46). In general, the largest speckle filter, a 5 x 5 image window, performed best. When additional classes were added, accuracy usually degraded step by step. The results were good, but because of the restrictions of the study they should be confirmed with independent data before firm conclusions about their reliability can be drawn. The restrictions include the small amount of field data, and thus problems with accuracy assessment (no separate testing data), as well as the lack of meteorological data from the imaging dates.
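A minimal sketch of the best-performing method, assuming the SAR data have been reduced to per-stand multi-temporal backscatter features. The synthetic values below stand in for the ERS-2/ENVISAT intensities, and the two-class setup mirrors the no-defoliation vs. >20% split; the accuracy and kappa reported by the study came from real field-referenced data, not this toy example.

```python
# Sketch: two-class defoliation mapping (no defoliation vs. >20%) from
# multi-temporal SAR backscatter. Features and labels are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 400
# Toy multi-temporal backscatter (dB) for healthy vs. defoliated stands.
healthy = rng.normal(loc=[-8.0, -8.2, -8.1], scale=0.8, size=(n // 2, 3))
defoliated = rng.normal(loc=[-6.5, -6.8, -6.6], scale=0.8, size=(n // 2, 3))
X = np.vstack([healthy, defoliated])
y = np.array([0] * (n // 2) + [1] * (n // 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=7)
clf = LogisticRegression().fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"accuracy = {accuracy_score(y_te, pred):.1%}, "
      f"kappa = {cohen_kappa_score(y_te, pred):.2f}")
```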
Abstract:
As the performance gap between microprocessors and memory continues to increase, main memory accesses incur long latencies that become a factor limiting system performance. Previous studies show that main memory access streams contain significant locality and that SDRAM devices provide parallelism through multiple banks and channels. This locality and parallelism have not been exploited thoroughly by conventional memory controllers. In this thesis, SDRAM address mapping techniques and memory access reordering mechanisms are studied and applied to memory controller design with the goal of reducing observed main memory access latency. The proposed bit-reversal address mapping distributes main memory accesses evenly in the SDRAM address space to enable bank parallelism. As memory accesses to distinct banks are interleaved, the access latencies are partially hidden and therefore reduced. By taking cache conflict misses into account, bit-reversal address mapping is able to direct potential row conflicts to different banks, further improving performance. The proposed burst scheduling is a novel access reordering mechanism that creates bursts by clustering accesses directed to the same rows of the same banks. Subject to a threshold, reads are allowed to preempt writes, and qualified writes are piggybacked at the end of the bursts. A sophisticated access scheduler selects accesses based on priorities and interleaves accesses to maximize SDRAM data bus utilization. Consequently, burst scheduling reduces the row conflict rate, increasing and exploiting the available row locality. Using revised SimpleScalar and M5 simulators, both techniques are evaluated and compared with existing academic and industrial solutions. With SPEC CPU2000 benchmarks, bit-reversal reduces execution time by 14% on average over traditional page-interleaving address mapping. Burst scheduling achieves a 15% reduction in execution time over conventional in-order bank scheduling. Working constructively together, bit-reversal and burst scheduling achieve a 19% speedup across the simulated benchmarks.
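The bit-reversal idea can be made concrete with a short sketch: reversing the block-address bits moves high-order bits, which distinguish cache-conflicting addresses, into the bank index, so accesses that would collide on one bank under page interleaving are spread across banks. The field widths below are illustrative assumptions, not the thesis's exact configuration.

```python
# Sketch: bit-reversal address mapping vs. page interleaving for SDRAM banks.
# Field widths are illustrative assumptions, not the thesis's configuration.
BLOCK_OFFSET_BITS = 6   # 64-byte blocks (hypothetical)
BANK_BITS = 3           # 8 banks (hypothetical)
BLOCK_ADDR_BITS = 20    # width of the block-address field (hypothetical)

def reverse_bits(value: int, width: int) -> int:
    """Reverse the lowest `width` bits of `value`."""
    out = 0
    for _ in range(width):
        out = (out << 1) | (value & 1)
        value >>= 1
    return out

def bank_page_interleave(addr: int) -> int:
    """Conventional mapping: bank index from the bits just above the offset."""
    return (addr >> BLOCK_OFFSET_BITS) & ((1 << BANK_BITS) - 1)

def bank_bit_reversal(addr: int) -> int:
    """Bit-reversal mapping: reverse the block address, then take bank bits."""
    block = addr >> BLOCK_OFFSET_BITS
    return reverse_bits(block, BLOCK_ADDR_BITS) & ((1 << BANK_BITS) - 1)

# Two addresses differing only in high-order bits (a typical cache-conflict
# pair) collide on one bank under page interleaving but not under reversal.
a = 1 << BLOCK_OFFSET_BITS
b = a + (1 << (BLOCK_OFFSET_BITS + BLOCK_ADDR_BITS - 1))
print("page interleave banks:", bank_page_interleave(a), bank_page_interleave(b))
print("bit-reversal banks:   ", bank_bit_reversal(a), bank_bit_reversal(b))
```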
Abstract:
The purpose of this project was to investigate the effect of data collection technology on student attitudes towards science instruction. The study was conducted over the course of two years at Madison High School in Adrian, Michigan, primarily in college preparatory physics classes, but also in one college preparatory chemistry class and one environmental science class. A preliminary study was conducted at a Lenawee County Intermediate Schools student summer environmental science day camp. The data collection technology used was a combination of Texas Instruments TI-84 Silver Plus graphing calculators and Vernier LabPro data collection sleds with various probeware attachments, including motion sensors, pH probes and accelerometers. Students were given written procedures for most laboratory activities and were provided with data tables and analysis questions to answer about the activities. The first year of the study included a pretest and posttest measuring student attitudes towards the class they were enrolled in. Pretest and posttest data were analyzed to determine effect size, which was found to be very small (Coe, 2002). The second year of the study focused only on a physics class and used Keller's ARCS model for measuring student motivation based on four aspects of motivation: Attention, Relevance, Confidence and Satisfaction (Keller, 2010). According to this model, there were two distinct groups in the class, one of which was motivated to learn and one of which was not. The data suggest that the use of data collection technology in science classes should be started early in a student's career, possibly in early middle school or late elementary school. This would build familiarity with the equipment and allow for greater exploration by students as they progress through high school and into upper-level science courses.
Abstract:
Riparian zones are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Because of their high variability, developing an all-encompassing definition for riparian ecotones is challenging. However, all riparian ecotones depend on two primary factors: the watercourse and its associated floodplain. Previous approaches to riparian boundary delineation have utilized fixed-width buffers, but this methodology has proven inadequate, as it takes only the watercourse into consideration and ignores critical geomorphology and the associated vegetation and soil characteristics. Our approach offers advantages over previously used methods by utilizing: the geospatial modeling capabilities of ArcMap GIS; a better sampling technique along the watercourse that can distinguish the 50-year floodplain, which is the optimal hydrologic descriptor of riparian ecotones; the Soil Survey Geographic (SSURGO) and National Wetland Inventory (NWI) databases to distinguish contiguous areas beyond the 50-year floodplain; and land use/cover characteristics associated with the delineated riparian zones. The model utilizes spatial data readily available from Federal and State agencies and geospatial clearinghouses. An accuracy assessment was performed to assess the impact of varying the 50-year flood height, changing the DEM spatial resolution (1, 3, 5 and 10 m), and positional inaccuracies in the National Hydrography Dataset (NHD) streams layer on the boundary placement of the delineated variable-width riparian ecotones. The result of this study is a robust, automated GIS-based model attached to ESRI ArcMap software to delineate and classify variable-width riparian ecotones.
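A heavily simplified sketch of the flood-height step: flag DEM cells whose elevation lies within the 50-year flood rise of the nearest stream cell's elevation, producing a variable-width zone that follows the terrain rather than a fixed buffer. The toy rasters and the 2 m flood height are assumptions; the actual model runs inside ESRI ArcMap with the SSURGO and NWI overlays described above.

```python
# Sketch: delineate a variable-width riparian zone on a DEM by flagging cells
# whose elevation is within the 50-year flood height of the nearest stream
# cell. Toy arrays stand in for the DEM and the NHD stream raster.
import numpy as np
from scipy import ndimage

flood_height = 2.0                       # 50-year flood rise (m), assumed

# Toy DEM: a valley sloping up away from a stream running down column 5.
rows, cols = np.indices((11, 11))
dem = np.abs(cols - 5).astype(float)     # 1 m rise per cell from the stream
streams = cols == 5                      # boolean stream mask

# For each cell, find the nearest stream cell and take its elevation.
_, (near_r, near_c) = ndimage.distance_transform_edt(
    ~streams, return_indices=True)
stream_elev = dem[near_r, near_c]

riparian = dem <= stream_elev + flood_height
print(riparian.astype(int))              # 1 = inside the delineated zone
```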