124 results for unified theories and models of strong and electroweak
in Queensland University of Technology - ePrints Archive
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of "excess" zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how the crash process gives rise to the "excess" zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
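As a hedged illustration of the argument above, and not code from the study itself, the following Python sketch simulates crash counts as Poisson trials, i.e., independent Bernoulli trials with unequal probabilities, and shows that low exposure alone produces a large share of zero-count observations even though no site is "perfectly safe"; all probabilities and sample sizes are arbitrary.

```python
import numpy as np

def simulate_crash_counts(n_sites=1000, n_trials=500, mean_p=0.002, seed=None):
    """Simulate crash counts as Poisson trials: each site experiences
    n_trials independent Bernoulli events with unequal probabilities."""
    rng = np.random.default_rng(seed)
    counts = np.empty(n_sites, dtype=int)
    for i in range(n_sites):
        # Heterogeneous per-trial crash probabilities (unequal, independent).
        p = np.clip(rng.gamma(shape=0.5, scale=mean_p / 0.5, size=n_trials), 0.0, 1.0)
        counts[i] = rng.binomial(1, p).sum()
    return counts

# Low exposure (few trials) yields a preponderance of zeros without any
# dual-state process; longer exposure makes the "excess" zeros disappear.
for n_trials in (50, 500, 5000):
    counts = simulate_crash_counts(n_trials=n_trials, seed=1)
    print(n_trials, "trials -> proportion of zero-crash sites:",
          round(float(np.mean(counts == 0)), 3))
```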
Abstract:
PURPOSE: To evaluate the antitumor activity of brivanib, a dual inhibitor of the vascular endothelial growth factor receptor (VEGFR) and fibroblast growth factor receptor (FGFR) signaling pathways, in hepatocellular carcinoma (HCC). EXPERIMENTAL DESIGN: Six different s.c. patient-derived HCC xenografts were implanted into mice. Tumor growth was evaluated in mice treated with brivanib compared with control. The effects of brivanib on apoptosis and cell proliferation were evaluated by immunohistochemistry. SK-HEP1 and HepG2 cells were used to investigate the effects of brivanib on the VEGFR-2 and FGFR-1 signaling pathways in vitro. Western blotting was used to determine changes in proteins in these xenografts and cell lines. RESULTS: Brivanib significantly suppressed tumor growth in five of six xenograft lines. Furthermore, brivanib-induced growth inhibition was associated with a decrease in phosphorylated VEGFR-2 at Tyr(1054/1059), increased apoptosis, reduced microvessel density, inhibition of cell proliferation, and down-regulation of cell cycle regulators. The levels of FGFR-1 and FGFR-2 expression in these xenograft lines were positively correlated with their sensitivity to brivanib-induced growth inhibition. In VEGF-stimulated and basic FGF-stimulated SK-HEP1 cells, brivanib significantly inhibited VEGFR-2, FGFR-1, extracellular signal-regulated kinase 1/2, and Akt phosphorylation. CONCLUSION: This study provides a strong rationale for clinical investigation of brivanib in patients with HCC.
Abstract:
Future vehicle navigation for safety applications requires seamless positioning at sub-meter accuracy or better. However, standalone Global Positioning System (GPS) or Differential GPS (DGPS) solutions suffer from outages in restricted areas such as high-rise urban areas and tunnels due to the blockage of satellite signals. Smoothed DGPS can provide sub-meter positioning accuracy, but cannot meet the seamless requirement. Traditional navigation aids such as Dead Reckoning and onboard Inertial Measurement Units are either not accurate enough, due to error accumulation, or too expensive to be acceptable to mass-market vehicle users. One alternative is to use wireless infrastructure installed at the roadside to locate vehicles in regions where Global Navigation Satellite System (GNSS) signals are not available (for example, inside tunnels, urban canyons and large indoor car parks). Examples of roadside infrastructure that could potentially be used for positioning purposes include Wireless Local Area Network (WLAN)/Wireless Personal Area Network (WPAN) based positioning systems, Ultra-wideband (UWB) based positioning systems, Dedicated Short Range Communication (DSRC) devices, Locata's positioning technology, and accurate road surface height information over selected road segments such as tunnels. This research reviews and compares wireless technologies that could be installed along the roadside for positioning purposes. Models and algorithms for integrating different positioning technologies are also presented. Various simulation schemes are designed to examine the performance benefits of integrating GNSS and roadside infrastructure for vehicle positioning. The results of these experimental studies have shown a number of useful findings. In open road environments where sufficient satellite signals can be obtained, roadside wireless measurements contribute very little to improving positioning accuracy at the sub-meter level, especially in the dual-constellation cases. In restricted outdoor environments where only a few GPS satellites can be received, such as those at elevation angles above 45°, the roadside distance measurements help improve both positioning accuracy and availability to the sub-meter level. When the vehicle is travelling in tunnels with known tunnel surface heights and roadside distance measurements, sub-meter horizontal positioning accuracy is also achievable. Overall, the simulation results have demonstrated that roadside infrastructure indeed has the potential to provide sub-meter vehicle position solutions for certain road safety applications if properly deployed roadside measurements are obtainable.
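As a minimal sketch of the kind of integration described above, and not the actual algorithms developed in this research, the Python snippet below computes a Gauss-Newton least-squares position fix from range measurements to anchors at known positions, mixing "satellite-like" and roadside anchors; it is simplified to 2D, ignores receiver clock bias, and all coordinates and noise levels are purely illustrative.

```python
import numpy as np

def least_squares_position(anchors, ranges, x0, iterations=10):
    """Iterative (Gauss-Newton) least-squares position fix from range
    measurements to anchors at known positions. Works for any mix of
    anchors (e.g., satellites after clock/atmospheric corrections and
    roadside devices); simplified here to 2D with no receiver clock bias."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iterations):
        diffs = x - anchors                      # (n, 2)
        predicted = np.linalg.norm(diffs, axis=1)
        H = diffs / predicted[:, None]           # unit line-of-sight vectors
        residuals = ranges - predicted
        dx, *_ = np.linalg.lstsq(H, residuals, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-6:
            break
    return x

# Illustrative scenario: two usable "satellites" (poor geometry) plus two
# roadside ranging devices; the roadside ranges strengthen the geometry.
rng = np.random.default_rng(0)
truth = np.array([120.0, 45.0])
anchors = np.array([[2.0e4, 1.8e4],   # "satellite" 1 (projected, corrected)
                    [-1.5e4, 2.2e4],  # "satellite" 2
                    [100.0, 0.0],     # roadside device A
                    [200.0, 60.0]])   # roadside device B
ranges = np.linalg.norm(anchors - truth, axis=1) + rng.normal(0, 0.3, 4)
print(least_squares_position(anchors, ranges, x0=[0.0, 0.0]))
```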
Abstract:
Objective: To examine the effects of personal and community characteristics, specifically race and rurality, on lengths of state psychiatric hospital and community stays using maximum likelihood survival analysis, with special emphasis on change over a ten-year period. Data Sources: We used the administrative data of the Virginia Department of Mental Health, Mental Retardation, and Substance Abuse Services (DMHMRSAS) from 1982-1991 and the Area Resources File (ARF). Given these two sources, we constructed a history file for each individual who entered the state psychiatric system over the ten-year period. Histories included demographic, treatment, and community characteristics. Study Design: We used a longitudinal, population-based design with maximum likelihood estimation of survival models. We presented a random effects model with unobserved heterogeneity that was independent of observed covariates. The key dependent variables were length of inpatient stay and subsequent length of community stay. Explanatory variables measured personal, diagnostic, and community characteristics, as well as controls for calendar time. Data Collection: This study used secondary, administrative, and health planning data. Principal Findings: African-American clients leave the community more quickly than white clients. After controlling for other characteristics, however, race does not affect hospital length of stay. Rurality does not affect length of community stay once other personal and community characteristics are controlled for. However, people from rural areas have longer hospital stays, even after controlling for personal and community characteristics. The effects of time are significantly smaller than expected. Diagnostic composition effects and a decrease in the rate of first inpatient admissions explain part of this reduced impact of time. We also find strong evidence of unobserved heterogeneity in both types of stays and adjust for this in our final models. Conclusions: Our results show that information on client characteristics available from inpatient stay records is useful in predicting not only the length of inpatient stay but also the length of the subsequent community stay. This information can be used to target increased discharge planning at those at risk of more rapid readmission to inpatient care. Correlation across observed and unobserved factors affecting length of stay has significant effects on the measurement of relationships between individual factors and lengths of stay. Thus, it is important to control for both observed and unobserved factors in estimation.
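For readers unfamiliar with maximum likelihood survival estimation, the Python sketch below fits a right-censored Weibull regression model to synthetic data; it illustrates only the estimation principle, not the DMHMRSAS analysis, and the random-effects (unobserved heterogeneity) term used in the study is omitted for brevity.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_loglik(params, t, censored, X):
    """Log-likelihood of a right-censored Weibull regression model.
    params = [log_shape, beta_0, beta_1, ...]; the scale depends on covariates."""
    shape = np.exp(params[0])
    scale = np.exp(X @ params[1:])
    z = (t / scale) ** shape
    log_f = np.log(shape) - np.log(scale) + (shape - 1) * np.log(t / scale) - z
    log_S = -z
    # Censored observations contribute the survival function, events the density.
    return np.sum(np.where(censored, log_S, log_f))

# Illustrative data: survival times with one binary covariate and censoring.
rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.integers(0, 2, n)])
true_scale = np.exp(X @ np.array([3.0, 0.5]))
t = true_scale * rng.weibull(1.3, n)
c = rng.exponential(40.0, n)                  # censoring times
censored = t > c
t = np.minimum(t, c)

res = minimize(lambda p: -weibull_loglik(p, t, censored, X),
               x0=np.array([0.0, np.log(t.mean()), 0.0]),
               method="Nelder-Mead", options={"maxiter": 5000})
print("shape:", np.exp(res.x[0]), "betas:", res.x[1:])
```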
Abstract:
The validation of Computed Tomography (CT) based 3D models forms an integral part of studies involving 3D models of bones. This is of particular importance when such models are used for Finite Element studies. The validation of 3D models typically involves the generation of a reference model representing the bone's outer surface. Several different devices have been utilised for digitising a bone's outer surface, such as mechanical 3D digitising arms, mechanical 3D contact scanners, electro-magnetic tracking devices and 3D laser scanners. However, none of these devices is capable of digitising a bone's internal surfaces, such as the medullary canal of a long bone. Therefore, this study investigated the use of a 3D contact scanner, in conjunction with a microCT scanner, for generating a reference standard for validating the internal and external surfaces of a CT based 3D model of an ovine femur. One fresh ovine limb was scanned using a clinical CT scanner (Philips Brilliance 64) with a pixel size of 0.4 mm² and slice spacing of 0.5 mm. The limb was then dissected to obtain the soft tissue free bone, while care was taken to protect the bone's surface. A desktop mechanical 3D contact scanner (Roland DG Corporation, MDX 20, Japan) was used to digitise the surface of the denuded bone at a resolution of 0.3 × 0.3 × 0.025 mm. The digitised surfaces were reconstructed into a 3D model using reverse engineering techniques in Rapidform (Inus Technology, Korea). After digitisation, the distal and proximal parts of the bone were removed so that the shaft could be scanned with a microCT scanner (µCT40, Scanco Medical, Switzerland). The shaft, with the bone marrow removed, was immersed in water and scanned with a voxel size of 0.03 mm³. The bone contours were extracted from the image data using the Canny edge filter in Matlab (The MathWorks). The extracted bone contours were reconstructed into 3D models using Amira 5.1 (Visage Imaging, Germany). The 3D models of the bone's outer surface reconstructed from CT and microCT data were compared against the 3D model generated using the contact scanner. The 3D model of the inner canal reconstructed from the microCT data was compared against the 3D model reconstructed from the clinical CT scanner data. The disparity between the surface geometries of the two models was calculated in Rapidform and recorded as an average distance with standard deviation. The comparison of the 3D model of the whole bone generated from the clinical CT data with the reference model gave a mean error of 0.19±0.16 mm; the shaft was more accurate (0.08±0.06 mm) than the proximal (0.26±0.18 mm) and distal (0.22±0.16 mm) parts. The comparison between the outer 3D model generated from the microCT data and the contact scanner model gave a mean error of 0.10±0.03 mm, indicating that microCT-generated models are sufficiently accurate for validating 3D models generated by other methods. The comparison of the inner model generated from microCT data with that from clinical CT data gave an error of 0.09±0.07 mm. Utilising a mechanical contact scanner in conjunction with a microCT scanner thus enabled validation of both the outer surface of a CT based 3D model of an ovine femur and the surface of the model's medullary canal.
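As an illustration of the surface comparison step, and only a rough stand-in for the Rapidform deviation analysis actually used, the Python sketch below approximates the mean deviation between two surfaces represented as point clouds using nearest-neighbour distances; the sphere data are synthetic and the noise level is arbitrary.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_surface_deviation(model_points, reference_points):
    """Approximate a model's deviation from a reference surface as the
    nearest-neighbour distance from each model vertex to the reference
    point cloud. Returns (mean, standard deviation) in the input units."""
    tree = cKDTree(reference_points)
    distances, _ = tree.query(model_points)
    return distances.mean(), distances.std()

# Illustrative check with synthetic data: a dense "reference" sphere and a
# sparser "model" of the same sphere perturbed by roughly 0.1 mm of noise.
rng = np.random.default_rng(7)
def random_sphere(n, radius=10.0, noise=0.0):
    v = rng.normal(size=(n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    return radius * v + rng.normal(0, noise, (n, 3))

reference = random_sphere(200000)
model = random_sphere(5000, noise=0.1)
mean_d, sd_d = mean_surface_deviation(model, reference)
print(f"deviation: {mean_d:.3f} ± {sd_d:.3f} mm")
```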
Abstract:
Any theory of thinking, teaching or learning rests on an underlying philosophy of knowledge. Mathematics education is situated at the nexus of two fields of inquiry, namely mathematics and education. However, numerous other disciplines interact with these two fields, which compounds the complexity of developing theories that define mathematics education. We first address the issue of clarifying a philosophy of mathematics education before attempting to answer whether theories of mathematics education are constructible. In doing so we draw on the foundational writings of Lincoln and Guba (1994), in which they clearly posit that any discipline within education, in our case mathematics education, needs to clarify for itself the following questions: (1) What is reality? Or what is the nature of the world around us? (2) How do we go about knowing the world around us? [the methodological question, which opens possibilities for various disciplines to develop methodological paradigms] and (3) How can we be certain of the "truth" of what we know? [the epistemological question]
Abstract:
Purpose: One strategy to minimize bacteria-associated adverse responses such as microbial keratitis, contact lens-induced acute red eye (CLARE), and contact lens-induced peripheral ulcers (CLPUs) that occur with contact lens wear is the development of an antimicrobial or antiadhesive contact lens. Cationic peptides represent a novel approach for the development of antimicrobial lenses. Methods: A novel cationic peptide, melimine, was covalently incorporated into silicone hydrogel lenses. Confirmation tests to determine the presence of the peptide and its antimicrobial activity were performed. Cationic lenses were then tested for their ability to prevent CLPU in the Staphylococcus aureus rabbit model and CLARE in the Pseudomonas aeruginosa guinea pig model. Results: In the rabbit model of CLPU, melimine-coated lenses resulted in significant reductions in ocular symptom scores and in the extent of corneal infiltration (P < 0.05). Evaluation of the performance of melimine lenses in the CLARE model showed significant improvement in all ocular response parameters measured, including the percentage of eyes with corneal infiltrates, compared with those observed in eyes fitted with the control lens (P ≤ 0.05). Conclusions: Cationic coating of contact lenses with the peptide melimine may represent a novel method of preventing bacterial growth on contact lenses and consequently reducing the incidence and severity of adverse responses due to Gram-positive and Gram-negative bacteria during lens wear.