917 results for OPTICAL-MODEL ANALYSIS
Abstract:
In this paper, we present a novel technique for the removal of astigmatism in submillimeter-wave optical systems through the employment of a specific combination of so-called astigmatic off-axis reflectors. This technique treats an orthogonally astigmatic beam using skew Gaussian beam analysis, from which an anastigmatic imaging network is derived. The resultant beam is considered truly stigmatic, with all Gaussian beam parameters in the orthogonal directions matched. This is thus an improvement over previous techniques in which a beam corrected for astigmatism has only the orthogonal beam amplitude radii matched, with phase shift and phase radius of curvature not considered. The technique is computationally efficient, negating the requirement for computationally intensive numerical analysis of shaped reflector surfaces. The required optical surfaces are also relatively simple to implement compared to such numerically optimized shaped surfaces. The technique is implemented in this work as part of the complete optics train for the STEAMR antenna. The STEAMR instrument is envisaged as a multi-beam limb-sounding instrument operating at submillimeter wavelengths. The antenna optics arrangement for this instrument uses multiple off-axis reflectors to control the incident radiation and couple it to the corresponding receiver feeds. An anastigmatic imaging network is successfully implemented in an optical model of this antenna, and the resultant design ensures optimal imaging of the beams onto the corresponding feed horns. This example also addresses the challenges of imaging in multi-beam antenna systems.
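The matching condition the abstract describes can be illustrated with the standard complex-beam-parameter (ABCD) formalism. The sketch below is a generic illustration, not the STEAMR design: an orthogonally astigmatic beam carries a separate parameter q in each transverse plane, and the beam is truly stigmatic only when qx equals qy, which matches both the amplitude radius and the phase radius of curvature at once. All numeric values are invented placeholders.

```python
# Generic Gaussian-beam bookkeeping (illustrative, not the paper's method):
# q = z + i*z_R encodes both beam radius and phase radius of curvature, and
# an ABCD ray-transfer matrix propagates it as q' = (A q + B) / (C q + D).

def propagate(q, abcd):
    """Apply an ABCD ray-transfer matrix to a complex beam parameter q."""
    (A, B), (C, D) = abcd
    return (A * q + B) / (C * q + D)

def free_space(d):
    """Propagation over a distance d."""
    return ((1.0, d), (0.0, 1.0))

def reflector(f):
    """Focusing reflector of focal length f (thin-element approximation)."""
    return ((1.0, 0.0), (-1.0 / f, 1.0))

# Hypothetical astigmatic input: waists at different positions in x and y,
# so qx != qy (units of metres; Im(q) is the Rayleigh range).
qx = complex(0.10, 0.25)
qy = complex(0.30, 0.25)
assert qx != qy  # astigmatic: the two planes carry different beam parameters

# An off-axis reflector presents different effective focal lengths to the two
# planes; an anastigmatic network chooses focal lengths and spacings so that
# downstream qx == qy. The values below are arbitrary placeholders, not a
# solved design.
qx_out = propagate(qx, reflector(0.20))
qy_out = propagate(qy, reflector(0.28))
```

Matching only the beam amplitude radii constrains |Im(1/q)| alone; requiring qx == qy as a complex number is the stricter condition the paper's technique enforces.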
Abstract:
A multi-model analysis of Atlantic multidecadal variability is performed with the following aims: to investigate the similarities to observations; to assess the strength and relative importance of the different elements of the mechanism proposed by Delworth et al. (J Clim 6:1993–2011, 1993) (hereafter D93) among coupled general circulation models (CGCMs); and to relate model differences to mean systematic error. The analysis is performed with long control simulations from ten CGCMs, with lengths ranging between 500 and 3600 years. In most models the variations of sea surface temperature (SST) averaged over the North Atlantic show considerable power on multidecadal time scales, but with different periodicity. The SST variations are largest in the mid-latitude region, consistent with the short instrumental record. Despite large differences in model configurations, we find considerable consistency among the models in terms of processes. In eight of the ten models the mid-latitude SST variations are significantly correlated with fluctuations in the Atlantic meridional overturning circulation (AMOC), suggesting a link to changes in northward heat transport. Consistent with this link, the three models with the weakest AMOC have the largest cold SST bias in the North Atlantic. There is no linear relationship on decadal timescales between the AMOC and the North Atlantic Oscillation in the models. Analysis of the key elements of the D93 mechanism revealed the following: most models present strong evidence that high-latitude winter mixing precedes AMOC changes, although the regions of wintertime convection differ among models. In most models salinity-induced density anomalies in the convective region tend to lead the AMOC, while temperature-induced density anomalies lead the AMOC in only one model. However, the analysis shows that salinity may play an overly important role in most models because of cold temperature biases in their relevant convective regions. In most models subpolar gyre variations tend to lead AMOC changes, and this relation is strong in more than half of the models.
Abstract:
Objective. The goal of this study is to characterize the current workforce of CIHs and the lengths of the professional practice careers of past and current CIHs. Methods. This is a secondary analysis of data compiled from nearly 50 annual roster listings of the American Board of Industrial Hygiene (ABIH) for Certified Industrial Hygienists active in each year since 1960. Survival analysis was performed to measure the primary outcome of interest, using the Kaplan-Meier method to estimate the survival function. Study subjects: The population studied is all Certified Industrial Hygienists (CIHs). A CIH is defined by the ABIH as an individual who has achieved the minimum requirements for education and working experience and, through examination, has demonstrated a minimum level of knowledge and competency in the prevention of occupational illnesses. Results. A Cox proportional hazards model analysis was performed by start-time cohort of CIHs, with cohort 1 as the reference. The estimated relative risks of the event (defined as retirement, or absence from five consecutive years of listings) for cohorts 2, 3, 4, and 5 relative to cohort 1 were 0.385, 0.214, 0.234, and 0.299, respectively. The results show that cohort 2 (CIHs certified from 1970-1980) had the lowest hazard ratio, indicating the lowest retirement rate. Conclusion. The number of CIHs still actively practicing up to the end of 2009 increased tremendously starting in 1980 and plateaued in recent decades. This indicates that the supply of and demand for the profession may have reached equilibrium. More demographic information and variables are needed to predict the future number of CIHs needed.
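The Kaplan-Meier method used in this study can be sketched in a few lines. The data below are invented career lengths, not the ABIH roster: each subject contributes an observed time in years and an event flag, where 1 means retirement was observed and 0 means the record was censored (e.g. still listed at the end of follow-up).

```python
# Minimal Kaplan-Meier estimator (product-limit form): at each distinct event
# time t, the survival curve is multiplied by (1 - deaths_at_t / n_at_risk).

def kaplan_meier(times, events):
    """Return a list of (t, S(t)) pairs, one per distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        while i < len(data) and data[i][0] == t:   # group ties at time t
            at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:                                  # censored-only times do
            survival *= 1.0 - deaths / n_at_risk    # not change the curve
            curve.append((t, survival))
        n_at_risk -= at_t                           # remove events and censored
    return curve

# Hypothetical career lengths in years (event = 1 means retirement observed).
times  = [5, 8, 8, 12, 15, 20, 20, 25]
events = [1, 1, 0,  1,  0,  1,  1,  0]
curve = kaplan_meier(times, events)
```

Censored subjects still count in the risk set until their censoring time, which is exactly how "absence from five consecutive years of listings" versus "still listed in 2009" would be encoded for the CIH rosters.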
Abstract:
Innovations in the current interconnected world of organizations have led to a focus on business models as a fundamental statement of direction and identity. Although industry transformations generally emanate from technological changes, recent examples suggest they may also be driven by the introduction of new business models. In the past, different types of airline business models could be clearly separated from each other. However, this has changed in recent years, partly due to the concentration process and partly due to reactions to competitive pressure. It can at least be concluded that in the future the distinction between different business models will become less clear. To advance the use of business models as a concept, it is essential to be able to compare them and perform analyses to identify the business models with the highest potential. This can contribute substantially to understanding the synergies and incompatibilities in the case of two airlines undergoing a merger, as illustrated here by the Swiss Air-Lufthansa merger analysis. The idea is to develop quantitative methods and tools for comparing and analyzing aeronautical/airline business models. The paper identifies available methods of comparing airline business models and lays the groundwork for a quantitative model for comparing them. This can be a useful tool for business model analysis when two airlines merge.
Abstract:
Purpose: To evaluate the possible associations between corneal biomechanical parameters, optic disc morphology, and retinal nerve fiber layer (RNFL) thickness in healthy white Spanish children. Methods: This cross-sectional study included 100 myopic children and 99 emmetropic children as a control group, ranging in age from 6 to 17 years. The Ocular Response Analyzer was used to measure corneal hysteresis (CH) and the corneal resistance factor. The optic disc morphology and RNFL thickness were assessed using posterior segment optical coherence tomography (Cirrus HD-OCT). The axial length was measured using an IOLMaster, whereas the central corneal thickness was measured by anterior segment optical coherence tomography (Visante OCT). Results: The mean (±SD) age and spherical equivalent were 12.11 (±2.76) years and −3.32 (±2.32) diopters for the myopic group and 11.88 (±2.97) years and +0.34 (±0.41) diopters for the emmetropic group. In a multivariable mixed-model analysis in myopic children, the average RNFL thickness and rim area correlated positively with CH (p = 0.007 and p = 0.001, respectively), whereas the average cup-to-disc area ratio correlated negatively with CH (p = 0.01). We did not observe a correlation between RNFL thickness and axial length (p = 0.05). The corneal resistance factor was positively correlated only with the rim area (p = 0.001). The central corneal thickness did not correlate with the optic nerve parameters or with RNFL thickness. These associations were not found in the emmetropic group (p > 0.05 for all). Conclusions: The corneal biomechanics characterized with the Ocular Response Analyzer system are correlated with the optic disc profile and RNFL thickness in myopic children. Low CH values may indicate a reduction in the viscous damping properties of the cornea and the sclera, especially in myopic children.
Abstract:
There has been an increased demand for characterizing user access patterns using web mining techniques, since the informative knowledge extracted from web server log files can offer benefits not only for web site structure improvement but also for a better understanding of user navigational behavior. In this paper, we present a web usage mining method which utilizes web usage and page linkage information to capture user access patterns based on the Probabilistic Latent Semantic Analysis (PLSA) model. A specific probabilistic model analysis algorithm, the EM algorithm, is applied to the integrated usage data to infer the latent semantic factors and to generate user session clusters that reveal user access patterns. Experiments have been conducted on a real-world data set to validate the effectiveness of the proposed approach. The results show that the presented method is capable of characterizing the latent semantic factors and generating user profiles in terms of weighted page vectors, which may reflect the common access interests exhibited by users within the same session cluster.
Abstract:
Component-based development (CBD) has become an important emerging topic in the software engineering field. It promises long-sought-after benefits such as increased software reuse, reduced development time to market and, hence, reduced software production cost. Despite this huge potential, the lack of reasoning support and development environments for component modeling and verification may hinder its adoption. Methods and tools that can support component model analysis are highly valued by industry. Such tool support should be fully automated as well as efficient. At the same time, the reasoning tool should scale well, as it may need to handle the hundreds or even thousands of components that a modern software system may have. Furthermore, a distributed environment that can effectively manage and compose components is also desirable. In this paper, we present an approach to the modeling and verification of a newly proposed component model using Semantic Web languages and their reasoning tools. We use the Web Ontology Language and the Semantic Web Rule Language to precisely capture the inter-relationships and constraints among the entities in a component model. Semantic Web reasoning tools are deployed to perform automated analysis of the component models. Moreover, we also propose a service-oriented architecture (SOA)-based Semantic Web environment for CBD. The adoption of Semantic Web services and SOA makes our component environment more reusable, scalable, dynamic and adaptive.
Abstract:
This study examines the effect of blood absorption on the endogenous fluorescence signal intensity of biological tissues. Experimental studies were conducted to identify these effects. To register the fluorescence intensity, the fluorescence spectroscopy method was employed. The intensity of the blood flow was measured by laser Doppler flowmetry. We propose one possible implementation of the Monte Carlo method for the theoretical analysis of the effect of blood on the fluorescence signals. The simulation is constructed as a four-layer skin optical model based on the known optical parameters of the skin with different levels of blood supply. With the help of the simulation, we demonstrate how the level of blood supply can affect the appearance of the fluorescence spectra. In addition, to describe the properties of biological tissue that may affect the fluorescence spectra, we turned to the method of diffuse reflectance spectroscopy (DRS). Using the spectral data provided by the DRS, the tissue attenuation effect can be extracted and used to correct the fluorescence spectra.
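The core idea, that higher blood supply raises absorption in the dermal layers and damps the detected signal, can be sketched with a drastically simplified Monte Carlo model. This is not the authors' simulator: it ignores scattering and fluorophore distributions, and every layer thickness and absorption coefficient below is an invented placeholder, not a measured skin optical property.

```python
import math
import random

# Photon packets traverse a four-layer skin stack; each layer absorbs a packet
# with the Beer-Lambert probability 1 - exp(-mu_a * thickness). Blood mainly
# raises absorption (mu_a) in the two dermal layers.

def detected_fraction(blood_fraction, n_photons=20000, seed=1):
    """Fraction of photon packets surviving the full stack (illustrative)."""
    rng = random.Random(seed)
    layers = [                                  # (thickness mm, mu_a per mm)
        (0.02, 0.20),                           # stratum corneum
        (0.10, 0.30),                           # epidermis
        (0.20, 0.20 + 5.0 * blood_fraction),    # papillary dermis
        (0.80, 0.10 + 3.0 * blood_fraction),    # reticular dermis
    ]
    survived = 0
    for _ in range(n_photons):
        alive = True
        for thickness, mu_a in layers:
            # Packet is absorbed in this layer with probability 1 - e^(-mu_a*t)
            if rng.random() > math.exp(-mu_a * thickness):
                alive = False
                break
        if alive:
            survived += 1
    return survived / n_photons
```

Raising `blood_fraction` lowers the returned fraction, mirroring the reported damping of fluorescence intensity by blood absorption; a realistic simulation would add scattering, anisotropy, and wavelength dependence per layer.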
Abstract:
This research examined the factors contributing to the performance of online grocers prior to, and following, the 2000 dot-com collapse. The primary goals were to assess the relationship between a company’s business model(s) and its performance in the online grocery channel and to determine whether other company- and/or market-related factors could account for company performance. To assess these goals, a case-based theory-building process was utilized. A three-way cross-case analysis comprising Peapod, GroceryWorks, and Tesco examined the common profit components, the structural-category (e.g., pure-play, partnership, and hybrid) profit components, and the idiosyncratic profit components related to each specific company. Based on the analysis, it was determined that online grocery store business models could be represented at three distinct but hierarchically related levels. The first level, termed the core model, represented the basic profit structure that all online grocers needed in order to conduct operations. The next level, termed the structural model, represented the profit structure associated with the specific business model configuration (i.e., pure-play, partnership, hybrid). The last level, termed the augmented model, represented the company’s business model when idiosyncratic profit components were included. Of the five company-related factors, scalability, rate of expansion, and automation level were potential candidates for helping to explain online grocer performance. In addition, all of the market-structure factors were deemed possible candidates for helping to explain online grocer performance. The study concluded by positing an alternative hypothesis concerning the performance of online grocers. Prior to this study, the prevailing wisdom was that business models were the primary cause of online grocer performance. However, based on the core model analysis, it was hypothesized that customer relationship activities (i.e., advertising, promotions, and loyalty program tie-ins) were the real drivers of online grocer performance.
Abstract:
The goal of my Ph.D. thesis is to enhance the visualization of the peripheral retina using wide-field optical coherence tomography (OCT) in a clinical setting.
OCT has gained widespread adoption in clinical ophthalmology due to its ability to visualize diseases of the macula and central retina in three dimensions; however, clinical OCT has a limited field-of-view of 30°. There has been increasing interest in obtaining high-resolution images outside of this narrow field-of-view, because three-dimensional imaging of the peripheral retina may prove to be important in the early detection of neurodegenerative diseases, such as Alzheimer's and dementia, and in the monitoring of known ocular diseases, such as diabetic retinopathy, retinal vein occlusions, and choroidal masses.
Before attempting to build a wide-field OCT system, we need to better understand the peripheral optics of the human eye. Shack-Hartmann wavefront sensors are commonly used tools for measuring the optical imperfections of the eye, but their acquisition speed is limited by their underlying camera hardware. The first aim of my thesis research is to create a fast method of ocular wavefront sensing such that we can measure the wavefront aberrations at numerous points across a wide visual field. In order to address aim one, we will develop a sparse Zernike reconstruction technique (SPARZER) that will enable Shack-Hartmann wavefront sensors to use as little as 1/10th of the data that would normally be required for an accurate wavefront reading. If less data needs to be acquired, then we can increase the speed at which wavefronts can be recorded.
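The sparse-reconstruction idea behind SPARZER can be illustrated with a least-squares sketch: recover wavefront coefficients from roughly a tenth of the slope samples. The measurement matrix below is a generic random stand-in rather than actual Zernike-derivative terms, and all sizes and values are invented for illustration only.

```python
import numpy as np

# Noiseless toy problem: y = A @ c, where c holds the wavefront coefficients
# and each row of A is one Shack-Hartmann slope sample. We then keep only a
# random tenth of the rows and solve the reduced least-squares system.

rng = np.random.default_rng(0)
n_samples, n_coeffs = 400, 10
A_full = rng.standard_normal((n_samples, n_coeffs))  # stand-in sensor geometry
true_c = rng.standard_normal(n_coeffs)               # "true" wavefront coeffs
y_full = A_full @ true_c                             # noiseless measurements

# Subsample ~1/10th of the data, as the text suggests is sufficient.
keep = rng.choice(n_samples, size=n_samples // 10, replace=False)
c_hat, *_ = np.linalg.lstsq(A_full[keep], y_full[keep], rcond=None)
```

With a well-conditioned basis and noiseless data, 40 samples pin down 10 coefficients exactly; the substance of a real method lies in choosing sampling patterns and regularization that keep this stable for true Zernike bases and noisy spot centroids.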
For my second aim, we will create a sophisticated optical model that reproduces the measured aberrations of the human eye. If we know how the average eye's optics distort light, then we can engineer ophthalmic imaging systems that preemptively cancel inherent ocular aberrations. This invention will help the retinal imaging community to design systems that are capable of acquiring high resolution images across a wide visual field. The proposed model eye is also of interest to the field of vision science as it aids in the study of how anatomy affects visual performance in the peripheral retina.
Using the optical model from aim two, we will design and reduce to practice a clinical OCT system that is capable of imaging a large (80°) field-of-view with enhanced visualization of the peripheral retina. A key aspect of this third and final aim is to make the imaging system compatible with standard clinical practices. To this end, we will incorporate sensorless adaptive optics in order to correct the inter- and intra-patient variability in ophthalmic aberrations. Sensorless adaptive optics will improve both the brightness (signal) and clarity (resolution) of features in the peripheral retina without affecting the size of the imaging system.
The proposed work should not only be a noteworthy contribution to the ophthalmic and engineering communities, but it should strengthen our existing collaborations with the Duke Eye Center by advancing their capability to diagnose pathologies of the peripheral retina.
Abstract:
Background: Understanding transcriptional regulation through genome-wide microarray studies can help unravel complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. Existing software systems for microarray data analysis implement these standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays and for the most common synthesized oligo arrays, such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can make use of a compute cluster for distributed services. Conclusion: Our model-driven approach to automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web services is advantageous in a distributed client-server environment, as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extensions towards future transcriptomics methods based on high-throughput sequencing approaches, which have much higher computational requirements than microarrays.
Abstract:
Papillary thyroid carcinomas that may never progress to cause symptoms or death are being diagnosed at an increasing rate. Predicting outcome and determining tumour aggressiveness could help diminish the number of patients submitted to aggressive treatments. We aimed to evaluate, in a retrospective cohort study, whether markers of the immune system response and of tumour-associated inflammation could predict the outcome of differentiated thyroid cancer (DTC) patients. We studied 399 consecutive patients, including 325 papillary and 74 follicular thyroid carcinomas. Immune cell markers were evaluated using immunohistochemistry, including tumour-associated macrophages (CD68) and subsets of tumour-infiltrating lymphocytes (TIL), such as CD3, CD4, CD8, CD16, CD20, CD45RO, GRANZYME B, CD69 and CD25. We also investigated the expression of cyclooxygenase 2 (COX2) in tumour cells and the presence of concurrent lymphocytic infiltration characterizing chronic thyroiditis, which was observed in 29% of the cases. Among all the immunological parameters evaluated, only the enrichment of CD8+ lymphocytes (P = 0·001) and the expression of COX2 (P = 0·01) were associated with recurrence. A multivariate model analysis identified CD8+ TIL/COX2 as an independent risk factor for recurrence. A multivariate analysis using Cox's proportional-hazards model adjusted for the presence of concurrent chronic thyroiditis demonstrated that concurrent chronic thyroiditis had no effect on the prognostic prediction mediated by CD8+ TIL and COX2. In conclusion, we suggest the use of a relatively simple pathology tool to help select cases that may benefit from a more aggressive approach, sparing the majority of patients from unnecessary procedures.
Abstract:
Resource specialisation, although a fundamental component of ecological theory, is employed in disparate ways. Most definitions derive from simple counts of resource species. We build on recent advances in ecophylogenetics and null model analysis to propose a concept of specialisation that comprises affinities among resources as well as their co-occurrence with consumers. In the distance-based specialisation index (DSI), specialisation is measured as relatedness (phylogenetic or otherwise) of resources, scaled by the null expectation of random use of locally available resources. Thus, specialists use significantly clustered sets of resources, whereas generalists use over-dispersed resources. Intermediate species are classed as indiscriminate consumers. The effectiveness of this approach was assessed with differentially restricted null models, applied to a data set of 168 herbivorous insect species and their hosts. Incorporation of plant relatedness and relative abundance greatly improved specialisation measures compared to taxon counts or simpler null models, which overestimate the fraction of specialists, a problem compounded by insufficient sampling effort. This framework disambiguates the concept of specialisation with an explicit measure applicable to any mode of affinity among resource classes, and is also linked to ecological and evolutionary processes. This will enable a more rigorous deployment of ecological specialisation in empirical and theoretical studies.
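The DSI idea described above can be sketched as a null-model z-score: compare the mean pairwise relatedness distance among a consumer's used resources with the distribution obtained by drawing the same number of resources at random from the locally available pool. The distance matrix below is an invented toy; the published index uses phylogenetic distances and weights the null draws by resource abundance.

```python
import random
from itertools import combinations

def mean_pairwise(items, dist):
    """Mean pairwise distance among a set of resources."""
    pairs = list(combinations(items, 2))
    return sum(dist[a][b] for a, b in pairs) / len(pairs)

def dsi_z(used, available, dist, n_null=2000, seed=0):
    """Z-score of observed resource relatedness against random resource use.

    Negative = clustered resources (specialist); positive = over-dispersed
    resources (generalist); near zero = indiscriminate consumer.
    """
    rng = random.Random(seed)
    observed = mean_pairwise(used, dist)
    null = [mean_pairwise(rng.sample(available, len(used)), dist)
            for _ in range(n_null)]
    mu = sum(null) / n_null
    sd = (sum((x - mu) ** 2 for x in null) / n_null) ** 0.5
    return (observed - mu) / sd

# Toy "phylogeny": hosts A and B are close relatives, as are C and D;
# the two pairs are distant from each other.
edges = {("A", "B"): 1, ("A", "C"): 10, ("A", "D"): 10,
         ("B", "C"): 10, ("B", "D"): 10, ("C", "D"): 1}
dist = {name: {} for name in "ABCD"}
for (a, b), v in edges.items():
    dist[a][b] = dist[b][a] = v
```

A herbivore feeding only on the related pair A and B scores clearly negative (specialist), while one feeding on the distant pair A and C scores positive; restricting or weighting `available` is how the differentially restricted null models of the study would enter.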
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física