991 results for Armington Assumption


Relevance: 10.00%

Abstract:

We prove that the first complex homology of the Johnson subgroup of the Torelli group Tg is a non-trivial, unipotent Tg-module for all g ≥ 4 and give an explicit presentation of it as a Sym H1(Tg, C)-module when g ≥ 6. We do this by proving that, for a finitely generated group G satisfying an assumption close to formality, the triviality of the restricted characteristic variety implies that the first homology of its Johnson kernel is a nilpotent module over the corresponding Laurent polynomial ring, isomorphic to the infinitesimal Alexander invariant of the associated graded Lie algebra of G. In this setup, we also obtain a precise nilpotence test. © European Mathematical Society 2014.
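For orientation, the abstract leaves the underlying objects implicit; the standard definitions (our notation, a sketch only) are that the Torelli group is the subgroup of the mapping class group acting trivially on homology, and the Johnson subgroup (also called the Johnson kernel) is the kernel of the Johnson homomorphism:

    % standard definitions, with H = H_1(\Sigma_g; \mathbb{Z})
    \[
      T_g = \ker\bigl(\mathrm{Mod}(\Sigma_g) \to \mathrm{Sp}(2g,\mathbb{Z})\bigr),
      \qquad
      K_g = \ker\bigl(\tau\colon T_g \to \textstyle\bigwedge^{3} H / H\bigr).
    \]

The paper's object of study is then H1(Kg, C), viewed as a module over Sym H1(Tg, C).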

Relevance: 10.00%

Abstract:

Histopathology is the clinical standard for tissue diagnosis. However, it has several limitations: it requires tissue processing, which can take 30 minutes or more, and it requires a highly trained pathologist to diagnose the tissue. Additionally, the diagnosis is qualitative, and this lack of quantitation can make the diagnosis observer-dependent. Taken together, these factors make it difficult to diagnose tissue at the point of care using histopathology.

Several clinical situations could benefit from more rapid and automated histological processing, which could reduce the time and the number of steps required between obtaining a fresh tissue specimen and rendering a diagnosis. For example, there is a need for rapid detection of residual cancer on the surface of tumor resection specimens during excisional surgeries, known as intraoperative tumor margin assessment. Additionally, rapid assessment of biopsy specimens at the point of care could enable clinicians to confirm that a suspicious lesion has been successfully sampled, thus preventing an unnecessary repeat biopsy procedure. Rapid, low-cost histological processing could also be useful in settings lacking the human resources and equipment necessary to perform standard histologic assessment. Lastly, automated interpretation of tissue samples could reduce inter-observer error, particularly in the diagnosis of borderline lesions.

To address these needs, high-quality microscopic images of the tissue must be obtained in rapid timeframes for a pathologic assessment to be useful in guiding the intervention. Optical microscopy is a powerful technique for obtaining high-resolution images of tissue morphology in real time at the point of care, without the need for tissue processing. In particular, a number of groups have combined fluorescence microscopy with vital fluorescent stains to visualize micro-anatomical features of thick (i.e. unsectioned or unprocessed) tissue. However, robust methods for segmentation and quantitative analysis of heterogeneous images are essential to enable automated diagnosis. Thus, the goal of this work was to obtain high-resolution images of tissue morphology using fluorescence microscopy and vital fluorescent stains, and to develop a quantitative strategy to segment and quantify tissue features in heterogeneous images, such as nuclei and the surrounding stroma, thereby enabling automated diagnosis of thick tissues.

To achieve these goals, three specific aims were proposed. The first aim was to develop an image processing method that can differentiate nuclei from background tissue heterogeneity and enable automated diagnosis of thick tissue at the point of care. A computational technique called sparse component analysis (SCA) was adapted to isolate features of interest, such as nuclei, from the background. SCA has been used previously in the image processing community for image compression, enhancement, and restoration, but had not previously been applied to separate distinct tissue types in a heterogeneous image. In combination with a high resolution fluorescence microendoscope (HRME) and the contrast agent acriflavine, the utility of this technique was demonstrated through imaging preclinical sarcoma tumor margins. Acriflavine localizes to the nuclei of cells, where it reversibly associates with RNA and DNA; it also shows some affinity for collagen and muscle. SCA was adapted to isolate acriflavine-positive features (APFs), which correspond to RNA and DNA, from background tissue heterogeneity. The circle transform (CT) was applied to the SCA output to quantify the size and density of overlapping APFs. The sensitivity of the SCA+CT approach to variations in APF size, density, and background heterogeneity was demonstrated through simulations. Specifically, SCA+CT achieved the lowest errors for higher contrast ratios and larger APF sizes. When applied to tissue images of excised sarcoma margins, SCA+CT correctly isolated APFs and showed consistently increased APF density in tumor and tumor + muscle images compared to images containing muscle alone. Next, variables were quantified from images of resected primary sarcomas and used to optimize a multivariate model. The sensitivity and specificity for differentiating positive from negative ex vivo resected tumor margins were 82% and 75%, respectively. The utility of this approach was further tested by imaging the in vivo tumor cavities of 34 mice after sarcoma resection, with local recurrence as a benchmark. When applied prospectively to images from the tumor cavity, the sensitivity and specificity for detecting local recurrence were 78% and 82%, respectively. The results indicate that SCA+CT can accurately delineate APFs in heterogeneous tissue, which is essential to enable automated and rapid surveillance of tissue pathology.
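The abstract does not specify the circle transform itself; a common analogue for quantifying the size and density of overlapping, roughly circular features is the circular Hough transform. A minimal Python sketch under that substitution (function and parameter names are ours, not the dissertation's):

    # Hypothetical sketch: estimate APF radii and density from a binary
    # segmentation mask via the circular Hough transform (an analogue of the
    # circle transform named above, not its actual implementation).
    import cv2
    import numpy as np

    def apf_size_density(binary_mask, min_r=3, max_r=15):
        img = binary_mask.astype(np.uint8) * 255
        circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1,
                                   minDist=2 * min_r, param1=50, param2=10,
                                   minRadius=min_r, maxRadius=max_r)
        if circles is None:
            return np.array([]), 0.0
        radii = circles[0, :, 2]                     # one radius per detected APF
        return radii, radii.size / binary_mask.size  # detections per pixel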

Two primary challenges were identified in the work in aim 1. First, while SCA can be used to isolate features, such as APFs, from heterogeneous images, its performance is limited by the contrast between APFs and the background. Second, while it is feasible to create mosaics by scanning a sarcoma tumor bed in a mouse, which is on the order of 3-7 mm in any one dimension, it is not feasible to evaluate an entire human surgical margin this way. Thus, improvements were made to the microscopic imaging system to (1) improve image contrast by rejecting out-of-focus background fluorescence and (2) increase the field of view (FOV) while maintaining the sub-cellular resolution needed for delineation of nuclei. To address these challenges, a technique called structured illumination microscopy (SIM) was employed, in which the entire FOV is illuminated with a defined spatial pattern rather than by scanning a focal spot, as in confocal microscopy.

Thus, the second aim was to improve image contrast and increase the FOV by employing wide-field, non-contact structured illumination microscopy, and to optimize the segmentation algorithm for the new imaging modality. Both image contrast and FOV were increased through the development of a wide-field fluorescence SIM system. Clear improvement in image contrast was seen in structured illumination images compared to uniform illumination images. Additionally, the FOV is over 13X larger than that of the fluorescence microendoscope used in aim 1. Initial segmentation results of SIM images revealed that SCA was unable to segment large numbers of APFs in the tumor images. Because the FOV of the SIM system is over 13X larger than that of the fluorescence microendoscope, dense collections of APFs commonly seen in tumor images could no longer be sparsely represented, and the fundamental sparsity assumption of SCA was no longer met. Thus, an algorithm called maximally stable extremal regions (MSER) was investigated as an alternative approach for APF segmentation in SIM images. MSER was able to accurately segment large numbers of APFs in SIM images of tumor tissue. In addition to optimizing MSER for SIM image segmentation, the frequency of the illumination pattern used in SIM was carefully selected, because the image signal-to-noise ratio (SNR) depends on the grid frequency. A grid frequency of 31.7 mm⁻¹ led to the highest SNR and the lowest percent error in MSER segmentation.
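MSER detectors are available in standard imaging libraries; a minimal sketch of MSER-based segmentation of bright nuclear features, assuming an 8-bit grayscale SIM image (parameter values are illustrative, not the ones optimized in this work):

    # Segment bright APFs with OpenCV's MSER detector and build a binary mask.
    import cv2
    import numpy as np

    def segment_apfs(gray_image, delta=5, min_area=30, max_area=2000):
        mser = cv2.MSER_create(delta, min_area, max_area)
        regions, _ = mser.detectRegions(gray_image)  # one pixel list per region
        mask = np.zeros(gray_image.shape, dtype=np.uint8)
        for pts in regions:                          # pts columns are (x, y)
            mask[pts[:, 1], pts[:, 0]] = 255
        return regions, mask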

Once MSER was optimized for SIM image segmentation and the optimal grid frequency was selected, a quantitative model was developed to diagnose mouse sarcoma tumor margins imaged ex vivo with SIM. Tumor margins were stained with acridine orange (AO) in aim 2 because AO was found to stain the sarcoma tissue more brightly than acriflavine. Both acriflavine and AO are intravital dyes that have been shown to stain nuclei, skeletal muscle, and collagenous stroma. A tissue-type classification model was developed to differentiate localized regions (75x75 µm) of tumor from skeletal muscle and adipose tissue based on the MSER segmentation output. Specifically, a logistic regression model was used to classify each localized region, yielding an output in terms of the probability (0-100%) that tumor was located within each 75x75 µm region. Model performance was tested using a receiver operating characteristic (ROC) curve analysis, which revealed 77% sensitivity and 81% specificity. For margin classification, the whole margin image was divided into localized regions and the tissue-type classification model was applied. In a subset of 6 margins (3 negative, 3 positive), it was shown that with a tumor probability threshold of 50%, 8% of all regions from negative margins exceeded this threshold, while over 17% of all regions exceeded it in the positive margins. Thus, 8% of regions in negative margins were considered false positives. These false-positive regions are likely due to the high density of APFs present in normal tissues, which clearly demonstrates a challenge in implementing this automatic algorithm based on AO staining alone.
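The classification step maps directly onto standard tooling; a hedged sketch with scikit-learn, where the feature names and placeholder data are ours:

    # Tissue-type classification per 75x75 um region: logistic regression on
    # MSER-derived features, evaluated with an ROC analysis.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    X = rng.random((200, 3))         # e.g. APF density, mean size, shape metric
    y = (X[:, 0] > 0.5).astype(int)  # 1 = tumor in region (placeholder labels)

    model = LogisticRegression().fit(X, y)
    p_tumor = model.predict_proba(X)[:, 1]   # per-region tumor probability
    fpr, tpr, _ = roc_curve(y, p_tumor)
    print("AUC:", roc_auc_score(y, p_tumor))

    # Margin-level call: fraction of regions whose tumor probability
    # exceeds the 50% threshold described above.
    frac_flagged = float(np.mean(p_tumor > 0.5))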

Thus, the third aim was to improve the specificity of the diagnostic model by leveraging other sources of contrast. Modifications were made to the SIM system to enable fluorescence imaging at a variety of wavelengths. Specifically, the SIM system was modified to enable imaging of red fluorescent protein (RFP) expressing sarcomas, which were used to delineate the location of tumor cells within each image. Initial analysis of AO-stained panels confirmed that there was room for improvement in tumor detection, particularly with regard to false-positive regions that were negative for RFP. One approach for improving the specificity of the diagnostic model was to investigate a fluorophore more specific to tumor. Specifically, tetracycline was selected because it appeared to specifically stain freshly excised tumor tissue in a matter of minutes, and was non-toxic and stable in solution. Results indicated that tetracycline staining shows promise for increasing the specificity of tumor detection in SIM images of a preclinical sarcoma model, and further investigation is warranted.

In conclusion, this work presents the development of a combination of tools that is capable of automated segmentation and quantification of micro-anatomical images of thick tissue. When compared to the fluorescence microendoscope, wide-field multispectral fluorescence SIM imaging provided improved image contrast, a larger FOV with comparable resolution, and the ability to image a variety of fluorophores. MSER was an appropriate and rapid approach to segment dense collections of APFs from wide-field SIM images. Variables that reflect the morphology of the tissue, such as the density, size, and shape of nuclei and nucleoli, can be used to automatically diagnose SIM images. The clinical utility of SIM imaging and MSER segmentation to detect microscopic residual disease has been demonstrated by imaging excised preclinical sarcoma margins. Ultimately, this work demonstrates that fluorescence imaging of tissue micro-anatomy combined with a specialized algorithm for delineation and quantification of features is a means for rapid, non-destructive and automated detection of microscopic disease, which could improve cancer management in a variety of clinical scenarios.

Relevance: 10.00%

Abstract:

This study sought to understand the phenomenon of faculty involvement in indirect cost under-recovery. The focus of the study was on public research university STEM (science, technology, engineering and mathematics) faculty and their perspectives on, and behavior towards, a higher education fiscal policy. The explanatory scheme was derived from anthropological theory and incorporated organizational culture, faculty socialization, and political bargaining models in the conceptual framework. This study drew on two key assumptions. The first was that faculty understanding of, and behavior toward, indirect cost recovery represents values, beliefs, and choices drawn from the distinct professional socialization and distinct culture of faculty. The second was that when faculty and institutional administrators are in conflict over indirect cost recovery, the resultant formal administrative decision comes about through political bargaining over critical resources. The research design was a single-site, qualitative case study focused on learning the meaning of the phenomenon as understood by the informants. The informants were tenured and tenure-track research university faculty in the STEM fields who were highly successful at obtaining Federal sponsored research funds, with individual sponsored research portfolios of at least one million dollars. The data consisted of 11 informant interviews, bolstered by documentary evidence. The findings indicated that faculty socialization and organizational culture were the most dominant themes, while political bargaining emerged as significantly less prominent. Public research university STEM faculty are most concerned about the survival of their research programs and the discovery facilitated by those programs, and they resort to conjecture when confronted by the issue of indirect cost recovery. The findings suggest that institutional administrators should place less emphasis on compliance and hierarchy when working with expert professionals such as science faculty; a more effective focus might be on communication and clarity in budget processes and organizational decision-making, and on critical administrative support that can relieve faculty administrative burdens. For higher education researchers, the findings suggest a need for more sophisticated models of organizations dependent on expert professionals.

Relevance: 10.00%

Abstract:

In Belgium, gender parity has been compulsory for all party lists (in local, regional, federal and European elections) for several years. As a result, the proportion of women has risen from a quarter to a third of the deputies. Yet strict parity is still far from realised. This article seeks to establish what causes this glass ceiling, namely the parties' reluctance to place female candidates in the top positions or even as the front-runner. In a proportional representation system with half-open lists, and especially when the constituencies are small, this automatically leads to a smaller proportion of women among the elected deputies. One important reason for the parties' reluctance to rank female candidates higher is their assumption that women are less effective as "election locomotives" than men. However, the analysis of the Belgian election results makes clear that this is not the case: female candidates in top positions are as successful as their male counterparts. © (2008) Swiss Political Science Review.

Relevance: 10.00%

Abstract:

This short conference paper serves as a distillation of a keynote address delivered at the Second National Conference on Management and Higher Education Trends & Strategies for Management & Administration, hosted by Bangkok-based Stamford International University (Thailand) on November 1, 2014. Innovation is discussed as the heart of the entrepreneurial processes occurring in today's capitalist economic systems, including transition economies like China and Vietnam, and as underpinning the economic competitiveness of firms and economies. But the innovation effort and process also face the dilemma of the "entrepreneurial curse of innovation". Advantages and disadvantages are weighed for a more balanced view, especially in a context where SMEs predominate and pitfalls and traps exist along the innovation path of development. Toward the end, the value of the market is once again stressed, amid the concern that innovators' subjective assumptions and illusions about the availability of market opportunities may contrast totally with the dismal outcomes that actual market realities reveal ex post.

Relevance: 10.00%

Abstract:

This paper is part of a collaborative project undertaken by the three leading universities of Brussels, VUB, ULB and USL-B, supported by Innoviris. The project, called Media Clusters Brussels (MCB), started in October 2014 with the goal of analyzing the development of a Media Park around the two public broadcasters at the Reyers site in Brussels, host to a media cluster in the capital city. In the last decade, not only policymakers but also many authors have recognized that the media industry is characterized, from a geographical point of view, by heavy concentration in a limited number of large cities, where media clusters have emerged (Karlsson & Picard, 2011). The common assumption about media clusters is that locating inside a regional agglomeration of related actors brings advantages for the firms involved. In particular, the interrelations and interactions between the actors on a social level matter for the shape and efficiency of the agglomerations (Picard, 2008). However, even though the importance of the actors and their interrelations has been a common assumption, many authors focus solely on the macro-economic aspects of the clusters. Within this paper, we propose a socio-economic analysis of media clusters to inform development decisions and thus bring the social (human) factor back into scope. To that end, this article focuses on the development of a novel framework, the so-called 7P framework, with a multilevel and interdisciplinary approach that includes three aspects identified as emerging success factors of media clusters: partnerships, (media) professionals and positive spillovers.

Relevance: 10.00%

Abstract:

It is widely accepted that volumetric contraction and solidification during the polymerization process of restorative composites, in combination with bonding to the hard tissue, result in stress transfer and inward deformation of the cavity walls of the restored tooth. Deformation of the walls decreases the size of the cavity during the filling process. This fact has a profound influence on the assumption, raised and discussed in this paper, that an incremental filling technique reduces the stress effect of composite shrinkage on the tooth. Developing stress fields for different incremental filling techniques are simulated in a numerical analysis. The analysis shows that, in a restoration with a well-established bond to the tooth, as is generally desired, incremental filling techniques increase the deformation of the restored tooth. The increase is caused by the incremental deformation of the preparation, which effectively decreases the total amount of composite needed to fill the cavity. This leads to a more highly stressed tooth-composite structure. The study also shows that intercuspal distance measurements, as well as simplifications based on generalization of the shrinkage stress state, are not sufficient to characterize the effect of polymerization shrinkage in a tooth-restoration complex. Incremental filling methods may need to be retained for reasons such as densification, adaptation, thoroughness of cure, and bond formation. However, it is very difficult to prove that incrementalization needs to be retained because of the abatement of shrinkage effects.

Relevance: 10.00%

Abstract:

The key problems in discussing stochastic monotonicity and duality for continuous-time Markov chains are to give criteria for existence and uniqueness and to construct the associated monotone processes in terms of their infinitesimal q-matrices. In their recent paper, Chen and Zhang [6] discussed these problems under the condition that the given q-matrix Q is conservative. The aim of this paper is to generalize their results to a more general case, i.e., when the given q-matrix Q is not necessarily conservative. New problems arise in removing the conservative assumption. The existence and uniqueness criteria for this general case are given in this paper. Another important problem, the construction of all stochastically monotone Q-processes, is also considered.
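For reference, the conservativity condition being removed has a standard statement (ours, not quoted from the paper):

    % A q-matrix Q = (q_{ij}) on a countable state space E satisfies
    % q_{ij} >= 0 for i != j and q_i := -q_{ii} < \infty; it is
    % conservative when every row sums to zero:
    \[
      \sum_{j \neq i} q_{ij} = q_i
      \qquad\Longleftrightarrow\qquad
      \sum_{j \in E} q_{ij} = 0 \quad \text{for all } i \in E .
    \]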

Relevance: 10.00%

Abstract:

The key problems in discussing duality and monotonicity for continuous-time Markov chains are to find conditions for existence and uniqueness and then to construct corresponding processes in terms of infinitesimal characteristics, i.e., q-matrices. Such problems are solved in this paper under the assumption that the given q-matrix is conservative. Some general properties of stochastically monotone Q-processes (Q is not necessarily conservative) are also discussed.
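Stochastic monotonicity itself has a standard formulation (again ours, stated for the state space E = {0, 1, 2, ...}): the transition function P(t) = (pij(t)) is stochastically monotone when its tail sums are non-decreasing in the starting state,

    \[
      i \le i' \;\Longrightarrow\;
      \sum_{j \ge k} p_{ij}(t) \le \sum_{j \ge k} p_{i'j}(t)
      \qquad \text{for all } k \in E,\ t \ge 0 .
    \]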

Relevance: 10.00%

Abstract:

This work proceeds from the assumption that a European environmental information and communication system (EEICS) is already established. In the context of primary users (land-use planners, conservationists, and environmental researchers), we ask what use may be made of the EEICS for building models and tools that are useful in constructing decision support systems for the land-use planner. The complex task facing the next generation of environmental and forest modellers is described, and a range of relevant modelling approaches is reviewed. These include visualization and GIS; statistical tabulation and database SQL; and MDA and OLAP methods. The major problem of non-comparability of the definitions and measures of forest area and timber volume is introduced, and the possibility of a model-based solution is considered. The possibility of using an ambitious and challenging biogeochemical modelling approach to understanding and managing European forests sustainably is discussed. It is emphasised that all modern methodological disciplines must be brought to bear, and that a heuristic hybrid modelling approach should be used, so as to ensure that the benefits of practical empirical modelling approaches are utilised in addition to scientifically well-founded and holistic ecosystem and environmental modelling. The data and information system required is likely to end up as a grid-based framework because of the heavy use of computationally intensive model-based facilities.

Relevance: 10.00%

Abstract:

Electromagnetic levitation of electrically conductive droplets by alternating magnetic fields is a technique used to measure physical properties of liquid metallic alloys such as surface tension or viscosity. Experiments can be conducted under terrestrial conditions or in microgravity, to reduce electromagnetic stirring and shaping of the droplet. Under such conditions, the time-dependent behaviour of a point on the free surface is recorded, and the signal is then analysed by treating the droplet as a damped harmonic oscillator. We use a spectral code, for fluid flow and free surface descriptions, to check the validity of this assumption in two cases: first, when the motion inside the droplet is generated by its initial distortion only, and second, when the droplet is located in a uniform magnetic field originating far from the droplet. It is found that some deviations exist, which can lead to an overestimate of the viscosity.
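The damped-oscillator assumption tested here is the standard route from oscillation signals to material properties: for the fundamental (l = 2) surface mode of a free, non-rotating droplet of radius R, density ρ, surface tension σ and dynamic viscosity μ, Rayleigh's frequency and Lamb's damping law give (textbook relations, not taken from this paper):

    \[
      s(t) \simeq A\, e^{-\Gamma t} \cos(\omega t + \varphi),
      \qquad
      \omega^{2} = \frac{8\sigma}{\rho R^{3}},
      \qquad
      \Gamma = \frac{5\mu}{\rho R^{2}} .
    \]

The surface tension follows from the measured frequency and the viscosity from the measured damping, so any deviation from pure damped-oscillator behaviour biases Γ and hence the inferred viscosity, as the paper reports.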

Relevance: 10.00%

Abstract:

Melting of metallic samples in a cold crucible causes inclusions to concentrate on the surface owing to the action of the electromagnetic force in the skin layer. This process is dynamic, involving the melting stage, then quasi-stationary particle separation, and finally solidification in the cold crucible. The proposed modeling technique is based on a pseudospectral solution method for the coupled turbulent fluid flow, thermal, and electromagnetic fields within the time-varying fluid volume contained by the free surface and, partially, the solid crucible wall. The model uses two methods for particle tracking: (1) direct Lagrangian particle-path computation and (2) a drifting concentration model. Lagrangian tracking is implemented for arbitrary unsteady flow. A specific numerical time-integration scheme is implemented using implicit advancement, which permits relatively large time steps in the Lagrangian model. The drifting concentration model is based on a local equilibrium drift velocity assumption. Both methods are compared and shown to give qualitatively similar results for stationary flow situations. The particular results presented are obtained for iron alloys. Small particles, of the order of 1 μm, are shown to be less prone to separation by electromagnetic field action. In contrast, larger particles, 10 to 100 μm, are easily "trapped" by the electromagnetic field and stay on the sample surface at predetermined locations depending on their size and properties. The model allows optimization of melting power, geometry, and solidification rate.
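As an illustration of the implicit advancement idea, a minimal sketch (all symbols, values and function names are our assumptions, not the paper's model): treating the stiff Stokes-drag term implicitly keeps the velocity update stable for time steps far larger than the particle response time tau_p.

    # Implicit-Euler step for a particle obeying
    #   dv/dt = (u_f - v)/tau_p + f_em/m   (Stokes drag + EM force per unit mass),
    # discretized as  v_{n+1} = (v_n + dt*(u_f/tau_p + f_em/m)) / (1 + dt/tau_p),
    # which is unconditionally stable in the drag term.
    import numpy as np

    def step_particle(v, u_f, f_em_per_mass, tau_p, dt):
        return (v + dt * (u_f / tau_p + f_em_per_mass)) / (1.0 + dt / tau_p)

    # Usage: a ~1 um particle (tau_p ~ 1e-6 s) advanced with dt = 1e-3 s.
    v = np.zeros(3)
    for _ in range(100):
        v = step_particle(v, np.array([0.1, 0.0, 0.0]), np.zeros(3),
                          tau_p=1e-6, dt=1e-3)
    print(v)  # relaxes to the fluid velocity, as expected

Note that for dt much larger than tau_p the update relaxes to v ≈ u_f + tau_p·f_em/m, which is precisely the local equilibrium drift velocity that the drifting concentration model assumes.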

Relevance: 10.00%

Abstract:

The present recession has prompted scholarly and journalistic questioning of the contributions of the cultural industries to the economy. The talent-rich metropolitan clusters of London and New York are well-placed to ride out a thoroughgoing shakeup of the media markets if they manage their infrastructure, space and resources strategically, as Richard Florida has recently argued. This seems to be the assumption behind the recent Digital Britain interim report, and Gordon Brown's remarks that a digital revolution "lies at the heart" of Britain's economic recovery and that broadband and the media industry can play a leading role in pulling the UK out of the recession. Focusing on the Digital Britain report and consultation documents, this presentation seeks to unpack some of the fundamental assumptions behind this link between digital infrastructure, creativity and profitability. In particular, the implicit notion of an engaged audience of users, generating "content" as well as shaping new media platforms, calls into question long-held theoretical constructions of the mass audience of consumers as spectators; instead, the audience emerges as a potential economic powerhouse, an underused resource for tomorrow's cultural industries.

Relevance: 10.00%

Abstract:

The present recession has prompted scholarly and journalistic questioning of the contributions of the cultural industries to the economy. The talent-rich metropolitan clusters of London and New York are well-placed to ride out a thoroughgoing shakeup of the media markets if they manage their infrastructure, space and resources strategically, as Richard Florida has recently argued. This seems to be the assumption behind the recent Digital Britain interim report, and Gordon Brown's remarks that a digital revolution "lies at the heart" of Britain's economic recovery and that broadband and the media industry can play a leading role in pulling the UK out of the recession. Focusing on the Digital Britain report and consultation documents, this presentation seeks to unpack some of the fundamental assumptions behind this link between digital infrastructure, creativity and profitability. In particular, the implicit notion of an engaged audience of users, generating "content" as well as shaping new media platforms, calls into question long-held theoretical constructions of the mass audience of consumers as spectators. [From the Author]

Relevance: 10.00%

Abstract:

The present recession has prompted scholarly and journalistic questioning of the contributions of the cultural industries to the economy. The talent-rich metropolitan clusters of London and New York are well-placed to ride out a thoroughgoing shakeup of the media markets if they manage their infrastructure, space and resources strategically, as Richard Florida has recently argued. This seems to be the assumption behind the recent Digital Britain interim report, and the prime minister's remarks that a digital revolution "lies at the heart" of Britain's economic recovery and that broadband and the media industry can play a leading role in pulling the UK out of the recession. Focusing on the Digital Britain interim report, this presentation seeks to clarify some of the fundamental assumptions behind this link between digital infrastructure, creativity and profitability. [From the Author]