22 results for Enterprise Resource Planning Systems
at Université de Lausanne, Switzerland
Abstract:
PURPOSE: Late toxicities such as second cancer induction become more important as treatment outcome improves. Often the dose distribution calculated with a commercial treatment planning system (TPS) is used to estimate radiation carcinogenesis for the radiotherapy patient. However, for locations beyond the treatment field borders, the accuracy is not well known. The aim of this study was to perform detailed out-of-field measurements for a typical radiotherapy treatment plan delivered with a CyberKnife and a TomoTherapy machine and to compare the measurements with the predictions of the TPS. MATERIALS AND METHODS: Individually calibrated thermoluminescent dosimeters were used to measure absorbed dose in an anthropomorphic phantom at 184 locations. The measured dose distributions from 6 MV intensity-modulated treatment beams for the CyberKnife and TomoTherapy machines were compared with the dose calculations from the TPS. RESULTS: Both TPS underestimate the dose far from the target volume. Quantitatively, the CyberKnife TPS underestimates the dose at 40 cm from the PTV border by a factor of 60 and the TomoTherapy TPS by a factor of two. If a 50% dose uncertainty is accepted, the CyberKnife TPS can predict doses down to approximately 10 mGy/treatment Gy and the TomoTherapy TPS down to 0.75 mGy/treatment Gy. The CyberKnife TPS can then be used up to 10 cm from the PTV border and the TomoTherapy TPS up to 35 cm. CONCLUSIONS: We determined that the CyberKnife and TomoTherapy TPS substantially underestimate the doses far from the treated volume. It is recommended not to use out-of-field doses from the CyberKnife TPS for applications such as modeling of second cancer induction. The TomoTherapy TPS can be used up to 35 cm from the PTV border (for a 390 cm³ PTV).
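The comparison above reduces to simple ratios; as a minimal, hypothetical sketch (function and variable names are ours, not the study's), the underestimation factor and the 50% acceptance criterion could be expressed as follows.

```python
def underestimation_factor(measured_dose, tps_dose):
    """Ratio of TLD-measured dose to TPS-calculated dose at one point.

    Per the abstract, this ratio reaches about 60 for the CyberKnife TPS
    and about 2 for the TomoTherapy TPS at 40 cm from the PTV border.
    """
    return measured_dose / tps_dose


def tps_prediction_usable(measured_dose, tps_dose, tolerance=0.5):
    """True if the TPS dose agrees with the measurement within the accepted
    relative uncertainty (50% in the study)."""
    return abs(tps_dose - measured_dose) / measured_dose <= tolerance
```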
Abstract:
Sustainable resource use is one of the most important environmental issues of our times. It is closely related to discussions on the 'peaking' of various natural resources serving as energy sources, agricultural nutrients, or metals indispensable in high-technology applications. Although the peaking theory remains controversial, it is commonly recognized that a more sustainable use of resources would alleviate negative environmental impacts related to resource use. In this thesis, sustainable resource use is analysed from a practical standpoint, through several different case studies. Four of these case studies relate to resource metabolism in the Canton of Geneva in Switzerland: the aim was to model the evolution of chosen resource stocks and flows in the coming decades. The studied resources were copper (a bulk metal), phosphorus (a vital agricultural nutrient), and wood (a renewable resource). In addition, the case of lithium (a critical metal) was analysed briefly in a qualitative manner and from an electric mobility perspective. Beyond the Geneva case studies, this thesis includes a case study on the sustainability of space life support systems. Space life support systems are systems whose aim is to provide the crew of a spacecraft with the necessary metabolic consumables over the course of a mission. Sustainability was again analysed from a resource use perspective. In this case study, the functioning of two different types of life support systems, ARES and BIORAT, was evaluated and compared; these systems represent, respectively, physico-chemical and biological life support systems. Space life support systems could in fact be used as a kind of 'laboratory of sustainability', given that they represent closed and relatively simple systems compared to complex and open terrestrial systems such as the Canton of Geneva. The analysis method chosen for the Geneva case studies was dynamic material flow analysis: dynamic material flow models were constructed for the resources copper, phosphorus, and wood. Besides a baseline scenario, various alternative scenarios (notably involving increased recycling) were also examined. In the case of space life support systems, the methodology of material flow analysis was also employed, but as the data available on the dynamic behaviour of the systems was insufficient, only static simulations could be performed. The results of the case studies in the Canton of Geneva show the following: were resource use to follow population growth, resource consumption would be multiplied by nearly 1.2 by 2030 and by 1.5 by 2080. A complete transition to electric mobility would be expected to increase per-capita copper consumption only slightly (+5%), while the lithium demand in cars would increase 350-fold. Phosphorus imports could be decreased by recycling sewage sludge or human urine, for example; however, the health and environmental impacts of these options have yet to be studied. Increasing the wood production in the Canton would not significantly decrease the dependence on wood imports, as the Canton's production represents only 5% of total consumption. In the comparison of the space life support systems ARES and BIORAT, BIORAT outperforms ARES in resource use but not in energy use. However, as the systems are dimensioned very differently, it remains questionable whether they can be compared outright.
In conclusion, the use of dynamic material flow analysis can provide useful information for policy makers and strategic decision-making; however, uncertainty in reference data greatly influences the precision of the results. Space life support systems constitute an extreme case of resource-using systems; nevertheless, it is not clear how their example could be of immediate use to terrestrial systems.
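To make the dynamic material flow approach concrete, the sketch below shows a minimal single-stock model with an inflow, a lifetime-driven outflow, and a recycling scenario; the structure, parameter names, and numbers are illustrative assumptions, not the models built in the thesis.

```python
def simulate_stock(initial_stock, years, inflow_per_year, lifetime_years, recycling_rate=0.0):
    """Minimal dynamic material flow model: a single in-use stock fed by an
    inflow, losing a fixed fraction each year (1/lifetime), with part of the
    outflow recycled back into the inflow. All parameters are illustrative.
    """
    stock = initial_stock
    history = []
    for _ in range(years):
        outflow = stock / lifetime_years        # retirement of old stock
        recycled = recycling_rate * outflow     # secondary material re-entering use
        stock += inflow_per_year + recycled - outflow
        history.append(stock)
    return history


# Hypothetical baseline vs. increased-recycling scenario (placeholder numbers).
baseline = simulate_stock(initial_stock=100.0, years=70,
                          inflow_per_year=5.0, lifetime_years=30, recycling_rate=0.1)
recycling = simulate_stock(initial_stock=100.0, years=70,
                           inflow_per_year=5.0, lifetime_years=30, recycling_rate=0.6)
```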
Abstract:
The quantity of interest for high-energy photon beam therapy recommended by most dosimetric protocols is the absorbed dose to water. Thus, ionization chambers are calibrated in terms of absorbed dose to water, which is the same quantity as that calculated by most treatment planning systems (TPS). However, when measurements are performed in a low-density medium, the presence of the ionization chamber generates a perturbation on the scale of the secondary-particle range. Therefore, the measured quantity is close to the absorbed dose to a volume of water equivalent to the chamber volume. This quantity is not equivalent to the dose calculated by a TPS, which is the absorbed dose to an infinitesimally small volume of water. This phenomenon can lead to an overestimation of the absorbed dose measured with an ionization chamber of up to 40% in extreme cases. In this paper, we propose a method to calculate correction factors based on Monte Carlo simulations. These correction factors are obtained as the ratio of the absorbed dose to water in a low-density medium, $\bar{D}^{\mathrm{low}}_{w,Q,V_1}$, averaged over a scoring volume V₁ for a geometry where V₁ is filled with the low-density medium, to the absorbed dose to water, $\bar{D}^{\mathrm{low}}_{w,Q,V_2}$, averaged over a volume V₂ for a geometry where V₂ is filled with water. In the Monte Carlo simulations, $\bar{D}^{\mathrm{low}}_{w,Q,V_2}$ is obtained by replacing the volume of the ionization chamber by an equivalent volume of water, according to the definition of the absorbed dose to water. The method is validated in two different configurations, which allowed us to study the behavior of this correction factor as a function of depth in the phantom, photon beam energy, phantom density and field size.
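Read this way, the correction factor is the ratio below (our reconstruction of the notation garbled in the source text; the superscript "low" marks the low-density geometry and Q the beam quality):

```latex
k = \frac{\bar{D}^{\mathrm{low}}_{w,Q,V_1}}{\bar{D}^{\mathrm{low}}_{w,Q,V_2}}
```

On this reading, multiplying the chamber reading by k would bring it back toward the point dose to water reported by the TPS.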
Abstract:
The discipline of Enterprise Architecture Management (EAM) deals with the alignment of business and information systems architectures. While EAM has long been regarded as a discipline for IT managers, this book takes a different stance: it explains how top executives can use EAM to leverage their strategic planning and controlling processes and how EAM can contribute to sustainable competitive advantage. Based on the analysis of best practices from eight leading European companies in various industries, the book presents crucial elements of successful EAM. It outlines what executives need to do in terms of governance, processes, methodologies and culture in order to bring their management to the next level. Beyond this, the book points out how EAM might develop in the next decade, allowing today's managers to prepare for the future of architecture management.
Abstract:
1. Wind pollination is thought to have evolved in response to selection for mechanisms that promote pollination success when animal pollinators become scarce or unreliable. We might thus expect wind-pollinated plants to be less prone to pollen limitation than their insect-pollinated counterparts. Yet, if pollen loads on stigmas of wind-pollinated species decline with distance from pollen donors, seed set might nevertheless be pollen-limited in populations of plants that cannot self-fertilize their progeny, but not in self-compatible hermaphroditic populations. 2. Here, we test this hypothesis by comparing pollen limitation between dioecious and hermaphroditic (monoecious) populations of the wind-pollinated herb Mercurialis annua. 3. In natural populations, seed set was pollen-limited in low-density patches of dioecious, but not hermaphroditic, M. annua, a finding consistent with patterns of distance-dependent seed set by females in an experimental array. Nevertheless, seed set was incomplete in both dioecious and hermaphroditic populations, even at high local densities. Further, when we manipulated pollen and resource availability in a common-garden experiment, both factors limited the seed set of females and hermaphrodites. 4. Synthesis. Our results are consistent with the idea that pollen limitation plays a role in the evolution of combined vs. separate sexes in M. annua. Taken together, they point to the potential importance of pollen transfer between flowers on the same plant (geitonogamy) by wind as a mechanism of reproductive assurance, and to the dual roles played by pollen and resource availability in limiting seed set. Thus, seed set can be pollen-limited in sparse populations of a wind-pollinated species where mates are rare or absent, with potentially important demographic and evolutionary implications.
Abstract:
PURPOSE: Virtual planning and guided surgery, with or without prebent or milled plates, are becoming more and more common for mandibular reconstruction with fibular free flaps (FFFs). Although this excellent surgical option is being used more widely, the additional cost of planning and cutting-guide production has to be discussed. In capped payment systems, such additional costs have to be offset by other savings if there are no special provisions for extra funding. Our study was designed to determine whether using virtual planning and guided surgery saved time during surgery and whether this time gain allowed the planning to fund itself. MATERIALS AND METHODS: All consecutive cases of FFF surgery were evaluated during a 2-year period. Institutional data were used to determine the price of 1 minute of operative time. The time for fibula molding, plate adaptation, and insetting was recorded. RESULTS: During the defined period, we performed 20 mandibular reconstructions using FFFs, 9 with virtual planning and guided surgery and 11 freehand. One minute of operative time was calculated to cost US $47.50. Multiplying this number by the time saved, we found that the additional cost of virtual planning was reduced from US $5,098 to US $1,231.50 with a prebent plate and from US $6,980 to US $3,113.50 for a milled plate. CONCLUSIONS: Even in capped health care systems, virtual planning and guided surgery, including prebent or milled plates, are financially viable.
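The cost offset reported above follows from a simple calculation; as a back-of-the-envelope sketch (our own arithmetic, using only the figures quoted in the abstract), the implied operative time saved works out as follows.

```python
COST_PER_MINUTE_USD = 47.50  # price of 1 minute of operative time (per the abstract)


def implied_minutes_saved(planning_cost, residual_cost):
    """Operative minutes whose monetary value accounts for the reported cost reduction."""
    return (planning_cost - residual_cost) / COST_PER_MINUTE_USD


# Figures quoted in the abstract (prebent and milled plates, respectively).
print(implied_minutes_saved(5098.00, 1231.50))  # ~81.4 minutes
print(implied_minutes_saved(6980.00, 3113.50))  # ~81.4 minutes
```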
Abstract:
The MyHits web server (http://myhits.isb-sib.ch) is a new integrated service dedicated to the annotation of protein sequences and to the analysis of their domains and signatures. Guest users can use the system anonymously, with full access to (i) standard bioinformatics programs (e.g. PSI-BLAST, ClustalW, T-Coffee, Jalview); (ii) a large number of protein sequence databases, including standard (Swiss-Prot, TrEMBL) and locally developed databases (splice variants); (iii) databases of protein motifs (Prosite, InterPro); (iv) a precomputed list of matches ('hits') between the sequence and motif databases. All databases are updated on a weekly basis and the hit list is kept up to date incrementally. The MyHits server also includes a new collection of tools to generate graphical representations of pairwise and multiple sequence alignments, including their annotated features. Free registration enables users to upload their own sequences and motifs to private databases. These are then made available through the same web interface and the same set of analytical tools. Registered users can manage their own sequences and annotations using only web tools and can freeze their data in their private database for publication purposes.
Abstract:
Forests are key ecosystems of the Earth and are associated with a large range of functions. Many of these functions are beneficial to humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services are quantified, managed and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed and monitored in an equal and sustainable manner. The aims of the thesis were therefore to test this hypothesis, establish an improved conceptual approach and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data and land structure variables from active sensor data from a small-footprint laser scanning system. The results confirmed the main hypothesis, as relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for sustainable management of natural resources was described. This concept quantifies the services as a continuous function of the landscape, rather than as a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields, and the forest becomes a dependent and function-driven management unit. Continuous field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit. However, monitoring the state of and trends in the sustainability of services requires them to be quantified as a continuous function of the landscape. Sustainable natural resource management iteratively combines the ecosystem and gradient approaches.
Abstract:
Even though architecture principles were first discussed in the 1990s, they are still perceived as an underexplored topic in enterprise architecture (EA) management research. By now, there is increasing consensus about the nature of EA principles, as well as guidelines for their formulation. However, the extant literature remains vague about what can be considered suitable principles for guiding EA design and evolution. In addition, empirical insights regarding their role and usefulness in practice are still lacking. Accordingly, this research seeks to address three questions: (1) What are suitable principles to guide EA design and evolution? (2) What use do EA principles have for practitioners? (3) Which propositions can be derived regarding EA principles' role and application? Opting for an exploratory approach, we apply a research process that combines a critical analysis of current publications with the capture of experts' perceptions. Our research ontologically distinguishes principles from non-principles, proposes a validated set of meta-principles, and clarifies principles' application, role, and usefulness in practice. The resulting insights can be used as guidelines for defining suitable principles and turning them into an effective bridge between strategy and design and a guide in design decisions.
Abstract:
BACKGROUND: The Complete Arabidopsis Transcript MicroArray (CATMA) initiative combines the efforts of laboratories in eight European countries to deliver gene-specific sequence tags (GSTs) for the Arabidopsis research community. The CATMA initiative offers the power and flexibility to regularly update the GST collection according to evolving knowledge about the gene repertoire. These GST amplicons can easily be reamplified and shared, subsets can be picked at will to print dedicated arrays, and the GSTs can be cloned and used for other functional studies. This ongoing initiative has already produced approximately 24,000 GSTs that have been made publicly available for spotted microarray printing and RNA interference. RESULTS: GSTs from the CATMA version 2 repertoire (CATMAv2, created in 2002) were mapped onto the gene models from two independent Arabidopsis nuclear genome annotation efforts, TIGR5 and PSB-EuGène, to consolidate a list of genes that were targeted by previously designed CATMA tags. A total of 9,027 gene models were not tagged by any amplified CATMAv2 GST, and 2,533 amplified GSTs were no longer predicted to tag an updated gene model. To validate the efficacy of the GST mapping criteria and design rules, the predicted and experimentally observed hybridization characteristics associated with GST features were correlated in transcript profiling datasets obtained with the CATMAv2 microarray, confirming the reliability of this platform. To complete the CATMA repertoire, all 9,027 gene models for which no GST had yet been designed were processed with an adjusted version of the Specific Primer and Amplicon Design Software (SPADS). A total of 5,756 novel GSTs were designed and amplified by PCR from genomic DNA. Together with the pre-existing GST collection, this new addition constitutes the CATMAv3 repertoire. It comprises 30,343 unique amplified sequences that tag 24,202 and 23,009 protein-encoding nuclear gene models in the TAIR6 and EuGène genome annotations, respectively. To cover the remaining untagged genes, we identified 543 additional GSTs using less stringent design criteria and designed 990 sequence tags matching multiple members of gene families (Gene Family Tags, or GFTs). These latter 1,533 features constitute the CATMAv4 addition. CONCLUSION: To update the CATMA GST repertoire, we designed 7,289 additional sequence tags, bringing the total number of tagged TAIR6-annotated Arabidopsis nuclear protein-coding genes to 26,173. This resource is used both for the production of spotted microarrays and for the large-scale cloning of hairpin RNA silencing vectors. All information about the updated CATMA repertoire is available through the CATMA database at http://www.catma.org.
Abstract:
Despite the increasing popularity of enterprise architecture management (EAM) in practice, many EAM initiatives either do not fully meet the expected targets or fail. Several frameworks have been suggested as guidelines for EA implementation, but companies seldom follow prescriptive frameworks. Instead, they follow very diverse implementation approaches that depend on their organizational contingencies and on the way they adopt and evolve EAM over time. This research strives for a broader understanding of EAM by exploring context-dependent EAM adoption approaches and by identifying the main EA principles that affect EA effectiveness. Based on two studies, this dissertation addresses two main questions: (1) EAM design: Which approaches do companies follow when adopting EAM? (2) EA principles and their impact: What impact do EA principles have on EA effectiveness/quality? By utilizing both qualitative and quantitative research methods, this research explores different EAM designs under different organizational contingencies and examines EA principles as an effective means of achieving principle-based EAM design. My research can help companies identify a suitable EAM design that fits their organizational settings and shape their EA through a set of principles.
Abstract:
This chapter discusses how industrial ecological systems can help in dealing with environmental issues in developing countries, and it presents three case studies from India that highlight some of the unique environmental issues of the developing world. Industrial ecology explores the assumption that the industrial system can be seen as a certain kind of ecosystem. The scope of industrial ecology goes well beyond waste exchange to the optimization of resources flowing through the economic system. Among the specific aspects of developing countries that have to be borne in mind is the fact that the pattern of resource flows, and hence the resultant environmental threat, can be very different from that in the industrialized West. Typically, the flow of materials through large, organized manufacturing facilities in developing countries can be very small in relation to the overall material flow, as the small, informal 'industry' plays a key role and forms a very significant portion of the economic activity. The case studies of the Tirupur textile industries and the leather industry in India illustrate how redefining the problem from a perspective of resource conservation, and on the basis of resource flow data, can point to totally new directions for strategy planning. The case study of the Damodar Valley region amplifies the importance of looking beyond formal industry to solve an environmental problem. It shows that even for globally critical programs, such as climate change programs in developing countries, it is not enough to estimate the emissions from the formal industrial sectors alone.
Abstract:
The primary mission of the Universal Protein Resource (UniProt) is to support biological research by maintaining a stable, comprehensive, fully classified, richly and accurately annotated protein sequence knowledgebase, with extensive cross-references and querying interfaces freely accessible to the scientific community. UniProt is produced by the UniProt Consortium, which consists of groups from the European Bioinformatics Institute (EBI), the Swiss Institute of Bioinformatics (SIB) and the Protein Information Resource (PIR). UniProt comprises four major components, each optimized for different uses: the UniProt Archive, the UniProt Knowledgebase, the UniProt Reference Clusters and the UniProt Metagenomic and Environmental Sequence Database. UniProt is updated and distributed every 4 weeks and can be accessed online for searches or download at http://www.uniprot.org.
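As a small illustration of programmatic access (a sketch only: the REST URL pattern below reflects UniProt's current interface rather than anything stated in this abstract, and the accession is chosen arbitrarily for the example):

```python
import requests


def fetch_uniprot_fasta(accession: str) -> str:
    """Download one UniProtKB entry in FASTA format via the REST interface.

    The URL pattern is an assumption relative to the abstract, which only
    cites http://www.uniprot.org as the access point.
    """
    url = f"https://rest.uniprot.org/uniprotkb/{accession}.fasta"
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.text


# Example accession (haemoglobin subunit alpha), used purely for illustration.
print(fetch_uniprot_fasta("P69905").splitlines()[0])
```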
Abstract:
ExPASy (http://www.expasy.org) has a worldwide reputation as one of the main bioinformatics resources for proteomics. It has now evolved into an extensible and integrative portal that provides access to many scientific resources, databases and software tools in different areas of the life sciences. Scientists can now seamlessly access a wide range of resources in many different domains, such as proteomics, genomics, phylogeny/evolution, systems biology, population genetics, transcriptomics, etc. The individual resources (databases, web-based and downloadable software tools) are hosted in a 'decentralized' way by different groups of the SIB Swiss Institute of Bioinformatics and partner institutions. Specifically, a single web portal provides a common entry point to a wide range of resources developed and operated by different SIB groups and external institutions. The portal features a search function across 'selected' resources. Additionally, the availability and usage of resources are monitored. The portal is aimed at both expert users and people who are not familiar with a specific domain in the life sciences. The new web interface provides, in particular, visual guidance for newcomers to ExPASy.