Abstract:
Predictability - the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements - is a crucial, highly desirable property of responsive embedded systems. This paper presents an overview of a development methodology for responsive systems which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is the Time-constrained Reactive Automaton (TRA) formalism, which adopts a fundamental notion of space and time that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Using the TRA model, unrealistic systems - possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing - cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems, not to mention eliminating potential hazards that would otherwise have gone unnoticed. The TRA model is presented to system developers through the CLEOPATRA programming language. CLEOPATRA features a C-like imperative syntax for the description of computation, which makes it easy to incorporate into applications already using C. It is event-driven, and thus appropriate for embedded process control applications. It is object-oriented and compositional, thus advocating modularity and reusability. CLEOPATRA is semantically sound; its objects can be transformed, mechanically and unambiguously, into formal TRA automata for verification purposes, which can be pursued using model-checking or theorem-proving techniques. Since 1989, an ancestor of CLEOPATRA has been in use as a specification and simulation language for embedded time-critical robotic processes.
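The TRA primitives are defined formally in the paper; as a loose illustration only, the following Python sketch captures the flavor of a timed reactive object in which every output event is caused by an earlier input event and is delivered within a bounded, non-zero delay. All names, delays, and the scheduling policy here are hypothetical assumptions, not CLEOPATRA semantics.

```python
# Toy timed reactive automaton: outputs are caused by inputs and delivered
# within a bounded, non-zero delay (no clairvoyance, no zero-time causality).
# Purely illustrative; not the TRA/CLEOPATRA formalism itself.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Event:
    time: float
    name: str = field(compare=False)

class TimedReactiveAutomaton:
    def __init__(self):
        self.rules = {}  # input event name -> (output name, min delay, max delay)

    def on(self, trigger, output, lo, hi):
        assert 0 < lo <= hi, "reactions need a bounded, non-zero delay"
        self.rules[trigger] = (output, lo, hi)

    def run(self, stimuli, horizon):
        queue = list(stimuli)  # externally supplied input events
        heapq.heapify(queue)
        trace = []
        while queue:
            ev = heapq.heappop(queue)
            if ev.time > horizon:
                break
            trace.append(ev)
            if ev.name in self.rules:
                out, lo, hi = self.rules[ev.name]
                # schedule the caused event at its earliest admissible time;
                # a verifier would explore every delay in [lo, hi]
                heapq.heappush(queue, Event(ev.time + lo, out))
        return trace

tra = TimedReactiveAutomaton()
tra.on("sensor", "actuate", lo=0.01, hi=0.05)  # react within 10-50 ms
print(tra.run([Event(0.0, "sensor")], horizon=1.0))
```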
Abstract:
For any q > 1, let MOD_q be a quantum gate that determines if the number of 1's in the input is divisible by q. We show that for any q,t > 1, MOD_q is equivalent to MOD_t (up to constant depth). Based on the case q=2, Moore has shown that quantum analogs of AC^(0), ACC[q], and ACC, denoted QAC^(0)_wf, QACC[2], QACC respectively, define the same class of operators, leaving q > 2 as an open question. Our result resolves this question, implying that QAC^(0)_wf = QACC[q] = QACC for all q. We also prove the first upper bounds for QACC in terms of related language classes. We define classes of languages EQACC, NQACC (both for arbitrary complex amplitudes) and BQACC (for rational number amplitudes) and show that they are all contained in TC^(0). To do this, we show that a TC^(0) circuit can keep track of the amplitudes of the state resulting from the application of a QACC operator using a constant width polynomial size tensor sum. In order to accomplish this, we also show that TC^(0) can perform iterated addition and multiplication in certain field extensions.
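As a point of reference for the result above, the MOD_q predicate itself is classically trivial; a minimal sketch follows (the quantum MOD_q gate applies this test to basis states, which the sketch does not model):

```python
# Classical reading of the MOD_q predicate from the abstract: accept an input
# bit string iff its number of 1s is divisible by q.
def mod_q(bits, q):
    return sum(bits) % q == 0

# The paper shows the *gates* MOD_q and MOD_t simulate each other in constant
# depth for any q, t > 1, collapsing QACC[q] = QACC; the classical predicates
# are of course distinct functions.
assert mod_q([1, 0, 1, 1], 3)      # three 1s: divisible by 3
assert not mod_q([1, 1], 3)        # two 1s: not divisible by 3
print("MOD_q predicate checks passed")
```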
Abstract:
This thesis critically investigates the divergent international approaches to the legal regulation of the patentability of computer software inventions, with a view to identifying the reforms necessary for a certain, predictable and uniform inter-jurisdictional system of protection. Through a critical analysis of the traditional and contemporary US and European regulatory frameworks of protection for computer software inventions, this thesis demonstrates the confusion and legal uncertainty resulting from ill-defined patent laws and inconsistent patent practices as to the scope of the “patentable subject matter” requirement, further compounded by substantial flaws in the structural configuration of the decision-making procedures within which the patent systems operate. This damaging combination prevents the operation of an accessible and effective Intellectual Property (IP) legal framework of protection for computer software inventions, capable of securing adequate economic returns for inventors whilst preserving the necessary scope for innovation and competition in the field, to the ultimate benefit of society. In exploring the substantive and structural deficiencies in the European and US regulatory frameworks, this thesis ultimately argues that the best approach to the reform of the legal regulation of software patentability is two-tiered. It demonstrates that any reform to achieve international legal harmony first requires the legislature to individually clarify (Europe) or restate (US) the long-standing inadequate rules governing the scope of software “patentable subject matter”, together with the reorganisation of the unworkable structural configuration of the decision-making procedures. Informed by the critical analysis of the evolution of the “patentable subject matter” requirement for computer software in the US, this thesis particularly considers the potential of the reforms of the European patent system currently underway to bring about certainty, predictability and uniformity in the legal treatment of computer software inventions.
Abstract:
This paper outlines what we have learned about the impacts of the Deepwater Horizon (DWH) oil disaster from the economics discipline as well as what effect the DWH disaster has had on the economics discipline. It appears that what we know about the economic impact of the DWH spill today is limited, possibly because such analysis is tied up in the federal Natural Resource Damage Assessment (NRDA) process and other state-led efforts. There is evidence, however, that the NRDA process has changed over time to de-emphasize economic valuation of damages. There is also evidence that economists may be producing fewer outputs as a result of the DWH relative to scholars from other disciplines because of an apparent absence of funding for it. Of the research that has taken place, this paper provides a summary and highlights the main directions of future research. It appears that the most pressing topic is addressing the incentives and policies in place to promote a culture of safety in the offshore oil industry. Also, it appears that the most prominent, and challenging, direction of future research resulting from the DWH is the expansion of an ecosystems services approach to damage assessment and marine policy.
Abstract:
Capable of three-dimensional imaging of the cornea with micrometer-scale resolution, spectral-domain optical coherence tomography (SDOCT) offers potential advantages over Placido ring and Scheimpflug photography based systems for accurate extraction of quantitative keratometric parameters. In this work, an SDOCT scanning protocol and motion correction algorithm were implemented to minimize the effects of patient motion during data acquisition. Procedures are described for correction of image data artifacts resulting from 3D refraction of SDOCT light in the cornea and from non-idealities of the scanning system geometry, as a prerequisite for accurate parameter extraction. Zernike polynomial 3D reconstruction and a recursive half searching algorithm (RHSA) were implemented to extract clinical keratometric parameters including anterior and posterior radii of curvature, central corneal optical power, central corneal thickness, and thickness maps of the cornea. Accuracy and repeatability of the extracted parameters obtained using a commercial 859 nm SDOCT retinal imaging system with a corneal adapter were assessed using a rigid gas permeable (RGP) contact lens as a phantom target. Extraction of these parameters was performed in vivo in 3 patients and compared to commercial Placido topography and Scheimpflug photography systems. The repeatability of SDOCT central corneal power measured in vivo was 0.18 diopters, and the difference observed between systems averaged 0.1 diopters between SDOCT and Scheimpflug photography, and 0.6 diopters between SDOCT and Placido topography.
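A hedged sketch of one step such parameter extraction presumably involves: converting the extracted anterior and posterior radii of curvature and central thickness into central corneal power via the standard thick-lens (Gullstrand) formula. The refractive indices are textbook values; the paper's actual Zernike/RHSA pipeline is not reproduced here.

```python
# Central corneal power from the two surface radii and central thickness,
# using the standard thick-lens formula. Indices are textbook values.
N_AIR, N_CORNEA, N_AQUEOUS = 1.000, 1.376, 1.336

def central_corneal_power(r_anterior_m, r_posterior_m, thickness_m):
    p1 = (N_CORNEA - N_AIR) / r_anterior_m       # anterior surface power (D)
    p2 = (N_AQUEOUS - N_CORNEA) / r_posterior_m  # posterior surface power (D)
    return p1 + p2 - (thickness_m / N_CORNEA) * p1 * p2

# Typical values: R_ant = 7.8 mm, R_post = 6.5 mm, CCT = 0.55 mm -> ~42 D
print(round(central_corneal_power(7.8e-3, 6.5e-3, 0.55e-3), 2), "D")
```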
Abstract:
Dental microwear researchers consider exogenous grit or dust to be an important cause of microscopic wear on primate teeth. No study to date has examined the accumulation of such abrasives on foods eaten by primates in the forest. This investigation introduces a method to collect dust at various heights in the canopy. Results from dust collection studies conducted at the primate research stations at Ketambe in Indonesia, and Hacienda La Pacifica in Costa Rica indicate that 1) grit collects throughout the canopy in both open country and tropical rain forest environments; and 2) the sizes and concentrations of dust particles accumulated over a fixed period of time differ depending on site location and season of investigation. These results may hold important implications for the interpretation of microwear on primate teeth.
Abstract:
X-ray mammography has been the gold standard for breast imaging for decades, despite the significant limitations posed by two dimensional (2D) image acquisitions. Difficulty in diagnosing lesions close to the chest wall and axilla, a high amount of structural overlap and patient discomfort due to compression are only some of these limitations. To overcome these drawbacks, three dimensional (3D) breast imaging modalities have been developed, including dual-modality single photon emission computed tomography (SPECT) and computed tomography (CT) systems. This thesis focuses on the development and integration of the next generation of such a device for dedicated breast imaging. The goals of this dissertation work are to: [1] understand and characterize any effects of fully 3D trajectories on reconstructed image scatter correction, absorbed dose and Hounsfield Unit accuracy, and [2] design, develop and implement the fully flexible, third generation hybrid SPECT-CT system capable of traversing complex 3D orbits about a pendant breast volume, with neither subsystem interfering with the other. Such a system would overcome artifacts resulting from incompletely sampled divergent cone beam imaging schemes and allow imaging closer to the chest wall, which other systems currently under research and development elsewhere cannot achieve.
The dependence of x-ray scatter radiation on object shape, size, material composition and the CT acquisition trajectory was investigated with a well-established beam stop array (BSA) scatter correction method. While the 2D scatter-to-primary ratio (SPR) was the main metric used to characterize total system scatter, a new metric called ‘normalized scatter contribution’ was developed to compare the results of scatter correction on 3D reconstructed volumes. Scatter estimation studies were undertaken with a sinusoidal saddle (±15° polar tilt) orbit and a traditional circular (AZOR) orbit. Clinical studies to acquire data for scatter correction were used to evaluate the 2D SPR on a small set of patients scanned with the AZOR orbit. Clinical SPR results showed a clear dependence of scatter on breast composition and glandular tissue distribution, but were otherwise consistent with the overall phantom-based size and density measurements. Additionally, SPR also depended on the acquisition trajectory: 2D scatter increased with the polar tilt angle of the system.
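For illustration, the beam-stop-array principle behind the SPR measurements can be sketched as follows: intensity recorded behind an opaque stop is (ideally) scatter only, so the primary signal is recovered by subtraction. The array geometry, the crude constant-fill interpolation, and all numbers below are assumptions for the sketch, not the study's implementation.

```python
# Toy beam-stop-array (BSA) scatter estimate: sample scatter behind the stops,
# fill it over the field, then SPR = scatter / primary with primary = total - scatter.
import numpy as np

def scatter_to_primary_ratio(open_field, bsa_field, stop_mask):
    scatter_samples = np.where(stop_mask, bsa_field, np.nan)  # behind stops only
    scatter = np.full_like(open_field, np.nanmean(scatter_samples))  # crude fill
    primary = open_field - scatter
    return scatter / primary

rng = np.random.default_rng(0)
open_field = 1000 + rng.normal(0, 5, (64, 64))        # no-BSA acquisition
stop_mask = np.zeros((64, 64), bool)
stop_mask[::16, ::16] = True                          # sparse grid of stops
bsa_field = np.where(stop_mask, 150.0, open_field)    # ~15% scatter, toy numbers
print(float(scatter_to_primary_ratio(open_field, bsa_field, stop_mask).mean()))
```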
The dose delivered by any imaging system is of primary importance from the patient’s point of view, and therefore trajectory-related differences in the dose distribution in a target volume were evaluated. Monte Carlo simulations as well as physical measurements using radiochromic film were undertaken using saddle and AZOR orbits. Results illustrated that both orbits deliver comparable dose to the target volume and differ only slightly in distribution within the volume. Simulations and measurements showed similar results, and all measured dose values were within the 6 mGy dose limit specific to standard screening mammography, which is used as a benchmark for dose comparisons.
Hounsfield Units (HU) are used clinically to differentiate tissue types in a reconstructed CT image, and therefore the HU accuracy of a system is very important, especially when using non-traditional trajectories. Uniform phantoms filled with various uniform-density fluids were used to investigate differences in HU accuracy between saddle and AZOR orbits. Results illustrate the considerably better performance of the saddle orbit, especially close to the chest and nipple regions of what would clinically be a pendant breast volume. The AZOR orbit causes shading artifacts near the nipple, due to insufficient sampling, rendering a major portion of the scanned phantom unusable, whereas the saddle orbit performs exceptionally well and provides a tighter distribution of HU values in reconstructed volumes.
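The HU accuracy comparison rests on the standard definition of Hounsfield Units relative to water; a quick reference implementation (the formula is standard, not specific to this work):

```python
# Standard Hounsfield Unit scale: water maps to 0 HU and air to -1000 HU,
# given linear attenuation coefficients mu (in 1/cm).
def hounsfield(mu, mu_water, mu_air=0.0):
    return 1000.0 * (mu - mu_water) / (mu_water - mu_air)

print(hounsfield(0.19, 0.19))  # water ->     0 HU
print(hounsfield(0.00, 0.19))  # air   -> -1000 HU
```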
Finally, the third generation, fully-suspended SPECT-CT system was designed and developed in our lab. A novel mechanical method using a linear motor was developed for tilting the CT system. A new x-ray source and a custom-made 40 × 30 cm² detector were integrated onto this system. The SPECT system was nested in the center of the gantry, orthogonal to the CT source-detector pair. The SPECT system tilts on a goniometer, and the newly developed CT tilting mechanism allows a maximum polar tilt of ±15° of the CT system. The entire gantry is mounted on a rotation stage, allowing complex arbitrary trajectories for each system, without interference from the other, while having a common field of view. This hybrid system shows potential to be used clinically as a diagnostic tool for dedicated breast imaging.
Abstract:
The study of a score by a serious performer is a fundamental step in the process of arriving at a knowledgeable and deeply informed approach to performing a piece of music. In order to obtain this knowledge numerous aspects of the score must be taken into consideration. It is the intent of this dissertation to gather and analyze the information concerning Naturale, a work written by Luciano Berio in 1985 for viola, percussion and recorded voice, based on Sicilian folk songs. All the aspects surrounding Naturale’s existence are taken into consideration in this study. First, it is important to reflect on Berio’s compositional style and traits, the manner in which he relates his works one to another, what he sees in folk music and his own personal desire to intertwine art music and folk music. For Berio, Naturale is not an isolated venture into the realm of mixing folk music and his own avant-garde style; it is instead one of many works resulting from his long-standing relationship with folk music. Another essential aspect in this case is the study of Sicilian folk music itself, and the sources used by Berio to find the songs by which he was inspired. The work is examined section by section with figures showing both excerpts of Naturale as well as the original songs with their translations. An analysis containing harmonic, thematic and formal aspects of the score was developed in order to arrive at a better understanding of the structure and pacing of the piece. For this research the author went to Italy to conduct an interview with Maestro Aldo Bennici, the Sicilian violist for whom Naturale was composed. This interview helped in the discovery of two more songs used by Berio that have not to this point been identified in any other document. Bennici’s outstanding testimony portrayed the expressive character of this music and the evocative imagery behind this score. I hope to bring this knowledge to other performers, so that they may fully understand and appreciate the unique beauty and power of Berio’s Naturale.
Abstract:
Computer based mathematical models describing the aircraft evacuation process have a vital role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria, in cabin crew training and in post-mortem accident investigation. As the risk of personal injury and the costs involved in performing large-scale evacuation experiments for the next generation 'Ultra High Capacity Aircraft' (UHCA) are expected to be high, the development and use of these evacuation modelling tools may become essential if these aircraft are to prove a viable reality. In this paper the capabilities and limitations of the airEXODUS evacuation model are described. Its successful application to the prediction of a recent certification trial, prior to the actual trial taking place, is described. Also described is a newly defined parameter known as OPS, which can be used as a measure of evacuation trial optimality. In addition, sample evacuation simulations in the presence of fire atmospheres are described. Finally, the data requirements of the airEXODUS evacuation model are discussed, along with several projects currently underway at the University of Greenwich designed to obtain this data. Included in this discussion is a description of the AASK (Aircraft Accident Statistics and Knowledge) database, which contains detailed information from aircraft accident survivors.
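As a toy illustration only of the kind of egress calculation such models perform at far greater fidelity, the following sketch queues passengers with random response times at a fixed number of exits; every parameter is invented and bears no relation to airEXODUS's validated sub-models.

```python
# Toy egress model: passengers become ready after a random response time and
# are served greedily by the earliest-free exit at a fixed flow rate.
import heapq, random

def evacuate(n_passengers, exits, flow_rate_per_s, seed=0):
    random.seed(seed)
    exit_free = [0.0] * exits            # time each exit next becomes free
    heapq.heapify(exit_free)
    ready_times = sorted(random.uniform(1, 5) for _ in range(n_passengers))
    last_out = 0.0
    for ready in ready_times:
        free = heapq.heappop(exit_free)  # earliest available exit
        out = max(ready, free) + 1.0 / flow_rate_per_s
        heapq.heappush(exit_free, out)
        last_out = max(last_out, out)
    return last_out                      # total evacuation time (s)

print(round(evacuate(180, exits=4, flow_rate_per_s=1.0), 1), "s")
```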
Abstract:
Computer based mathematical models describing the aircraft evacuation process have a vital role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria and in post-mortem accident investigation. As the risk of personal injury and the costs involved in performing large-scale evacuation experiments for the next generation 'Ultra High Capacity Aircraft' (UHCA) are expected to be high, the development and use of these evacuation modelling tools may become essential if these aircraft are to prove a viable reality. In this paper the capabilities and limitations of the airEXODUS evacuation model are described. Its successful application to the prediction of a recent certification trial, prior to the actual trial taking place, is described. Also described is a newly defined parameter known as OPS, which can be used as a measure of evacuation trial optimality. Finally, the data requirements of aircraft evacuation models are discussed, along with several projects currently underway at the University of Greenwich designed to obtain this data. Included in this discussion is a description of the AASK (Aircraft Accident Statistics and Knowledge) database, which contains detailed information from aircraft accident survivors.
Abstract:
Urban spectacles such as the Olympic Games have long been perceived as capable of bringing desired effects to their host cities. This kind of urban boost may include the creation of new jobs and revenue for the local community, growth in tourism and convention business, improvements to city infrastructure and environment, and the stimulation of broad reform in the social, political and institutional realm. Nevertheless, at the other end of the debate, the potentially detrimental impacts of Olympic urban development, particularly on disadvantaged and vulnerable groups, have also been increasingly noticed in recent years and subsequently cited by a number of high-profile anti-Olympic groups to campaign against Olympic bids and awards. The common areas of concern over Olympic-related projects include cost and debt risks, environmental threats, the occurrence of social imbalance, and the disruption and disturbance of existing community life. Among these issues, the displacement of low-income households and squatter communities resulting from Olympic-inspired urban renewal is comparatively under-explored and has emerged as an imperative area for research inquiry. This is particularly the case where many other problems have become less prominent. Changing a city’s demographic landscape, particularly displacing lower-income people from an area proposed for a profitable development, is a highly contentious matter in its own right. Some see it as a natural and inevitable outgrowth of the process of urban evolution, without which cities cannot move towards a more attractive location for consumption-based business. Others believe it reflects urban crises and conflicts, highlighting market failures, polarization and injustice. Regardless of perception, these phenomena are visible everywhere in post-industrial cities and particularly cannot be ignored when planning for the Olympic Games and other mega-events. The aim of this paper is to start the process of placing the displacement issue in the context of Olympic preparation and to seek a better understanding of their interrelations. In order to develop a better understanding of this issue in terms of cause, process, influential factors and its implications for planning policy, this paper studies the topic from both theoretical and empirical angles. It portrays various situations where the Olympics may trigger or facilitate displacement in host cities during the preparation of the Games, identifies several major variables that may affect the process and the overall outcome, and explores what could be learnt in generic terms for planning Olympic-oriented infrastructure so that ill effects on the local community can be effectively controlled. The paper concludes that the key variables are the selection of development sites, the integration of Olympic facilities with the city’s fabric, the diversity of housing types produced for local residents, and the dynamics of the new socioeconomic structure.
Abstract:
We report evidence that the zooplankton biomass in the tropical Atlantic has declined, with an almost 10-fold drop from the 1950s to 2000. The results of the multiple regression analysis showed that the decline in zooplankton biomass was positively related to the NAO index and to phosphate concentration. We also found that the depth of the thermocline has decreased over the period of our investigation. Thus, the decline we report in zooplankton biomass may be related to the combined effect of two phenomena driven by global temperature increase: (1) the widening of the distributional range of tropical species due to the expansion of the ‘tropical belt’ and (2) a decrease in primary production resulting from the thinning of the thermocline. The decline of zooplankton biomass we report suggests that global warming of the ocean may be altering tropical food webs, and through them, it may also indirectly impact the biogeochemical cycles of tropical oceans.
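The multiple regression described has the general shape sketched below; the data here are synthetic placeholders, not the study's zooplankton, NAO, or phosphate time series.

```python
# Generic multiple regression of a response on two predictors via least
# squares, mirroring the form "biomass ~ NAO + phosphate". Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 50
nao = rng.normal(0, 1, n)
phosphate = rng.normal(0.5, 0.1, n)
biomass = 2.0 + 0.8 * nao + 1.5 * phosphate + rng.normal(0, 0.2, n)

X = np.column_stack([np.ones(n), nao, phosphate])  # intercept + predictors
coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)
print(dict(zip(["intercept", "NAO", "phosphate"], coef.round(2))))
```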
Abstract:
Oceanographic fronts are physical interfaces between water masses that differ in properties such as temperature, salinity, turbidity and chl a enrichment. Bio-physical coupling along fronts can lead to the development of pelagic biodiversity hotspots. A diverse range of marine vertebrates have been shown to associate with fronts, using them as foraging and migration habitats. Elucidation of the ecological significance of fronts generates a better understanding of marine ecosystem functioning, conferring opportunities to improve management of anthropogenic activities in the oceans. This study presents novel insight into the oceanographic drivers of habitat use in a population of marine turtles characterised by an oceanic-neritic foraging dichotomy. Using satellite tracking data from adult female loggerhead turtles nesting at Cape Verde (n = 12), we test the hypothesis that oceanic-foraging loggerheads associate with mesoscale (10s to 100s of km) thermal fronts. We use high-resolution (1 km) composite front mapping to characterise frontal activity in the Canary Current Large Marine Ecosystem (LME) over 2 temporal scales: (1) seasonal front frequency and (2) 7-day front metrics. Our use-availability analysis indicates that oceanic loggerheads show a preference for the highly productive upwelling region between Cape Verde and mainland Africa, an area of intense frontal activity. Within the upwelling region, turtles appear to forage epipelagically around mesoscale thermal fronts, exploiting profitable foraging opportunities resulting from physical aggregation of prey.
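The use-availability logic can be illustrated with a minimal sketch comparing a covariate (here, a stand-in for seasonal front frequency) at used versus available locations; the distributions and the simple selection ratio are assumptions for illustration, not the paper's composite front-mapping analysis.

```python
# Toy use-availability comparison: covariate values at animal locations
# ("used") versus a random background sample ("available").
import numpy as np

rng = np.random.default_rng(2)
front_freq_used = rng.beta(4, 2, 200)    # hypothetical values at turtle fixes
front_freq_avail = rng.beta(2, 4, 1000)  # hypothetical background sample

# A selection ratio > 1 indicates apparent preference for frontal habitat
selection_ratio = front_freq_used.mean() / front_freq_avail.mean()
print(round(selection_ratio, 2))
```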
Abstract:
Macroalgae (seaweeds) are a promising feedstock for the production of third-generation bioethanol, since they have high carbohydrate contents, contain little or no lignin and are available in abundance. However, seaweeds typically contain a more diverse array of monomeric sugars than are commonly present in feedstocks derived from the lignocellulosic material currently used for bioethanol production. Hence, identification of a suitable fermentative microorganism that can utilise the principal sugars released from the hydrolysis of macroalgae remains a major objective. The present study used a phenotypic microarray technique to screen 24 different yeast strains for their ability to metabolise individual monosaccharides commonly found in seaweeds, as well as hydrolysates following an acid pre-treatment of five native UK seaweed species (Laminaria digitata, Fucus serratus, Chondrus crispus, Palmaria palmata and Ulva lactuca). Five strains of yeast (three Saccharomyces spp., one Pichia sp. and one Candida sp.) were selected and subsequently evaluated for bioethanol production during fermentation of the hydrolysates. Four out of the five selected strains converted these monomeric sugars into bioethanol, with the highest ethanol yield (13 g L−1) resulting from a fermentation using C. crispus hydrolysate with Saccharomyces cerevisiae YPS128. This study demonstrated the novel application of a phenotypic microarray technique to screen for yeast capable of metabolising sugars present in seaweed hydrolysates; however, metabolic activity did not always imply fermentative production of ethanol.
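A back-of-envelope check on the reported yield, using the stoichiometric maximum of roughly 0.51 g ethanol per g of hexose sugar; the hydrolysate sugar concentration below is an assumed illustrative value, not a figure from the study.

```python
# Fermentation efficiency relative to the stoichiometric maximum:
# 13 g/L ethanol implies at least ~25 g/L of fermentable hexose sugar.
THEORETICAL_YIELD = 0.51  # g ethanol per g glucose (stoichiometric limit)

def fermentation_efficiency(ethanol_g_per_l, sugar_g_per_l):
    return ethanol_g_per_l / (sugar_g_per_l * THEORETICAL_YIELD)

# Assuming a hypothetical 30 g/L fermentable sugar in the hydrolysate:
print(round(fermentation_efficiency(13.0, 30.0), 2))  # ~0.85 of the maximum
```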