873 results for Functions of complex variables.


Relevance:

100.00%

Publisher:

Abstract:

The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans is still relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, and to help develop standard vocabularies describing the variables and define best practices for archiving ocean acidification data.

Relevance:

100.00%

Publisher:

Abstract:

Trehalose is a non-reducing disaccharide essential for pathogenic fungal survival and virulence. Its biosynthesis requires the trehalose-6-phosphate synthase, Tps1, and the trehalose-6-phosphate phosphatase, Tps2. More importantly, the trehalose biosynthetic pathway is absent in mammals, making it an ideal target for antifungal drug design. However, a lack of germane biochemical and structural information hinders antifungal drug design against these targets.

In this dissertation, macromolecular X-ray crystallography and biochemical assays were employed to understand the structures and functions of proteins involved in the trehalose biosynthetic pathway. I report here the first eukaryotic Tps1 structures, from Candida albicans (C. albicans) and Aspergillus fumigatus (A. fumigatus), with substrates or substrate analogs. These structures reveal the key residues involved in substrate binding and catalysis. Subsequent enzymatic and cellular assays highlight the significance of these key Tps1 residues in enzyme function and fungal stress response. The Tps1 structure captured in its transition state with a non-hydrolysable inhibitor demonstrates that Tps1 adopts an “internal return-like” mechanism for catalysis. Furthermore, disruption of trehalose biosynthetic complex formation through abolishing Tps1 dimerization reveals that complex formation has a regulatory function in addition to trehalose production, providing additional targets for antifungal drug intervention.

I also present here the structure of the Tps2 N-terminal domain (Tps2NTD) from C. albicans, which may be involved in the proper formation of the trehalose biosynthetic complex. Deletion of the Tps2NTD results in a temperature-sensitive phenotype. Further, I describe in this dissertation the structures of the Tps2 phosphatase domain (Tps2PD) from C. albicans, A. fumigatus and Cryptococcus neoformans (C. neoformans) in multiple conformational states. The structures of the C. albicans Tps2PD-BeF3-trehalose complex and the C. neoformans Tps2PD(D24N)-T6P complex reveal extensive interactions between both glucose moieties of trehalose, involving all eight hydroxyl groups, and multiple residues of both the cap and core domains of Tps2PD. These structures also reveal that steric hindrance is a key underlying factor in the exquisite substrate specificity of Tps2PD. In addition, the structures of Tps2PD in the open conformation provide direct visualization of the conformational changes of this domain that are effected by substrate binding and product release.

Last, I present the structure of the C. albicans trehalose synthase regulatory protein (Tps3) pseudo-phosphatase domain (Tps3PPD). Tps3PPD adopts a haloacid dehalogenase superfamily (HADSF) phosphatase fold, with a core Rossmann-fold domain and an α/β-fold cap domain. Despite its lack of phosphatase activity, the cleft between the Tps3PPD core and cap domains presents a binding pocket for an as-yet-uncharacterized ligand. Identification of this ligand could reveal the cellular function of Tps3 and any interconnection of the trehalose biosynthetic pathway with other cellular metabolic pathways.

Combined, these structures and the accompanying biochemical analyses advance our understanding of the proteins responsible for trehalose biosynthesis. They are ready to be exploited to rationally design or optimize inhibitors of the trehalose biosynthetic pathway enzymes. Hence, the work described in this thesis lays the groundwork for the design of Tps1- and Tps2-specific inhibitors, which could ultimately lead to novel therapeutics to treat fungal infections.

Relevance:

100.00%

Publisher:

Abstract:

With the development of information technology, the theory and methodology of complex networks have been introduced into language research, representing the language system as a complex network composed of nodes and edges so that language structure can be analysed quantitatively. The development of dependency grammar provides theoretical support for the construction of treebank corpora, making a statistical analysis of complex networks possible. This paper introduces the theory and methodology of complex networks and builds dependency syntactic networks based on a treebank of speeches from the EEE-4 oral test. By analysing the overall characteristics of the networks, including the number of edges, the number of nodes, the average degree, the average path length, the network centrality and the degree distribution, it aims to find potential differences and similarities in the networks between various grades of speaking performance. Through clustering analysis, this research intends to demonstrate the discriminating power of the network parameters and provide a potential reference for scoring speaking performance.
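
As an illustration of how such global metrics can be computed, here is a minimal sketch using Python and NetworkX on a toy dependency network; the edge list is hypothetical, standing in for the head–dependent arcs extracted from a treebank:

```python
# Toy dependency network: each arc links a governing word (head) to a
# dependent; a real network would aggregate arcs over the whole treebank.
import networkx as nx

edges = [("saw", "I"), ("saw", "dog"), ("dog", "the"),
         ("saw", "yesterday"), ("dog", "big")]
G = nx.Graph(edges)

print("nodes:", G.number_of_nodes())
print("edges:", G.number_of_edges())
print("average degree:", 2 * G.number_of_edges() / G.number_of_nodes())
print("average path length:", nx.average_shortest_path_length(G))
print("degree centrality:", nx.degree_centrality(G))
print("degree distribution:", nx.degree_histogram(G))
```

Computing such metric vectors per speech and then clustering them (e.g. with k-means) mirrors the discriminating analysis described above.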

Relevance:

100.00%

Publisher:

Abstract:

In this thesis I examine a variety of linguistic elements which involve "alternative" semantic values, a class arguably including focus, interrogatives, indefinites, and disjunctions, and the connections between these elements. This study focusses on the analysis of such elements in Sinhala, with comparison to Malayalam, Tlingit, and Japanese. The central part of the study concerns the proper syntactic and semantic analysis of Q[uestion]-particles (including Sinhala "da", Malayalam "-oo", Japanese "ka"), which, in many languages, appear not only in interrogatives but also in the formation of indefinites, disjunctions, and relative clauses. This set of contexts is syntactically heterogeneous, and so syntax does not offer an explanation for the appearance of Q-particles in this particular set of environments. I propose that these contexts can be united in terms of semantics, as all involving some element which denotes a set of "alternatives". Both wh-words and disjunctions can be analysed as creating Hamblin-type sets of "alternatives". Q-particles can be treated as uniformly denoting variables over choice functions which apply to the aforementioned Hamblin-type sets, thus "restoring" the derivation to normal Montagovian semantics. This treatment explains why these particles appear in just this set of contexts: they all include an element with Hamblin-type semantics. However, we also find variation in the use of Q-particles, including, in some languages, the appearance of multiple morphologically distinct Q-particles in different syntactic contexts. Such variation can be handled largely by positing that Q-particles may vary in their formal syntactic feature specifications, which determine the syntactic contexts in which they are licensed. The unified analysis of Q-particles as denoting variables over choice functions also raises various questions about the proper analysis of interrogatives, indefinites, and disjunctions, including issues concerning the nature of the semantics of wh-words and the syntactic structure of disjunction. In addition, I observe that indefinites involving Q-particles have a crosslinguistic tendency to be epistemic indefinites, i.e. indefinites which explicitly signal ignorance of details regarding who or what satisfies the existential claim. I provide an account of such indefinites which draws on the analysis of Q-particles as variables over choice functions. These pragmatic "signals of ignorance" (which I argue to be presuppositions) also have a further role to play in determining the distribution of Q-particles in disjunctions. The final section of this study investigates the historical development of focus constructions and Q-particles in Sinhala. This diachronic study not only allows us to observe the origin and development of such elements, but also serves to delimit the range of possible synchronic analyses, thus providing further insights into the formal syntactic and semantic properties of Q-particles. This study highlights both the importance of considering various components of the grammar (e.g. syntax, semantics, pragmatics, morphology) and the usefulness of philology in developing plausible formal analyses of complex linguistic phenomena such as the crosslinguistic distribution of Q-particles.
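
To make the compositional idea concrete, the choice-functional analysis can be rendered schematically as follows; these formulas are an illustrative reconstruction in standard Hamblin/choice-function notation, not quotations from the thesis:

```latex
% A wh-word denotes a Hamblin set of individual alternatives:
\[ \llbracket \textit{who} \rrbracket \;=\; \{\, x \in D_e : \mathrm{human}(x) \,\} \]

% A choice function maps any non-empty set to one of its members:
\[ \mathrm{CH}(f) \;\iff\; \forall P\,[\, P \neq \emptyset \rightarrow f(P) \in P \,] \]

% A Q-particle (e.g. Sinhala "da") denotes a variable over choice functions;
% applying it to the Hamblin set returns an ordinary individual-type value,
% "restoring" the derivation to Montagovian composition:
\[ \llbracket \textit{who da} \rrbracket^{g} \;=\; g(f)\bigl(\llbracket \textit{who} \rrbracket\bigr) \;\in\; D_e \]
```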

Relevance:

100.00%

Publisher:

Abstract:

Complex functions generally feature some interesting peculiarities when seen as extensions of real functions, complementing the study of real analysis. However, visualizing some properties of complex functions requires the simultaneous visualization of two two-dimensional spaces. The multiple windows of GeoGebra, combined with its capacity for algebraic computation with complex numbers, allow the study of functions defined from ℂ to ℂ both through traditional techniques and through Domain Colouring. Here we show how GeoGebra can be used for the study of complex functions, using several representations and creating tools which complement those already provided by the software. Our proposals, designed for students in the first year of engineering and science courses, can and should be used as an educational tool in collaborative learning environments. Their main advantage for individual use is the promotion of deductive reasoning (conjecture/proof). In the literature review we performed, few references were found involving this educational topic, and fewer still addressing it with a single piece of software.
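
For readers without GeoGebra at hand, the Domain Colouring technique itself can be sketched in a few lines of Python (NumPy and Matplotlib). The colour mapping below, argument to hue and modulus to brightness, is one common convention, not the specific tool set built by the authors:

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import hsv_to_rgb

def domain_colour(f, re=(-2, 2), im=(-2, 2), n=800):
    x = np.linspace(re[0], re[1], n)
    y = np.linspace(im[0], im[1], n)
    z = x[np.newaxis, :] + 1j * y[:, np.newaxis]   # grid over the domain
    w = f(z)
    hue = (np.angle(w) + np.pi) / (2 * np.pi)      # arg(w) -> colour wheel
    val = 1 - 1 / (1 + np.abs(w) ** 0.3)           # |w| -> brightness (poles bright)
    rgb = hsv_to_rgb(np.dstack((hue, np.ones_like(hue), val)))
    plt.imshow(rgb, extent=(re[0], re[1], im[0], im[1]), origin="lower")
    plt.xlabel("Re(z)")
    plt.ylabel("Im(z)")
    plt.show()

domain_colour(lambda z: (z**2 - 1) / (z**2 + 1))   # zeros at ±1, poles at ±i
```

Zeros appear as dark points, poles as bright ones, and the hue cycling around each indicates its order, properties that are hard to read off a single real-valued graph.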

Relevance:

100.00%

Publisher:

Abstract:

Background: The ageing population, with a concomitant increase in chronic conditions, is increasing the presence of older people with complex needs in hospital. People with dementia are one of these complex populations and are particularly vulnerable to complications in hospital. Registered nurses can offer simultaneous assessment and intervention to prevent or mitigate hospital-acquired complications through their skilled brokerage between patient needs and hospital functions. A range of patient outcome measures that are sensitive to nursing care has been tested in nursing work environments across the world. However, none of these measures have focused on hospitalised older patients.

Method: This thesis explores nursing-sensitive complications for older patients with and without dementia using an internationally recognised, risk-adjusted patient outcome approach. Specifically explored are the differences between rates of complications, the costs of complications, and cost comparisons of patient complexity. A retrospective cohort study of an Australian state's 2006–07 public hospital discharge data was used to identify patient episodes for people over age 50 (N=222,440) where dementia was identified as a primary or secondary diagnosis (N=44,422). Extra costs for patient episodes were estimated based on length of stay (LOS) above the average for each patient's Diagnosis Related Group (DRG) (N=157,178) and were modelled using linear regression analysis to establish the strongest patient-complexity predictors of cost.

Results: Hospitalised patients with a primary or secondary diagnosis of dementia had higher rates of complications than their same-age peers. The highest rates and relative risk for people with dementia were found in four key complications: urinary tract infections, pressure injuries, pneumonia and delirium. While 21.9% of dementia patients (9,751/44,488, p<0.0001) suffered a complication, only 8.8% of non-dementia patients did so (33,501/381,788, p<0.0001), giving dementia patients a relative risk of 2.5 of acquiring a complication (p<0.0001). In patients over 50, with or without dementia, these four key complications were associated with an eightfold increase in length of stay (813%, or 3.6 days/0.4 days) and a doubling of the estimated mean episode cost (199%, or A$16,403/A$8,240). They were associated with 24.7% of the estimated cost of additional days spent in hospital in 2006–07 in NSW (A$226 million/A$914 million). Dementia patients accounted for 22.0% of these costs (A$49 million/A$226 million) even though they were only 10.4% of the population (44,488/426,276 episodes). Hospital-acquired complications, particularly for people with a comorbidity of dementia, cost more than other kinds of inpatient complexity, but admission severity was a better predictor of excess cost.

Discussion: The four key complications occur more often in older patients with dementia, and their high rate makes them expensive. These complications are potentially preventable. However, the care that can prevent them (such as mobility, hydration, nutrition and communication) is known to be rationed or left unfinished by nurses. Older hospitalised people with complex needs, such as those with dementia, are more likely to experience care rationing, as their care tends to take longer and to be less predictable and less curative in nature. This thesis offers the theoretical proposition that evidence-based nursing practices are rationed for complex older patients and that this rationed care contributes to functional and cognitive decline during hospitalisation, which in turn contributes to the high rates of complications observed. The four key complications can thus be seen as a ‘Failure to Maintain’ complex older people in hospital. ‘Failure to Maintain’ is the inadequate delivery of essential functional and cognitive care for a complex older person in hospital, resulting in a complication, and is recommended as a useful indicator of hospital quality.

Conclusions: When examining extra length of stay in hospital, complications and comorbid dementia are costly. Complications are potentially preventable, and dementia care in hospitals can be improved. Hospitals and governments looking to decrease costs can engage in risk-reduction strategies for common nursing-sensitive complications, such as healthy nursing work environments that minimise nurses' rationing of functional and cognitive care. Conceptualising complex older patients as ‘business as usual’ rather than as a ‘burden’ is likely necessary for sustainable health care services in the future. The use of ‘Failure to Maintain’ indicators at institution and state levels may help embed this approach to complex older patients in health organisations. Ongoing investigation is warranted into the relationships between the largest health services expense (hospitals), the largest hospital population (complex older patients), and the largest hospital expense (nurses). The ‘Failure to Maintain’ quality indicator makes a useful and substantive contribution to further clinical, administrative and research developments.
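
As a quick arithmetic check, the headline figures quoted above can be reproduced directly from the reported episode counts (a sketch; the numbers are those given in the Results):

```python
# Complication rates and relative risk from the reported episode counts.
dementia_compl, dementia_n = 9_751, 44_488
other_compl, other_n = 33_501, 381_788

rate_d = dementia_compl / dementia_n            # ~0.219 -> 21.9%
rate_o = other_compl / other_n                  # ~0.088 -> 8.8%
print(f"relative risk: {rate_d / rate_o:.1f}")  # ~2.5, as reported

# Share of the estimated extra-stay cost tied to the four key complications.
print(f"cost share: {226 / 914:.1%}")           # A$226m of A$914m -> ~24.7%
```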

Relevance:

100.00%

Publisher:

Abstract:

Background: Complex chronic diseases are a challenge for the current configuration of health services. Case management is a service frequently provided for people with chronic conditions and, despite its effectiveness for many outcomes, such as mortality or readmissions, uncertainty remains about the most effective form of team organization, structures, and the nature of the interventions. Many processes and outcomes of case management for people with complex chronic conditions cannot be addressed with the information provided by electronic clinical records, and registries are frequently used to deal with this weakness. The aim of this study was to generate a registry-based information system on patients receiving case management in order to identify their clinical characteristics, their context of care, events identified during their follow-up, interventions developed by case managers, and services used.

Methods and design: The study was divided into three phases, covering the detection of information needs, the design of the registry, and its implementation in the healthcare system, using literature review and expert consensus methods to select the variables to be included in the registry.

Objective: To describe the essential characteristics of the provision of care to people who receive case management (structure, process and outcomes), with special emphasis on those with complex chronic diseases.

Study population: Patients from any Primary Care District who were beginning to use case management services. This avoided the information bias that may occur when including subjects who have already received the service and whose outcomes and characteristics could not be properly collected.

Results: A total of 102 variables representing the structure, processes and outcomes of case management were selected for inclusion in the registry after the consensus phase. The total sample was composed of 427 patients, of whom 211 (49.4%) were women and 216 (50.6%) were men. The average functional level (Barthel Index) was 36.18 (SD 29.02), cognitive function (Pfeiffer) showed an average of 4.37 (SD 6.57), the Charlson Comorbidity Index obtained a mean of 3.03 (SD 2.7) and social support (Duke Index) was 34.2 (SD 17.57). More than half of the patients included in the registry correspond to immobilised patients or to transitional care after hospital discharge (66.5%). The patients' educational level was low or very low (50.4%). Caregiver overstrain (caregiver stress index) obtained an average value of 6.09 (SD 3.53). Only 1.2% of patients had declared their advance directives, 58.6% had not defined tutelage, and the vast majority (98.8%) lived at home. Regarding the major events recorded in the RANGE Registry, 25.8% of the selected patients died in the first three months; 8.2% were admitted to hospital once, 2.3% twice, and 1.2% three times; 7.5% suffered a fall; 8.7% had a pressure ulcer; 4.7% had problems with medication; and 3.3% were institutionalised. Stroke was the most prevalent health problem recorded (25.1%), followed by hypertension (11.1%) and COPD (11.1%). Patients registered by NCMs had as main processes diabetes (16.8%) and dementia (11.3%). The most frequent nursing diagnoses referred to self-care deficits in various activities of daily living. Regarding nursing interventions, as described by the Nursing Interventions Classification (NIC), dementia management was the most used intervention, followed by mutual goal setting, caregiver support and emotional support.

Conclusions: The profile of patients who receive case management services is that of a complex chronic patient with severe dependence, cognitive impairment, normal social support, a low educational level, health problems such as stroke, hypertension, COPD, diabetes or dementia, and an informal caregiver. At the first follow-up, mortality was 19.2%, with discrete rates of readmissions and falls.

Relevance:

100.00%

Publisher:

Abstract:

The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of global warming on the built environment, building simulation techniques must often be used together with forecast weather data. Since all building simulation programs require hourly meteorological input data for their thermal comfort and energy evaluations, the provision of suitable weather data becomes critical. Based on a review of existing weather data generation models, this paper presents an effective method to generate approximate future hourly weather data suitable for studying the impact of global warming. Depending on the level of information available for the prediction of future weather conditions, it is shown that the method of retaining the current level, the constant offset method or the diurnal modelling method may be used to generate the future hourly variation of an individual weather parameter. An example of the application of this method to different global warming scenarios in Australia is presented. Since there is no reliable projection of possible changes in air humidity, solar radiation or wind characteristics, as a first approximation these parameters have been assumed to remain at their current levels. A sensitivity test of their impact on building energy performance shows that there is generally a good linear relationship between building cooling load and changes in solar radiation, relative humidity or wind speed.
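
A minimal sketch of the two simpler strategies named above, the constant offset method and a diurnal modelling variant, applied to an hourly dry-bulb temperature series; the synthetic series and the parameter values are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(8760)
# Synthetic "current" hourly temperature: seasonal + diurnal cycles + noise.
current = (18 + 8 * np.sin(2 * np.pi * hours / 8760)
              + 4 * np.sin(2 * np.pi * hours / 24)
              + rng.normal(0, 1, hours.size))

def constant_offset(series, offset):
    """Shift every hour by the projected mean warming."""
    return series + offset

def diurnal_model(series, mean_offset, amplitude_scale):
    """Offset each day's mean and rescale the diurnal swing around it."""
    days = series.reshape(-1, 24)                # 365 days x 24 hours
    mean = days.mean(axis=1, keepdims=True)
    return (mean + mean_offset + amplitude_scale * (days - mean)).ravel()

future_a = constant_offset(current, 2.0)         # uniform +2 °C scenario
future_b = diurnal_model(current, 2.0, 1.1)      # +2 °C with a 10% wider swing
```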

Relevance:

100.00%

Publisher:

Abstract:

Previous work by Professor John Frazer on Evolutionary Architecture provides a basis for the development of a system evolving architectural envelopes in a generic and abstract manner. Recent research by the authors has focused on the implementation of a virtual environment for the automatic generation and exploration of complex forms and architectural envelopes, based on solid modelling techniques and the integration of evolutionary algorithms with enhanced computational and mathematical models. Abstract data types are introduced as genotypes in a genetic algorithm in order to develop complex models using generative and evolutionary computing techniques. Multi-objective optimisation techniques are employed to define the fitness function in the evaluation process.
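
A compact sketch of the generative loop such a system rests on: a plain genetic algorithm over an abstract genotype, with a weighted-sum score standing in for the multi-objective fitness (the genotype encoding and both objectives are placeholders, not the authors' models):

```python
import random

GENES = 16  # placeholder genotype: 16 real-valued envelope parameters

def random_genotype():
    return [random.uniform(0, 1) for _ in range(GENES)]

def fitness(g, weights=(0.5, 0.5)):
    # Two illustrative, competing objectives folded into one score.
    enclosed = sum(g) / GENES                                  # reward "volume"
    roughness = sum(abs(a - b) for a, b in zip(g, g[1:])) / (GENES - 1)
    return weights[0] * enclosed - weights[1] * roughness

def evolve(pop_size=50, generations=200, mut_rate=0.05):
    pop = [random_genotype() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                         # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENES)                   # one-point crossover
            child = [random.uniform(0, 1) if random.random() < mut_rate else v
                     for v in a[:cut] + b[cut:]]               # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

A genuinely multi-objective setup would replace the weighted sum with Pareto ranking (e.g. NSGA-II), which is closer to the optimisation techniques the abstract refers to.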

Relevance:

100.00%

Publisher:

Abstract:

In the study of complex neurobiological movement systems, measurement indeterminacy has typically been overcome by imposing artificial modelling constraints to reduce the number of unknowns (e.g., reducing all muscle, bone and ligament forces crossing a joint to a single vector). However, this approach prevents human movement scientists from investigating more fully the role, functionality and ubiquity of coordinative structures or functional motor synergies. Advancements in measurement methods and analysis techniques are required if the contribution of individual component parts or degrees of freedom of these task-specific structural units is to be established, thereby effectively solving the indeterminacy problem by reducing the number of unknowns. A further benefit of establishing more of the unknowns is that human movement scientists will be able to gain greater insight into ubiquitous processes of physical self-organising that underpin the formation of coordinative structures and the confluence of organismic, environmental and task constraints that determine the exact morphology of these special-purpose devices.

Relevance:

100.00%

Publisher:

Abstract:

Local climate is a critical element in the design of energy-efficient buildings. In this paper, ten years of historical weather data for Australia's eight capital cities were profiled and analysed to characterise the variations of climatic variables in Australia. The method of descriptive statistics was employed, and either the pattern of the cumulative distribution and/or the profile of the percentage distribution is presented. It was found that although weather variables vary between locations, there is often a good, nearly linear relation between a weather variable and its cumulative percentage over the majority of the middle part of the cumulative curve. By comparing the slopes of these distribution profiles, it may be possible to determine the relative range of variation of a particular weather variable for a given city. The implications of these distribution profiles of key weather variables for energy-efficient building design are also discussed.
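
The profiling step described here can be sketched as follows; the synthetic temperature series and the 10th–90th percentile window taken as the "middle part" are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
temps = rng.normal(20, 6, 8760)          # stand-in for one year of hourly data

sorted_t = np.sort(temps)
cum_pct = 100 * np.arange(1, temps.size + 1) / temps.size

# Fit a line to the near-linear middle part of the cumulative curve.
mid = (cum_pct >= 10) & (cum_pct <= 90)
slope, _ = np.polyfit(cum_pct[mid], sorted_t[mid], 1)
print(f"middle-section slope: {slope:.3f} degrees per cumulative %")
```

Comparing this slope between cities then indicates the relative range of variation of the variable, as proposed above.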

Relevance:

100.00%

Publisher:

Abstract:

This paper argues for a model of complex system design for sustainable architecture within a framework of entropy evolution. The spectrum of sustainable architecture comprises the efficient use of energy and material resources over the life-cycle of buildings, the active involvement of occupants in micro-climate control within buildings, and the natural environmental context. The interactions of these parameters compose a complex system of sustainable architectural design, for which conventional linear and fragmented design technologies are insufficient to indicate holistic, ongoing environmental performance. The complexity theory of dissipative structures provides a microscopic formulation of open-system evolution, which in turn provides a system design framework for the evolution of building environmental performance towards an optimisation of sustainability in architecture.

Relevance:

100.00%

Publisher:

Abstract:

With the increasing complexity of modern-day threats and the growing sophistication of interlinked and interdependent operating environments, Business Continuity Management (BCM) has emerged as a new discipline, offering a strategic approach to safeguarding organisational functions. Of significant interest is the application of BCM frameworks and strategies within critical infrastructure, and in particular the aviation industry. Given the increased focus on security and safety for critical infrastructures, research into the adoption of BCM principles within an airport environment provides valuable management outcomes and opens a previously neglected area of inquiry. This research used a single case study methodology to identify possible impediments to BCM adoption and implementation by the Brisbane Airport Corporation (BAC). It identified a number of misalignments in the required breadth of focus for a BCM program, identified differing views on the specific roles and responsibilities required during a major disruptive event, and illustrated the complexities of Brisbane Airport that impede the understanding and implementation of effective business continuity management strategies.

Relevance:

100.00%

Publisher:

Abstract:

Nitrous oxide (N2O) is primarily produced by the microbially mediated nitrification and denitrification processes in soils. It is influenced by a suite of climate (i.e. temperature and rainfall) and soil (physical and chemical) variables, by interacting soil and plant nitrogen (N) transformations (which either compete for or supply substrates) and by land management practices. It is not surprising that N2O emissions are highly variable both spatially and temporally. Computer simulation models, which can integrate all of these variables, are required for the complex task of providing quantitative determinations of N2O emissions. Numerous simulation models have been developed to predict N2O production. Each model has its own philosophy in constructing simulation components, as well as its own performance strengths. The models range from those that attempt to simulate all soil processes comprehensively to more empirical approaches requiring minimal input data. These N2O simulation models can be classified into three categories: laboratory, field and regional/global levels. Process-based field-scale N2O simulation models, which simulate whole agroecosystems and can be used to develop N2O mitigation measures, are the most widely used. The current challenge is how to scale up the relatively more robust field-scale models to catchment, regional and national scales. This paper reviews the development history, main construction components, strengths, limitations and applications of the N2O emission models published in the literature. All three scale levels are considered, and current knowledge gaps and challenges in modelling N2O emissions from soils are discussed.

Relevance:

100.00%

Publisher:

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions that come with each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process and indicate how well they statistically approximate that process. We also present the theory behind dual-state count models and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (for observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
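
A small simulation in the spirit of the experiment described above; the site count, exposure and risk distribution are hypothetical, chosen only to show that heterogeneous, low-exposure Poisson trials yield "excess" zeros relative to a single Poisson fit without any dual-state process:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites, passages = 5_000, 50          # low exposure: 50 trials per site-period
p = np.clip(rng.gamma(shape=0.5, scale=0.04, size=n_sites), 0, 1)
crashes = rng.binomial(passages, p)    # Poisson trials: unequal crash probabilities

lam = crashes.mean()                   # single-Poisson fit (method of moments/MLE)
expected_zeros = n_sites * np.exp(-lam)
observed_zeros = int((crashes == 0).sum())
print(f"observed zeros: {observed_zeros}, Poisson-expected: {expected_zeros:.0f}")
# Roughly 2,900 observed vs 1,800 expected: the apparent zero inflation comes
# from heterogeneity and low exposure, not from a "perfectly safe" state.
```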