894 results for Averaging operators


Relevance: 10.00%

Abstract:

This work addresses fundamental issues in the mathematical modelling of the diffusive motion of particles in biological and physiological settings. New mathematical results are proved and implemented in computer models for the colonisation of the embryonic gut by neural cells and the propagation of electrical waves in the heart, offering new insights into the relationships between structure and function. In particular, the thesis focuses on the use of non-local differential operators of non-integer order to capture the main features of diffusion processes occurring in complex spatial structures characterised by high levels of heterogeneity.
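
The abstract names the operator class but not a specific equation; as a hedged illustration, a representative space-fractional diffusion model of the kind described replaces the classical Laplacian with its fractional power:

```latex
\frac{\partial u}{\partial t} = -D\,(-\Delta)^{\alpha/2}\,u,
\qquad 1 < \alpha \le 2,
```

which recovers classical diffusion at \alpha = 2, while \alpha < 2 gives the non-local, non-integer-order behaviour suited to highly heterogeneous spatial structures.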

Relevance: 10.00%

Abstract:

The field of prognostics has attracted significant interest from the research community in recent years. Prognostics enables the prediction of failures in machines, offering plant operators benefits such as shorter downtimes, higher operational reliability, reduced operation and maintenance costs, and more effective maintenance and logistics planning. Prognostic systems have been successfully deployed for the monitoring of relatively simple rotating machines. However, machines and associated systems today are increasingly complex, so there is an urgent need to develop prognostic techniques for such complex systems operating in the real world. This review paper focuses on prognostic techniques that can be applied to rotating machinery operating under non-linear and non-stationary conditions. The general concept of these techniques, the pros and cons of applying them, and their applications in the research field are discussed. Finally, the opportunities and challenges in implementing prognostic systems and developing effective techniques for monitoring machines operating under non-stationary and non-linear conditions are also discussed.

Relevance: 10.00%

Abstract:

Intermittent generation from wind farms leads to fluctuating power system operating conditions, pushing the stability margin to its limits. The traditional way of determining the worst-case generation dispatch for a system with several semi-scheduled wind generators yields a conservative solution. This paper proposes a fast estimation of the transient stability margin (TSM) that incorporates the uncertainty of wind generation. First, the Kalman filter (KF) is used to provide a linear estimate of the system angle, and then the unscented transformation (UT) is used to estimate the distribution of the TSM. The proposed method is compared with the traditional Monte Carlo (MC) method, and its effectiveness is verified on a Single Machine Infinite Bus (SMIB) system and the IEEE 14-generator Australian dynamic system. This method will aid grid operators in performing fast online calculations to estimate the TSM distribution of a power system with high levels of intermittent wind generation.
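
A minimal sketch of the unscented-transformation step, assuming the TSM can be evaluated as some nonlinear function of the wind-generation vector; `tsm_of_wind` below is a hypothetical stand-in for the paper's KF-based margin computation, not its actual model:

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear map f and
    return the mean and covariance of f(x) from 2n+1 sigma points."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)           # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])  # (2n+1, n) sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))     # mean weights
    wc = wm.copy()                                     # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    y = np.array([f(s) for s in sigma])                # propagated points
    y_mean = wm @ y
    y_cov = (wc * (y - y_mean).T) @ (y - y_mean)
    return y_mean, y_cov

# Hypothetical stand-in for the KF-based margin evaluation.
def tsm_of_wind(p_wind):
    return np.atleast_1d(1.0 - 0.3 * np.tanh(p_wind.sum()))

m = np.array([0.6, 0.8])        # forecast wind outputs (p.u., illustrative)
P = np.diag([0.05, 0.08])       # forecast uncertainty (illustrative)
tsm_mean, tsm_cov = unscented_transform(m, P, tsm_of_wind)
```

Propagating 2n+1 sigma points through the margin function approximates the TSM mean and spread at a fraction of the cost of a full Monte Carlo sweep, which is the efficiency argument the abstract makes.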

Relevance: 10.00%

Abstract:

Fluctuations in transit ridership over the year have long concerned transport planners, operators, and researchers. Meteorological factors have predominantly been used to explain variability in ridership volume. However, the outcome of this research points in a new direction for explaining ridership fluctuation in Brisbane. It explored the relationship between daily bus ridership, seasonality, and weather variables for a one-year period, 2012. Rather than segregating the entire year's ridership into the four calendar seasons (summer, autumn, spring, and winter), this analysis distributed the yearly ridership into nine complex-seasonality blocks. These represent the calendar seasons, school/university (academic) periods and their corresponding holidays, as well as other observed holidays such as Christmas. The dominance of complex seasonality over the typical calendar seasons was established analytically and using Multiple Linear Regression (MLR). This research identified a very strong association between complex seasonality and bus ridership. Furthermore, the expectation that Brisbane's subtropical summer is unfavourable to transit usage was not supported by the findings of this study: no association with precipitation or temperature was observed in this region. Finally, this research developed a ridership estimation model capable of predicting daily ridership within a very limited error range. Following the application of this model, the estimated annual time series of each suburb was analysed using the Fourier transform to assess whether any cyclical effects remained relative to the original data.
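
A sketch of the approach under stated assumptions: daily ridership regressed on dummy variables for the seasonality blocks, followed by a Fourier check on the residuals. The block names and numbers below are illustrative, not the paper's nine actual blocks:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
# Illustrative daily data for one year: a complex-seasonality block label
# per day and a synthetic ridership level for that block.
blocks = rng.choice(["school_term", "school_holiday", "uni_exams", "christmas"],
                    size=365)
base = {"school_term": 90_000, "school_holiday": 60_000,
        "uni_exams": 75_000, "christmas": 30_000}
riders = np.array([base[b] for b in blocks]) + rng.normal(0, 3_000, 365)

# MLR on block dummies: the complex-seasonality regression.
X = sm.add_constant(pd.get_dummies(pd.Series(blocks), drop_first=True).astype(float))
fit = sm.OLS(riders, X).fit()
print(fit.params)

# Fourier check on the residuals: strong remaining spectral peaks would
# indicate cyclical effects the seasonality blocks failed to capture.
spectrum = np.abs(np.fft.rfft(fit.resid - fit.resid.mean()))
print(spectrum[:5])
```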

Relevance: 10.00%

Abstract:

Emergency response teams increasingly use interactive technology to help manage information and communications. The challenge is to maintain high situation awareness across interactive devices of different sizes. This research compared a handheld interactive device, in the form of an iPad, with a large interactive multi-touch tabletop. A search-and-rescue-inspired simulator was designed to test operator situation awareness on the two device sizes. The results show that operators had better situation awareness on the tabletop device when the task involved detecting moving targets, searching target locations, distinguishing target types, and comprehending displayed information.

Relevance: 10.00%

Abstract:

The rise of the peer economy poses complex new regulatory challenges for policy-makers. The peer economy, typified by services like Uber and AirBnB, promises substantial productivity gains through the more efficient use of existing resources and a marked reduction in regulatory overheads. These services are rapidly disrupting existing established markets, but the regulatory trade-offs they present are difficult to evaluate. In this paper, we examine the peer economy through the context of ride-sharing and the ongoing struggle over regulatory legitimacy between the taxi industry and new entrants Uber and Lyft. We first sketch the outlines of ride-sharing as a complex regulatory problem, showing how questions of efficiency are necessarily bound up in questions about levels of service, controls over pricing, and different approaches to setting, upholding, and enforcing standards. We outline the need for data-driven policy to understand the way that algorithmic systems work and what effects these might have in the medium to long term on measures of service quality, safety, labour relations, and equality. Finally, we discuss how the competition for legitimacy is not primarily being fought on utilitarian grounds, but is instead carried out within the context of a heated ideological battle between different conceptions of the role of the state and private firms as regulators. We ultimately argue that the key to understanding these regulatory challenges is to develop better conceptual models of the governance of complex systems by private actors and of the methods available to the state for influencing their actions. These struggles are not, as is often thought, struggles between regulated and unregulated systems; they turn on the important regulatory work carried out by powerful, centralised private firms – both the incumbents of existing markets and the disruptive network operators in the peer economy.

Relevance: 10.00%

Abstract:

Because brain structure and function are affected in neurological and psychiatric disorders, it is important to disentangle the sources of variation in these phenotypes. Over the past 15 years, twin studies have found evidence for both genetic and environmental influences on neuroimaging phenotypes, but considerable variation across studies makes it difficult to draw clear conclusions about the relative magnitude of these influences. Here we performed the first meta-analysis of structural MRI data from 48 studies on >1,250 twin pairs, and of diffusion tensor imaging data from 10 studies on 444 twin pairs. The proportion of total variance accounted for by genes (A), shared environment (C), and unshared environment (E) was calculated by averaging the A, C, and E estimates across studies from independent twin cohorts, weighting by sample size. The results indicated that additive genetic estimates were significantly different from zero for all meta-analyzed phenotypes, with the exception of fractional anisotropy (FA) of the callosal splenium and cortical thickness (CT) of the uncus, left parahippocampal gyrus, and insula. For many phenotypes there was also a significant influence of C. We now have good estimates of heritability for many regional and lobar CT measures, in addition to the global volumes. Confidence intervals are wide, and the number of individuals is small, for many of the other phenotypes. In conclusion, while our meta-analysis shows that imaging measures are strongly influenced by genes, and that novel phenotypes such as CT, FA, and brain-activation measures look especially promising, replication across independent samples and demographic groups is necessary.
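
A minimal sketch of the pooling step the abstract describes: averaging ACE estimates across cohorts with sample-size weights. The numbers below are made up for illustration, not taken from the meta-analysis:

```python
import numpy as np

# Illustrative per-study ACE estimates from independent twin cohorts.
n = np.array([120, 340, 85])                 # twin pairs per study
a = np.array([0.55, 0.62, 0.48])             # additive genetic (A)
c = np.array([0.10, 0.05, 0.20])             # shared environment (C)
e = 1.0 - a - c                              # unshared environment, A+C+E = 1

w = n / n.sum()                              # sample-size weights
A, C, E = w @ a, w @ c, w @ e
print(f"pooled A={A:.2f}, C={C:.2f}, E={E:.2f}")
```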

Relevance: 10.00%

Abstract:

Over the past several years, evidence has accumulated showing that the cerebellum plays a significant role in cognitive function. Here we show, in a large genetically informative twin sample (n = 430; aged 16-30 years), that the cerebellum is strongly and reliably (n = 30 rescans) activated during an n-back working memory task, particularly lobules I-IV, VIIa Crus I and II, IX, and the vermis. Monozygotic twin correlations for cerebellar activation were generally much larger than dizygotic twin correlations, consistent with genetic influences. Structural equation models showed that up to 65% of the variance in cerebellar activation during working memory is genetic (averaging 34% across significant voxels), most prominently in lobules VI and VIIa Crus I, with the remaining variance explained by unique/unshared environmental factors. Heritability estimates for brain activation in the cerebellum agree with those found for working memory activation in the cerebral cortex, even though cerebellar cyto-architecture differs substantially. Phenotypic correlations between BOLD percent signal change in the cerebrum and cerebellum were low, and bivariate modelling indicated that genetic influences on the cerebellum are at least partly specific to the cerebellum. Activation at the voxel level correlated very weakly with cerebellar gray matter volume, suggesting specific genetic influences on the BOLD signal. Heritable signals identified here should facilitate the discovery of genetic polymorphisms influencing cerebellar function through genome-wide association studies, elucidating the genetic liability to brain disorders affecting the cerebellum.
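
The MZ-versus-DZ comparison underlying these models can be illustrated with Falconer's approximation, a simpler device than the structural equation modelling actually used, which reads the variance components directly off the twin correlations:

```latex
h^2 = 2\,(r_{MZ} - r_{DZ}), \qquad
c^2 = 2\,r_{DZ} - r_{MZ}, \qquad
e^2 = 1 - r_{MZ}
```

so MZ correlations well above DZ correlations, as reported here, imply a substantial additive genetic component h^2.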

Relevance: 10.00%

Abstract:

There is a major effort in medical imaging to develop algorithms to extract information from diffusion tensor imaging (DTI) and high angular resolution diffusion imaging (HARDI), which provide detailed information on brain integrity and connectivity. Although these images have recently advanced to provide extraordinarily high angular resolution and spatial detail, including an entire manifold of information at each point in the 3D volume, there has been no readily available means of viewing the results. This impedes developments in HARDI research, which needs some method to check the plausibility and validity of image processing operations on HARDI data, or to appreciate data features or invariants that might serve as a basis for new directions in image segmentation, registration, and statistics. We present a set of tools for interactive display of HARDI data, including both a local rendering application and an off-screen renderer that works with a web-based viewer. Visualizations are presented after registration and averaging of HARDI data from 90 human subjects, revealing important details that could not be appreciated with conventional displays of scalar images.

Relevance: 10.00%

Abstract:

Population-based brain mapping provides great insight into the trajectory of aging and dementia, as well as brain changes that normally occur over the human life span. We describe three novel brain mapping techniques, cortical thickness mapping, tensor-based morphometry (TBM), and hippocampal surface modeling, which offer enormous power for measuring disease progression in drug trials, and shed light on the neuroscience of brain degeneration in Alzheimer's disease (AD) and mild cognitive impairment (MCI). We report the first time-lapse maps of cortical atrophy spreading dynamically in the living brain, based on averaging data from populations of subjects with Alzheimer's disease and normal subjects imaged longitudinally with MRI. These dynamic sequences show a rapidly advancing wave of cortical atrophy sweeping from limbic and temporal cortices into higher-order association and ultimately primary sensorimotor areas, in a pattern that correlates with cognitive decline. A complementary technique, TBM, reveals the 3D profile of atrophic rates at each point in the brain. A third technique, hippocampal surface modeling, plots the profile of shape alterations across the hippocampal surface. The three techniques provide moderately to highly automated analyses of images, have been validated on hundreds of scans, and are sensitive to clinically relevant changes in individual patients and groups undergoing different drug treatments. We compare time-lapse maps of AD, MCI, and other dementias, correlate these changes with cognition, and relate them to similar time-lapse maps of childhood development, schizophrenia, and HIV-associated brain degeneration. Strengths and weaknesses of these different imaging measures for basic neuroscience and drug trials are discussed.

Relevance: 10.00%

Abstract:

Background: Concordance is characterised as a negotiation-like health communication approach based on an equal and collaborative partnership between patients and health professionals. The Leeds Attitudes to Concordance II (LATCon II) scale was developed to measure attitudes towards concordance. The purpose of this study was to translate the LATCon II into Chinese and psychometrically test the Chinese version (C-LATCon II).

Methods: The study involved three phases: i) translation and cross-cultural adaptation; ii) a pilot study; and iii) a cross-sectional survey (n = 366). Systematic random sampling was used to recruit hypertensive patients from nine communities covering around 78,000 residents in China. Tests of psychometric properties included content validity, construct validity, criterion-related validity (correlation between the C-LATCon II and the Therapeutic Adherence Scale for Hypertensive Patients (TASHP)), internal reliability, and test-retest reliability (n = 30).

Results: The C-LATCon II had satisfactory content validity (item-level Content Validity Index (CVI) = 0.83-1, scale-level CVI/universal agreement = 0.89, and scale-level CVI/averaging calculation = 0.98), construct validity (four extracted components explained 56.66% of the total variance), internal reliability (Cronbach's alpha of the overall scale and the four components was 0.78 and 0.66-0.84, respectively), and test-retest reliability (Pearson's correlation coefficient = 0.82, p < 0.001; intraclass correlation coefficient = 0.82, p < 0.001; linear weighted kappa statistic for each item = 0.40-0.65, p < 0.05). Criterion-related validity showed a weak association (Pearson's correlation coefficient = 0.11, p < 0.05) between patients' attitudes towards concordance during health communication and their health behaviours for hypertension management.

Conclusions: The C-LATCon II is a valid and reliable instrument for evaluating attitudes towards concordance in Chinese populations. Four components (health professionals' attitudes, partnership between the two parties, therapeutic decision making, and patients' involvement) describe attitudes towards concordance during health communication.
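
For reference, the internal-reliability statistic reported above can be computed as follows; this is a generic sketch on simulated Likert-style responses, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Simulated 5-point responses from 366 respondents on 5 items sharing
# a common latent attitude (illustrative only).
rng = np.random.default_rng(1)
latent = rng.normal(size=(366, 1))
scores = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (366, 5))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```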

Relevance: 10.00%

Abstract:

For the first decade of its existence, the concept of citizen journalism has described an approach which was seen as a broadening of the participant base in journalistic processes, but still involved only a comparatively small subset of overall society – for the most part, citizen journalists were news enthusiasts and “political junkies” (Coleman, 2006) who, as some exasperated professional journalists put it, “wouldn’t get a job at a real newspaper” (The Australian, 2007), but nonetheless followed many of the same journalistic principles. The investment – if not of money, then at least of time and effort – involved in setting up a blog or participating in a citizen journalism Website remained substantial enough to prevent the majority of Internet users from engaging in citizen journalist activities to any significant extent; what emerged in the form of news blogs and citizen journalism sites was a new online elite which for some time challenged the hegemony of the existing journalistic elite, but gradually also merged with it. The mass adoption of next-generation social media platforms such as Facebook and Twitter, however, has led to the emergence of a new wave of quasi-journalistic user activities which now much more closely resemble the “random acts of journalism” which JD Lasica envisaged in 2003. Social media are not exclusively or even predominantly used for citizen journalism; instead, citizen journalism is now simply a by-product of user communities engaging in exchanges about the topics which interest them, or tracking emerging stories and events as they happen. Such platforms – and especially Twitter with its system of ad hoc hashtags that enable the rapid exchange of information about issues of interest – provide spaces for users to come together to “work the story” through a process of collaborative gatewatching (Bruns, 2005), content curation, and information evaluation which takes place in real time and brings together everyday users, domain experts, journalists, and potentially even the subjects of the story themselves. Compared to the spaces of news blogs and citizen journalism sites, but also of conventional online news Websites, which are controlled by their respective operators and inherently position user engagement as a secondary activity to content publication, these social media spaces are centred around user interaction, providing a third-party space in which everyday as well as institutional users, laypeople as well as experts converge without being able to control the exchange. Drawing on a number of recent examples, this article will argue that this results in a new dynamic of interaction and enables the emergence of a more broadly-based, decentralised, second wave of citizen engagement in journalistic processes.

Relevance: 10.00%

Abstract:

We demonstrate a geometrically inspired technique for computing Evans functions for the operators obtained by linearising about travelling waves. Using the examples of the F-KPP equation and a Keller–Segel model of bacterial chemotaxis, we produce an Evans function which is computable through several orders of magnitude in the spectral parameter and show how such a function can naturally be extended into the continuous spectrum. In both examples, we use this function to numerically verify the absence of eigenvalues in a large region of the right half of the spectral plane. We also include a new proof of spectral stability, in the appropriate weighted space, of travelling waves of speed c ≥ √(2δ) in the F-KPP equation.
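
For orientation, this is the standard Evans-function construction, not the paper's specific geometric method: writing the eigenvalue problem for the linearised operator L as a first-order system, the Evans function is the determinant of a basis of solutions decaying at −∞ against one decaying at +∞:

```latex
(\mathcal{L} - \lambda)v = 0
\;\Longleftrightarrow\;
\mathbf{w}' = A(x;\lambda)\,\mathbf{w},
\qquad
E(\lambda) = \det\big(
\mathbf{w}_1^-,\dots,\mathbf{w}_k^-,\;
\mathbf{w}_1^+,\dots,\mathbf{w}_{n-k}^+
\big)\Big|_{x=0}
```

E is analytic to the right of the essential spectrum and vanishes exactly at eigenvalues; the extension into the continuous spectrum mentioned above goes beyond this basic picture.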

Relevance: 10.00%

Abstract:

Pollution on electrical insulators is one of the greatest causes of failure in substations subjected to high levels of salinity and environmental pollution. Taking leakage current as the main indicator of pollution on insulators, this paper focuses on establishing the effect of environmental conditions on the risk of failure due to pollution on insulators, and on determining the significant change in the magnitude of that pollution between dry and humid periods. Hierarchical segmentation analysis was used to establish the effect of environmental conditions on the risk of failure due to pollution on insulators. The Kruskal-Wallis test was used to determine the significant changes in the magnitude of the pollution across climate periods. An important result was the discovery that leakage current was more common on insulators during dry periods than humid ones; there was also a higher risk of failure due to pollution during dry periods. During the humid period, various temperatures and wind directions produced only a small change in the risk of failure. As a practical result, operators of electrical substations can now identify the cause of an increase in the risk of failure due to pollution in the area. The research contributes to the understanding of the behaviour of leakage current under conditions similar to those of the Colombian Caribbean coast, and of how these conditions affect the risk of substation failure due to pollution.
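
A minimal sketch of the Kruskal-Wallis comparison described above, on made-up leakage-current samples (the paper's measurements are not reproduced here):

```python
import numpy as np
from scipy.stats import kruskal

# Illustrative leakage-current magnitudes (mA) grouped by climate period.
rng = np.random.default_rng(2)
dry = rng.lognormal(mean=0.5, sigma=0.4, size=60)
humid = rng.lognormal(mean=0.1, sigma=0.4, size=60)

# H0: the two periods share the same leakage-current distribution;
# a small p-value indicates a significant change between periods.
stat, p = kruskal(dry, humid)
print(f"H = {stat:.2f}, p = {p:.4f}")
```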