665 results for Constrained network mapping
Abstract:
The application of different EMS current thresholds to a muscle activates not only the muscle but also peripheral sensory axons that send proprioceptive and pain signals to the cerebral cortex. A 32-channel time-domain fNIRS instrument was employed to map regional cortical activity under varied EMS current intensities applied to the right wrist extensor muscle. Eight healthy volunteers underwent four EMS sessions at different current thresholds based on their individual maximal tolerated intensity (MTI): 10 %, 50 %, 100 % and over 100 % MTI. Time courses of the absolute oxygenated and deoxygenated hemoglobin concentrations, primarily over the bilateral sensorimotor cortical (SMC) regions, were extracted, and cortical activation maps were determined by a general linear model using the NIRS-SPM software. The stimulation-induced wrist extension paradigm significantly increased activation of the contralateral SMC region in proportion to EMS intensity, while the ipsilateral SMC region showed no significant changes. This could be due in part to a nociceptive response to the higher EMS current intensities, and also to increased sensorimotor integration in these cortical regions.
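The general linear model step described above can be illustrated with a minimal sketch. This is not the NIRS-SPM implementation; the channel count, block design and noise levels are invented for illustration. Each channel's hemoglobin time course is regressed onto a modeled stimulation regressor, and the fitted weight serves as the activation estimate.

```python
import numpy as np

np.random.seed(0)

def glm_activation(hbo, regressor):
    """Least-squares GLM fit of one regressor (plus intercept) per channel.

    hbo:       (n_samples, n_channels) hemoglobin concentration time courses
    regressor: (n_samples,) modeled stimulation response
    returns:   fitted beta weight (activation estimate) per channel
    """
    X = np.column_stack([np.ones_like(regressor), regressor])
    betas, *_ = np.linalg.lstsq(X, hbo, rcond=None)
    return betas[1]  # slope for the stimulation regressor, per channel

# Toy example: 100 samples, 2 channels; only channel 0 follows the stimulation.
t = np.arange(100)
stim = ((t % 40) < 20).astype(float)  # on/off block design
hbo = np.column_stack([0.5 * stim + 0.01 * np.random.randn(100),
                       0.01 * np.random.randn(100)])
print(glm_activation(hbo, stim))  # channel 0 near 0.5, channel 1 near 0
```

In practice the regressor would be a hemodynamic-response-convolved stimulation function and the betas would be tested for significance, but the core regression has this shape.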
Abstract:
Voltage rise along low voltage (LV) networks often limits their capacity to accommodate more renewable energy (RE) sources. This paper proposes a robust and effective approach to coordinating customers' resources and controlling voltage rise in LV networks, where photovoltaics (PVs) are considered as the RE sources. The proposed coordination algorithm includes both localized and distributed control strategies. The localized strategy determines the PV inverter active and reactive power, while the distributed strategy coordinates customers' energy storage units (ESUs). To verify the effectiveness of the proposed approach, a typical residential LV network is simulated in the PSCAD/EMTDC platform.
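As a loose illustration of what a localized inverter control strategy can look like, here is a generic volt-var droop sketch. This is not the paper's algorithm; the deadband, slope and reactive-power limit are invented, per-unit values.

```python
def volt_var_droop(v_pu, v_ref=1.0, deadband=0.01, slope=2.0, q_max=0.44):
    """Hypothetical localized volt-var droop for a PV inverter.

    Absorb reactive power (negative Q) as the terminal voltage rises above
    a deadband around v_ref; inject (positive Q) when it sags below.
    All quantities are per-unit; parameter values are illustrative only.
    """
    dv = v_pu - v_ref
    if abs(dv) <= deadband:
        return 0.0  # inside the deadband: no reactive power response
    edge = deadband if dv > 0 else -deadband
    q = -slope * (dv - edge)          # proportional response beyond the deadband
    return max(-q_max, min(q_max, q))  # clamp to the inverter's Q capability

print(volt_var_droop(1.05))  # voltage rise -> negative Q (absorbing)
```

Real schemes (including the one described above) also curtail active power and coordinate with neighbours; this sketch only shows the local voltage-to-reactive-power mapping.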
Abstract:
This thesis examines the use of network governance in US airport transportation planning activities involving taxicab services for airport patrons. The research provides US airports with new insights whereby they can successfully engage with both transportation regulatory agencies and taxicab service providers in developing mutually agreeable policies that foster the development of supply-side taxicab service improvements. A mix of quantitative and qualitative research methods is used to unearth how US airports interact with these actors, and to identify attitudes held by airport staff in their engagements involving airport taxicab planning matters. The research may ultimately lead to the achievement of sustainable increases in the air passenger ground transportation modal share at US airports, resulting in both desirable long-term operational and environmental benefits for airport management, those involved with the provision of airport taxicab services, and the traveling public.
Abstract:
Text categorisation is challenging due to the complex structure of documents, with heterogeneous, changing topics. The performance of text categorisation relies on the quality of samples, the effectiveness of document features, and the topic coverage of categories, depending on the strategies employed: supervised or unsupervised, single-labelled or multi-labelled. To deal with these reliability issues in text categorisation, we propose an unsupervised multi-labelled text categorisation approach that maps the local knowledge in documents to global knowledge in a world ontology to optimise the categorisation result. The conceptual framework of the approach consists of three modules: pattern mining for feature extraction; feature-subject mapping for categorisation; and concept generalisation for optimised categorisation. The approach has been evaluated with promising results by comparison with typical text categorisation methods, based on ground truth encoded by human experts.
Abstract:
This thesis analysed the theoretical and ontological issues of previous scholarship concerning information technology and indigenous people. As an alternative, the thesis used the framework of actor-network theory, especially through historiographical and ethnographic techniques. The thesis revealed an assemblage of indigenous/digital enactments striving for relevance and avoiding obsolescence. It also recognised heterogeneities (including user ambivalences, oscillations, noise, non-coherences and disruptions) as part of the milieu of the daily digital lives of indigenous people. By taking heterogeneities into account, the thesis ensured that the data “speaks for itself” and that social inquiry is not overtaken by ideology and ontology.
Abstract:
Two recent decisions of the Supreme Court of New South Wales in the context of obstetric management have highlighted firstly, the importance of keeping legible, accurate and detailed medical records; and secondly, the challenges faced by those seeking to establish causation, particularly where epidemiological evidence is relied upon...
Abstract:
Multiple sclerosis (MS) is a common chronic inflammatory disease of the central nervous system. Susceptibility to the disease is affected by both environmental and genetic factors. Genetic factors include haplotypes in the major histocompatibility complex (MHC) and over 50 non-MHC loci reported by genome-wide association studies. Amongst these, we previously reported polymorphisms in chromosome 12q13-14 with a protective effect in individuals of European descent. This locus spans 288 kb and contains 17 genes, including several candidate genes with potentially significant pathogenic and therapeutic implications. In this study, we aimed to fine-map this locus. We implemented a two-phase study: a variant discovery phase, in which we used next-generation sequencing and two target-enrichment strategies [long-range polymerase chain reaction (PCR) and Nimblegen's solution-phase hybridization capture] in pools of 25 samples; and a genotyping phase, in which we genotyped 712 variants in 3577 healthy controls and 3269 MS patients. This study confirmed the association (rs2069502, P = 9.9 × 10⁻¹¹, OR = 0.787) and narrowed the locus of association to an 86.5 kb region. Although the study was unable to pinpoint the key associated variant, we identified a haplotype block of 42 (genotyped and imputed) single-nucleotide polymorphisms likely to harbour the causal variant. No evidence of association was observed at previously reported low-frequency variants in CYP27B1. As part of the study we compared variant discovery performance between the two target-enrichment strategies. We concluded that the pools enriched with Nimblegen's solution-phase hybridization capture had better sensitivity to detect true variants than the pools enriched with long-range PCR, whilst specificity was better in the long-range PCR-enriched pools; this result has important implications for the design of future fine-mapping studies.
Abstract:
Recently there has been significant interest among researchers and practitioners in the use of Bluetooth as a complementary source of transport data. However, the literature offers only a limited understanding of the Bluetooth MAC Scanner (BMS) based data acquisition process and of the properties of the data being collected. This paper first provides insight into the BMS data acquisition process. Thereafter, it presents interesting findings from analysis of real BMS data from both motorway and arterial networks in Brisbane, Australia. The knowledge gained helps researchers and practitioners understand the BMS data being collected, which is vital to the development of management and control algorithms using the data.
Abstract:
To understand the underlying genetic architecture of cardiovascular disease (CVD) risk traits, we undertook a genome-wide linkage scan to identify CVD quantitative trait loci (QTLs) in 377 individuals from the Norfolk Island population. The central aim of this research was the use of a genetically and geographically isolated population of individuals from Norfolk Island for variance component linkage analysis to identify QTLs involved in CVD risk traits. Substantial evidence supports the involvement of traits such as systolic and diastolic blood pressure, high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, body mass index and triglycerides as important risk factors for CVD pathogenesis. In addition to the environmental influences of poor diet, reduced physical activity, increasing age, cigarette smoking and alcohol consumption, many studies have illustrated a strong involvement of genetic components in the CVD phenotype through family and twin studies. We undertook a genome scan using 400 markers spaced approximately 10 cM apart in 600 individuals from Norfolk Island. Genotype data were analyzed using the variance components methods of SOLAR. Our results gave a peak LOD score of 2.01 localizing to chromosome 1p36 for systolic blood pressure and replicated previously implicated loci for other CVD-relevant QTLs.
Abstract:
Democratic governments raise taxes and charges and spend revenue on delivering peace, order and good government. The delivery process begins with a legislature, which can provide a framework of legally enforceable rules enacted according to the government’s constitution. These rules confer rights and obligations that allow particular people to carry on particular functions at particular places and times. Metadata standards as applied to public records contain information about the functioning of government as distinct from the non-government sector of society. Metadata standards also apply to database construction. Data entry, storage, maintenance, interrogation and retrieval depend on a controlled vocabulary to enable accurate retrieval of suitably catalogued records in a global information environment. Queensland’s socioeconomic progress now depends in part on technical efficiency in database construction to address queries about who does what, where and when; under what legally enforceable authority; and how the evidence of those facts is recorded. The Survey and Mapping Infrastructure Act 2003 (Qld) addresses technical aspects of “where” questions – typically the officially recognised name of a place and a description of its boundaries. The current 10-year review of the Survey and Mapping Regulation 2004 provides a valuable opportunity to consider whether the Regulation makes sense in the context of a number of later laws concerned with the management of Public Sector Information (PSI), as well as policies for ICT hardware and software procurement. Removing ambiguities about how official place names are to be regarded on a whole-of-government basis can achieve some short-term goals. Longer-term goals depend on a more holistic approach to information management, and current aspirations for more open government and community engagement are unlikely to be realised without such a longer-term vision.
Abstract:
Linkage disequilibrium (LD) mapping is commonly used as a fine-mapping tool in human genome mapping and has been used with some success for initial disease gene isolation in certain isolated, in-bred human populations. An understanding of the population history of domestic dog breeds suggests that LD mapping could be routinely utilized in this species for initial genome-wide scans. Such an approach offers significant advantages over traditional linkage analysis. Here, we demonstrate, using canine copper toxicosis in the Bedlington terrier as the model, that LD mapping can reasonably be expected to be a useful strategy in low-resolution, genome-wide scans in pure-bred dogs. Significant LD was demonstrated over distances of up to 33.3 cM. It is very unlikely, for a number of reasons discussed, that this result could be extrapolated to the rest of the genome. It is, however, consistent with the expectation given the population structure of canine breeds and, in this breed at least, with the hypothesis that it may be possible to utilize LD in a genome-wide scan. In this study, LD mapping confirmed the location of the Bedlington terrier copper toxicosis gene (CT-BT) and was able to do so in a population that was refractory to traditional linkage analysis.
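As background on the statistic such scans typically rely on, here is a minimal sketch of the normalized disequilibrium coefficient D′ for two biallelic loci. The haplotype and allele frequencies below are invented for illustration, not data from the study.

```python
def d_prime(p_ab, p_a, p_b):
    """Normalized linkage disequilibrium D' between two biallelic loci.

    p_ab: observed frequency of haplotype AB
    p_a:  allele frequency of A at locus 1
    p_b:  allele frequency of B at locus 2
    D' scales the raw disequilibrium D by its maximum attainable magnitude,
    so |D'| = 1 means complete LD and 0 means linkage equilibrium.
    """
    d = p_ab - p_a * p_b  # raw disequilibrium coefficient D
    if d >= 0:
        d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
    return d / d_max if d_max else 0.0

print(d_prime(0.40, 0.5, 0.5))  # haplotype over-represented: strong positive LD
```

High |D′| between a marker and a disease locus is what makes a sparse genome-wide marker panel informative in a breed with long-range LD.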
Abstract:
This article asks questions about the futures of power in the network era. Two critical emerging issues are at work with uncertain outcomes. The first is the emergence of the collaborative economy, while the second is the emergence of surveillance capabilities from civic, state and commercial sources. While both of these emerging issues are expected by many to play an important role in the future development of our societies, it is still unclear whose values and whose purposes will be furthered. This article argues that the futures of these emerging issues depend on contests for power. Accordingly, four scenarios are developed for the futures of power in the network era using the double-variable scenario approach.
Abstract:
Network reconfiguration after a complete blackout of a power system is an essential step in power system restoration. A new node importance evaluation method is presented based on the concept of regret, and maximisation of the average importance of a path is employed as the objective in finding the optimal restoration path. A two-stage method is then presented to optimise the network reconfiguration strategy. Specifically, the restoration sequence of generating units is first optimised so as to maximise the restored generation capacity; the optimal restoration path is then selected to restore the generating nodes concerned, and the issues of selecting a serial or parallel restoration mode and of handling the reconnection failure of a transmission line are considered next. Both restoration path selection and skeleton-network determination are implemented together in the proposed method, which overcomes the shortcoming of separate decision-making in existing methods. Finally, the New England 10-unit 39-bus power system and the Guangzhou power system in South China are employed to demonstrate the basic features of the proposed method.
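The path-selection objective described above (maximising the average importance of nodes along a candidate restoration path) can be sketched as follows. Node names and importance scores are hypothetical, and the regret-based importance computation itself is omitted.

```python
def best_restoration_path(paths, importance):
    """Pick the candidate path with the highest average node importance.

    paths:      list of paths, each a list of node identifiers
    importance: mapping from node identifier to its importance score
    """
    def avg_importance(path):
        return sum(importance[n] for n in path) / len(path)
    return max(paths, key=avg_importance)

# Hypothetical scores (would come from the regret-based evaluation) and
# candidate paths between a restored generator G1 and a target generator G5.
importance = {"G1": 0.9, "B2": 0.4, "B3": 0.7, "B4": 0.2, "G5": 0.8}
candidates = [["G1", "B2", "G5"], ["G1", "B3", "G5"], ["G1", "B4", "G5"]]
print(best_restoration_path(candidates, importance))  # ['G1', 'B3', 'G5']
```

In the full method this selection would be embedded in the second stage, after the unit restoration sequence has been fixed, with mode selection and reconnection-failure handling layered on top.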
Abstract:
Evidence suggests that both nascent and young firms (henceforth: “new firms”)—despite typically being small and resource-constrained—are sometimes able to innovate effectively. Such firms are seldom able to invest in lengthy and expensive development processes, which suggests that they may frequently rely instead on other pathways to generate innovativeness within the firm. In this paper, we develop and test arguments that “bricolage,” defined as making do by applying combinations of the resources at hand to new problems and opportunities, provides an important pathway to achieve innovation for new resource-constrained firms. Through bricolage, resource-constrained firms engage in the processes of “recombination” that are core to creating innovative outcomes. Based on a large longitudinal dataset, our results suggest that variations in the degree to which firms engage in bricolage behaviors can provide a broadly applicable explanation of innovativeness under resource constraints by new firms. We find no general support for our competing hypothesis that the positive effects may level off or even turn negative at high levels of bricolage.
Abstract:
Person re-identification is the task of recognising a previously observed individual at a later time and at different locations across a network of cameras. Traditionally, this task has been performed by first extracting appearance features of an individual and then matching these features to the previous observation. However, identifying an individual based solely on appearance can be ambiguous, particularly when people wear similar clothing (e.g. uniforms in sporting and school settings). The task is made more difficult when the resolution of the input image is small, as is typically the case in multi-camera networks. To circumvent these issues, we need to use other contextual cues. In this paper, we use "group" information as our contextual feature to aid in the re-identification of a person, motivated by the fact that people generally move together as a collective group. To encode group context, we learn a linear mapping function to assign each person to a "role" or position within the group structure. We then combine the appearance and group context cues using a weighted summation. We demonstrate how this improves the performance of person re-identification in a sports environment over appearance-based features.
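The weighted summation of the two cues can be sketched as follows. The mixing weight and the similarity values are invented for illustration; in practice the weight would be tuned empirically on validation data.

```python
def fused_score(appearance_sim, group_sim, w=0.7):
    """Weighted summation of appearance and group-context similarities.

    appearance_sim: similarity of appearance features to the query (0..1)
    group_sim:      similarity of the person's group role/position (0..1)
    w:              hypothetical mixing weight favouring appearance
    """
    return w * appearance_sim + (1.0 - w) * group_sim

# Two candidates in identical uniforms (equal appearance similarity);
# group context breaks the tie in favour of candidate A.
cand_a = fused_score(0.80, 0.90)
cand_b = fused_score(0.80, 0.30)
print(cand_a > cand_b)  # True
```

This is exactly the situation the abstract highlights: when appearance alone is ambiguous, the group-context term decides the match.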