908 results for average complexity
Abstract:
This thesis investigates the optimisation of Coarse-Fine (CF) spectrum sensing architectures under a distribution of SNRs for Dynamic Spectrum Access (DSA). Three different detector architectures are investigated: the Coarse-Sorting Fine Detector (CSFD), the Coarse-Deciding Fine Detector (CDFD) and the Hybrid Coarse-Fine Detector (HCFD). To date, the majority of the work on coarse-fine spectrum sensing for cognitive radio has focused on a single value for the SNR. This approach overlooks the key advantage that CF sensing has to offer, namely that high-powered signals can be easily detected without extra signal processing. By considering a range of SNR values, the detector can be optimised more effectively and greater performance gains realised. This work considers the optimisation of CF spectrum sensing schemes where security and performance are treated separately. Rather than optimising system performance at a single, constant, low SNR value, the system is optimised for the average operating conditions. Security is still provided by ensuring that the safety specifications are met at low SNR values. By decoupling security and performance, the system's average performance increases whilst maintaining the protection of licensed users from harmful interference. The different architectures considered in this thesis are investigated in theory, simulation and physical implementation to provide a complete overview of the performance of each system. This thesis provides a method for estimating SNR distributions which is quick, accurate and relatively low-cost. The CSFD is modelled and the characteristic equations are found for the CDFD scheme. The HCFD is introduced and optimisation schemes for all three architectures are proposed.
Finally, using the Implementing Radio In Software (IRIS) test-bed to confirm simulation results, CF spectrum sensing is shown to be significantly quicker than naive methods, whilst still meeting the required interference probability rates and not requiring substantial receiver complexity increases.
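The coarse-fine idea summarised above can be sketched as a two-stage energy detector: a cheap coarse measurement settles the obvious cases (the easily detected high-powered signals), and only ambiguous channels pay for the full fine-stage measurement. This is an illustrative sketch, not the thesis's CSFD/CDFD/HCFD implementation; the window sizes and thresholds are hypothetical placeholders, not calibrated values.

```python
import numpy as np

def coarse_fine_detect(samples, coarse_hi, coarse_lo, fine_threshold):
    """Two-stage (coarse-fine) energy detection sketch.

    A short coarse window decides obvious cases; only ambiguous
    channels pay for the full-length fine energy measurement.
    All thresholds here are illustrative placeholders.
    Returns (decision, stage) where stage records which test decided.
    """
    # Coarse stage: cheap estimate over the first 64 samples only.
    coarse_energy = np.mean(np.abs(samples[:64]) ** 2)
    if coarse_energy > coarse_hi:
        return "occupied", "coarse"   # strong signal caught without fine processing
    if coarse_energy < coarse_lo:
        return "free", "coarse"       # clearly empty channel
    # Fine stage: full-window energy measurement for ambiguous channels.
    fine_energy = np.mean(np.abs(samples) ** 2)
    return ("occupied" if fine_energy > fine_threshold else "free"), "fine"
```

Because most channels at moderate-to-high SNR are resolved in the coarse stage, the average sensing time drops relative to always running the fine detector, which is the effect the thesis quantifies over an SNR distribution.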
Abstract:
Background: Antimicrobial resistance is a major public health concern, and its increasing incidence in the Long Term Care Facility (LTCF) setting warrants attention (1). The prescribing of antimicrobials in this setting is often inappropriate and higher in Ireland than the European average (2). The aim of the study was to generate an evidence base for the factors influencing antimicrobial prescribing in LTCFs and to investigate Antimicrobial Stewardship (AMS) strategies for LTCFs. Methods: An initial qualitative study was conducted to determine the factors influencing antimicrobial prescribing in Irish LTCFs. This allowed for the informed implementation of an AMS feasibility study in LTCFs in the greater Cork region. Hospital AMS was also investigated by means of a national survey. A study of LTCF urine sample antimicrobial resistance rates was conducted in order to collate information for incorporation into future LTCF AMS initiatives. Results: The qualitative interviews determined that there are a multitude of factors, unique to the LTCF setting, which influence antimicrobial prescribing. There was a positive response from the doctors and nurses involved in the feasibility study as they welcomed the opportunity to engage with AMS and audit and feedback activities. While the results did not indicate a significant change in antimicrobial prescribing over the study period, important trends and patterns of use were detected. The antimicrobial susceptibility of LTCF urine samples compared to GPs samples found that there was a higher level of antimicrobial resistance in LTCFs. Conclusion: This study has made an important contribution to the development of AMS in LTCFs. The complexity of care and healthcare organisation, and the factors unique to LTCFs must be borne in mind when developing quality improvement strategies.
Abstract:
This longitudinal study tracked third-level French (n=10) and Chinese (n=7) learners of English as a second language (L2) during an eight-month study abroad (SA) period at an Irish university. The investigation sought to determine whether there was a significant relationship between length of stay (LoS) abroad and gains in the learners' oral complexity, accuracy and fluency (CAF), what the relationship was between these three language constructs, and whether the two learner groups would experience similar paths to development. Additionally, the study investigated whether specific reported out-of-class contact with the L2 was implicated in oral CAF gains. Oral data were collected at three equidistant time points: at the beginning of SA (T1), midway through the SA sojourn (T2) and at the end (T3), allowing for a comparison of CAF gains arising during one semester abroad to those arising during a subsequent semester. Data were collected using Sociolinguistic Interviews (Labov, 1984) and adapted versions of the Language Contact Profile (Freed et al., 2004). Overall, the results point to LoS abroad as a highly influential variable in the gains to be expected in oral CAF during SA. While one semester in the target-language (TL) country was not enough to foster statistically significant improvement in any of the CAF measures employed, significant improvement was found during the second semester of SA. Significant differences were also revealed between the two learner groups. Finally, significant correlations, some positive, some negative, were found between gains in CAF and specific usage of the L2. All in all, the disaggregation of the group data clearly illustrates, in line with other recent enquiries (e.g. Wright and Cong, 2014), that each individual learner's path to CAF development was unique and highly individualised, thus providing strong evidence for the recent claim that SLA is "an individualized nonlinear endeavor" (Polat and Kim, 2014: 186).
Abstract:
Recent genomic analyses suggest the importance of combinatorial regulation by broadly expressed transcription factors rather than expression domains characterized by highly specific factors.
Abstract:
BACKGROUND: Historically, only partial assessments of data quality have been performed in clinical trials, for which the most common method of measuring database error rates has been to compare the case report form (CRF) to database entries and count discrepancies. Importantly, errors arising from medical record abstraction and transcription are rarely evaluated as part of such quality assessments. Electronic Data Capture (EDC) technology has had a further impact, as paper CRFs typically leveraged for quality measurement are not used in EDC processes. METHODS AND PRINCIPAL FINDINGS: The National Institute on Drug Abuse Treatment Clinical Trials Network has developed, implemented, and evaluated methodology for holistically assessing data quality on EDC trials. We characterize the average source-to-database error rate (14.3 errors per 10,000 fields) for the first year of use of the new evaluation method. This error rate was significantly lower than the average of published error rates for source-to-database audits, and was similar to CRF-to-database error rates reported in the published literature. We attribute this largely to an absence of medical record abstraction on the trials we examined, and to an outpatient setting characterized by less acute patient conditions. CONCLUSIONS: Historically, medical record abstraction is the most significant source of error by an order of magnitude, and should be measured and managed during the course of clinical trials. Source-to-database error rates are highly dependent on the amount of structured data collection in the clinical setting and on the complexity of the medical record, dependencies that should be considered when developing data quality benchmarks.
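As a minimal illustration of the audit arithmetic above, a source-to-database comparison reduces to counting field-level discrepancies and normalising to errors per 10,000 fields. The function and record layout below are hypothetical sketches, not the Clinical Trials Network's tooling.

```python
def error_rate_per_10000(source_records, db_records):
    """Field-by-field comparison of source data against database entries.

    Records are dicts keyed by field name; the two lists are assumed
    to be aligned record-for-record (an illustrative simplification).
    Returns (discrepancies, fields_audited, errors_per_10000_fields).
    """
    errors = 0
    fields = 0
    for src, db in zip(source_records, db_records):
        for field, src_value in src.items():
            fields += 1
            if db.get(field) != src_value:   # any mismatch counts as one error
                errors += 1
    return errors, fields, 10000.0 * errors / fields
```

On this metric, the reported rate of 14.3 errors per 10,000 fields corresponds to roughly one discrepancy per 700 audited fields.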
Abstract:
BACKGROUND: Consent forms have lengthened over time and become harder for participants to understand. We sought to demonstrate the feasibility of creating a simplified consent form for biobanking that comprises the minimum information necessary to meet ethical and regulatory requirements. We then gathered preliminary data concerning its content from hypothetical biobank participants. METHODOLOGY/PRINCIPAL FINDINGS: We followed basic principles of plain-language writing and incorporated into a 2-page form (not including the signature page) those elements of information required by federal regulations and recommended by best practice guidelines for biobanking. We then recruited diabetes patients from community-based practices and randomized half (n = 56) to read the 2-page form, first on paper and then a second time on a tablet computer. Participants were encouraged to use "More information" buttons on the electronic version whenever they had questions or desired further information. These buttons led to a series of "Frequently Asked Questions" (FAQs) that contained additional detailed information. Participants were asked to identify specific sentences in the FAQs they thought would be important if they were considering taking part in a biorepository. On average, participants identified 7 FAQ sentences as important (mean 6.6, SD 14.7, range: 0-71). No one sentence was highlighted by a majority of participants; further, 34 (60.7%) participants did not highlight any FAQ sentences. CONCLUSIONS: Our preliminary findings suggest that our 2-page form contains the information that most prospective participants identify as important. Combining simplified forms with supplemental material for those participants who desire more information could help minimize consent form length and complexity, allowing the most substantively material information to be better highlighted and enabling potential participants to read the form and ask questions more effectively.
Abstract:
Both stimulus and response conflict can disrupt behavior by slowing response times and decreasing accuracy. Although several neural activations have been associated with conflict processing, it is unclear how specific any of these are to the type of stimulus conflict or the amount of response conflict. Here, we recorded electrical brain activity, while manipulating the type of stimulus conflict in the task (spatial [Flanker] versus semantic [Stroop]) and the amount of response conflict (two versus four response choices). Behaviorally, responses were slower to incongruent versus congruent stimuli across all task and response types, along with overall slowing for higher response-mapping complexity. The earliest incongruency-related neural effect was a short-duration frontally-distributed negativity at ~200 ms that was only present in the Flanker spatial-conflict task. At longer latencies, the classic fronto-central incongruency-related negativity 'N(inc)' was observed for all conditions, but was larger and ~100 ms longer in duration with more response options. Further, the onset of the motor-related lateralized readiness potential (LRP) was earlier for the two vs. four response sets, indicating that smaller response sets enabled faster motor-response preparation. The late positive complex (LPC) was present in all conditions except the two-response Stroop task, suggesting this late conflict-related activity is not specifically related to task type or response-mapping complexity. Importantly, across tasks and conditions, the LRP onset occurred at or before the conflict-related N(inc), indicating that motor preparation is a rapid, automatic process that interacts with the conflict-detection processes after it has begun. Together, these data highlight how different conflict-related processes operate in parallel and depend on both the cognitive demands of the task and the number of response options.
Abstract:
We survey recent results on the computational complexity of mixed shop scheduling problems. In a mixed shop, some jobs have fixed machine orders (as in the job shop), while the operations of the other jobs may be processed in arbitrary order (as in the open shop). The main attention is devoted to establishing the boundary between polynomially solvable and NP-hard problems. When the number of operations per job is unlimited, we focus on problems with a fixed number of jobs.
Abstract:
This paper presents a proactive approach to load sharing and describes the architecture of a scheme, Concert, based on this approach. A proactive approach is characterized by a shift of emphasis from reacting to load imbalance to avoiding its occurrence. In contrast, in a reactive load sharing scheme, activity is triggered when a processing node is either overloaded or underloaded. The main drawback of this approach is that a load imbalance is allowed to develop before costly corrective action is taken. Concert is a load sharing scheme for loosely-coupled distributed systems. Under this scheme, load and task behaviour information is collected and cached in advance of when it is needed. Concert uses Linux as a platform for development. Implemented partially in kernel space and partially in user space, it achieves transparency to users and applications whilst keeping the extent of kernel modifications to a minimum. Non-preemptive task transfers are used exclusively, motivated by lower complexity, lower overheads and faster transfers. The goal is to minimize the average response-time of tasks. Concert is compared with other schemes by considering the level of transparency it provides with respect to users, tasks and the underlying operating system.
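The proactive idea, collecting and caching load information in advance of when a placement decision needs it, can be sketched as follows. This is an illustrative sketch under assumed semantics, not Concert's implementation; the class name, the staleness window, and the node-selection rule are all hypothetical.

```python
import time

class LoadCache:
    """Proactive load-sharing sketch (illustrative, not Concert's code).

    Nodes push load reports ahead of time; placement decisions read
    the cache instead of polling, so new tasks can be steered away
    from busy nodes before an overload develops.
    """
    def __init__(self, staleness_s=5.0):
        self.staleness_s = staleness_s
        self.reports = {}          # node -> (load, report_timestamp)

    def report(self, node, load, now=None):
        """Record a load report pushed by a node."""
        self.reports[node] = (load, now if now is not None else time.time())

    def pick_node(self, now=None):
        """Choose the least-loaded node among sufficiently fresh reports."""
        now = now if now is not None else time.time()
        fresh = {node: load for node, (load, ts) in self.reports.items()
                 if now - ts <= self.staleness_s}
        if not fresh:
            return None            # no usable information: run the task locally
        return min(fresh, key=fresh.get)
```

Discarding stale reports mirrors the cost trade-off in the abstract: acting on old load information can itself create the imbalance a proactive scheme is meant to avoid, so falling back to local execution is the safe default.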