959 results for Complexity analysis


Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider the uplink of a single-cell massive multiple-input multiple-output (MIMO) system with in-phase and quadrature-phase imbalance (IQI). This scenario is of particular importance in massive MIMO systems, where the deployment of lower-cost, lower-quality components is desirable to make massive MIMO a viable technology. In particular, we investigate the effect of IQI on the performance of massive MIMO employing maximum-ratio combining (MRC) receivers. To study how IQI affects channel estimation, we derive a new channel estimator for the IQI-impaired model and show that IQI can substantially degrade the performance of MRC receivers. Moreover, a low-complexity IQI compensation scheme suitable for massive MIMO is proposed, which is based on estimating the IQI coefficients and is independent of the channel gain. The performance of the proposed compensation scheme is evaluated analytically by deriving a tractable approximation of the ergodic achievable rate and providing the asymptotic power scaling laws, assuming transmission over Rayleigh fading channels with log-normal large-scale fading. Finally, we show that massive MIMO effectively suppresses the residual IQI effects, as long as the compensation scheme is applied.
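As a rough illustration of the array gain that MRC receivers provide (and of why residual impairments can average out over many antennas), the sketch below simulates ideal MRC over a Rayleigh channel. The antenna count, per-antenna SNR, and trial count are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 128        # receive antennas (assumed value for illustration)
snr_in = 1.0   # per-antenna input SNR, linear scale (assumed)
trials = 2000

gains = []
for _ in range(trials):
    # i.i.d. Rayleigh channel: complex Gaussian entries with unit power
    h = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
    # MRC weights are the conjugate channel, so output SNR = snr_in * ||h||^2
    gains.append(snr_in * np.vdot(h, h).real)

avg_gain = np.mean(gains)
print(round(avg_gain / M, 2))  # close to 1.0: MRC output SNR scales linearly with M
```

The linear scaling of output SNR with the antenna count is the array gain that the power scaling laws in the paper build on.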

Abstract:

Background
Medical students transitioning into professional practice feel underprepared to deal with the emotional complexities of real-life ethical situations. Simulation-based learning (SBL) may provide a safe environment for students to probe the boundaries of ethical encounters. Published studies of ethics simulation have not generated sufficiently deep accounts of student experience to inform pedagogy. The aim of this study was to understand students’ lived experiences as they engaged with the emotional challenges of managing clinical ethical dilemmas within a SBL environment.

Methods
This qualitative study was underpinned by an interpretivist epistemology. Eight senior medical students participated in an interprofessional ward-based SBL activity incorporating a series of ethically challenging encounters. Each student wore digital video glasses to capture point-of-view (PoV) film footage. Students were interviewed immediately after the simulation, and the PoV footage was played back to them. Interviews were transcribed verbatim. An interpretative phenomenological approach, using an established template analysis approach, was used to iteratively analyse the data.

Results
Four main themes emerged from the analysis: (1) ‘Authentic on all levels?’, (2) ‘Letting the emotions flow’, (3) ‘Ethical alarm bells’ and (4) ‘Voices of children and ghosts’. Students recognised many explicit ethical dilemmas during the SBL activity but had difficulty navigating more subtle ethical and professional boundaries. In emotionally complex situations, instances of moral compromise were observed (such as telling an untruth). Some participants felt unable to raise concerns or challenge unethical behaviour within the scenarios due to prior negative undergraduate experiences.

Conclusions
This study provided deep insights into medical students’ immersive and embodied experiences of ethical reasoning during an authentic SBL activity. By layering on the human dimensions of ethical decision-making, students can understand their personal responses to emotion, complexity and interprofessional working. This could assist them in framing and observing appropriate ethical and professional boundaries and help smooth the transition into clinical practice.

Abstract:

There has been increasing interest in the development of new methods that use Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question is how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. The standard tests used for this purpose can consider neither multiple performance measures jointly nor multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that account for multiple competing measures at the same time and compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend both by discovering conditional independences among measures to reduce the number of parameters of such models, since the number of studied cases is usually very small in such comparisons. Data from a comparison among general-purpose classifiers is used to show a practical application of our tests.
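The Bayesian procedure described above can be sketched in miniature: given outcome counts for two competing algorithms over a collection of data sets, a Dirichlet posterior over the outcome probabilities yields the probability that one algorithm outperforms the other. The counts and the symmetric prior below are invented for illustration and are far simpler than the paper's multi-measure model:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical outcomes over 30 data sets: [A wins, tie, B wins]
counts = np.array([18, 4, 8])
prior = np.ones(3)  # symmetric Dirichlet prior

# Conjugacy: the posterior is Dirichlet(prior + counts).
# Sample it to estimate P(prob of "A wins" > prob of "B wins").
samples = rng.dirichlet(prior + counts, size=20000)
p_a_better = np.mean(samples[:, 0] > samples[:, 2])
print(round(p_a_better, 3))  # posterior probability that A outperforms B
```

Unlike a single p-value, the posterior sample also quantifies the probability of a practical tie, which is one motivation for the Bayesian variant.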

Abstract:

Qualitative Comparative Analysis (QCA) is a method for the systematic analysis of cases. A holistic view of cases and an approach to causality that emphasizes complexity are among its core features. Over the last decades, QCA has found application in many fields of the social sciences. In spite of this, its adoption in feminist research has been slower, and only recently has QCA been applied to topics related to social care, the political representation of women, and reproductive politics. In spite of the comparative turn in feminist studies, researchers still privilege qualitative methods, in particular case studies, and are often skeptical of quantitative techniques (Spierings 2012). These studies show that the meaning and measurement of many gender concepts differ across countries and that the factors leading to feminist success and failure are context-specific. However, case-study analyses struggle to systematically account for the ways in which these forces operate in different locations.

Abstract:

Typologies have been an important tool for the development of comparative social policy research and continue to be widely used in spite of growing criticism of their ability to capture the complexity of welfare states and their internal heterogeneity. In particular, debates have focused on the presence of hybrid cases and the existence of distinct cross-national patterns of variation across areas of social policy. There is growing awareness of these issues, but empirical research often still relies on methodologies aimed at classifying countries into a limited number of unambiguous types. This article proposes a two-step approach based on fuzzy-set ideal-type analysis for the systematic analysis of hybrids at the level of both policies (step 1) and policy configurations, or combinations of policies (step 2). The approach is demonstrated using the case of childcare policies in European economies. In the first step, parental leave policies are analysed using three methods – direct, indirect, and combinatory – to identify and describe specific hybrid forms at the level of policy analysis. In the second step, the analysis focuses on the relationship between parental leave and childcare services in order to develop an overall typology of childcare policies, which clearly shows that many countries display characteristics normally associated with different types (hybrids). This two-step approach therefore enhances our ability to account for and make sense of hybrid welfare forms produced by tensions and contradictions within and between policies.
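A minimal sketch of the first step of fuzzy-set ideal-type analysis follows, using the standard minimum operator over (possibly negated) set membership scores. The countries, the two policy sets (generous parental leave, extensive childcare services), and all scores are hypothetical:

```python
from itertools import product

# Hypothetical fuzzy membership scores (0..1) in two policy sets per country:
# L = generous parental leave, S = extensive childcare services
countries = {"A": {"L": 0.9, "S": 0.8},
             "B": {"L": 0.7, "S": 0.3},
             "C": {"L": 0.4, "S": 0.6}}

def ideal_type_memberships(case):
    # Each corner of the 2-D property space is an ideal type; membership in a
    # type is the minimum over the set scores, negating (1 - x) absent sets.
    out = {}
    for in_L, in_S in product([True, False], repeat=2):
        label = ("L" if in_L else "l") + ("S" if in_S else "s")
        scores = [case["L"] if in_L else 1 - case["L"],
                  case["S"] if in_S else 1 - case["S"]]
        out[label] = min(scores)
    return out

for name, case in countries.items():
    m = ideal_type_memberships(case)
    best = max(m, key=m.get)
    # A case whose best membership hovers near 0.5 is a candidate hybrid
    print(name, best, round(m[best], 2))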

Abstract:

Empirical validity of the claim that overhead costs are driven not by production volume but by transactions resulting from production complexity is examined using data from 32 manufacturing plants from the electronics, machinery, and automobile components industries. Transactions are measured using number of engineering change orders, number of purchasing and production planning personnel, shop-floor area per part, and number of quality control and improvement personnel. Results indicate a strong positive relation between manufacturing overhead costs and both manufacturing transactions and production volume. Most of the variation in overhead costs, however, is explained by measures of manufacturing transactions, not volume.
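The kind of explanatory-power comparison the study reports can be illustrated with synthetic data in which overhead is, by construction, driven mainly by transactions; fitting volume-only and transactions-only regressions then shows the gap in R². Every number below is invented (only the plant count mirrors the study's sample size):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 32  # plants, mirroring the study's sample size (data here is synthetic)
volume = rng.uniform(10, 100, n)
transactions = rng.uniform(5, 50, n)
# Assumed data-generating process: overhead driven mainly by transactions
overhead = 3.0 * transactions + 0.2 * volume + rng.normal(0, 5, n)

def r_squared(X, y):
    # Ordinary least squares with an intercept column
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

r2_vol = r_squared(volume.reshape(-1, 1), overhead)
r2_trx = r_squared(transactions.reshape(-1, 1), overhead)
print(round(r2_vol, 2), round(r2_trx, 2))  # transactions explain far more variance
```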

Abstract:

An increasing number of people with terminal cancer are being cared for at home, often by their partner. This study explores the identity, experiences and relationships of people caring for their partner at the end of life and how they construct their experience through personal and couple narratives. It draws upon dialogical approaches to narrative analysis to focus on caring partners and the care relationship. Six participants were recruited for the study. Two methods of data collection are used: narrative interviews and journals. Following individual case analysis, two methods of cross-narrative analysis are used: an analysis of narrative themes and an identification of narrative types. The key findings can be summarised as follows. First, in the period since their partner's terminal prognosis, participants sustained and reconstructed self and couple relationship narratives. These narratives aided the construction of meaning and coherence at a time of major biographical disruption: the anticipated loss of a partner. Second, the study highlights the complexity of spoken and unspoken narratives in terminal cancer and how these relate to individual and couple identities. Third, a typology of archetypal narratives based upon the data is identified. The blow-by-blow narratives illustrate how participants sought to construct coherence and meaning in the illness story, while champion and resilience narratives demonstrate how participants utilised positive self and relational narratives to manage a time of biographical disruption. The study highlights how this narrative approach can enhance understanding of the experiences and identities of people caring for a terminally ill partner.

Abstract:

The ability to predict the properties of magnetic materials in a device is essential to ensuring the correct operation and optimization of the design as well as the device behavior over a wide range of input frequencies. Typically, development and simulation of wide-bandwidth models requires detailed, physics-based simulations that utilize significant computational resources. Balancing the trade-offs between model computational overhead and accuracy can be cumbersome, especially when the nonlinear effects of saturation and hysteresis are included in the model. This study focuses on the development of a system for analyzing magnetic devices in cases where model accuracy and computational intensity must be carefully and easily balanced by the engineer. A method for adjusting model complexity and corresponding level of detail while incorporating the nonlinear effects of hysteresis is presented that builds upon recent work in loss analysis and magnetic equivalent circuit (MEC) modeling. The approach utilizes MEC models in conjunction with linearization and model-order reduction techniques to process magnetic devices based on geometry and core type. The validity of steady-state permeability approximations is also discussed.
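A magnetic equivalent circuit in its simplest form treats flux paths as reluctances driven by a winding magnetomotive force (MMF), which is the starting point the study builds on. The gapped-core example below uses invented dimensions and is far simpler than the linearized, order-reduced models the study develops:

```python
# Minimal magnetic-equivalent-circuit (MEC) sketch: a gapped core modeled as
# two reluctances in series driven by a winding MMF. All values are
# illustrative assumptions, not parameters from the study.
mu0 = 4e-7 * 3.141592653589793  # permeability of free space (H/m)
N, I = 100, 2.0                 # winding turns and current (assumed)
A = 1e-4                        # core/gap cross-section (m^2, assumed)
l_core, mu_r = 0.2, 2000.0      # core path length (m), relative permeability
l_gap = 1e-3                    # air-gap length (m)

R_core = l_core / (mu0 * mu_r * A)   # reluctance of the core path
R_gap = l_gap / (mu0 * A)            # reluctance of the air gap
flux = (N * I) / (R_core + R_gap)    # magnetic "Ohm's law": flux = MMF / R
B = flux / A                         # flux density (T)
print(round(B, 3))
```

Even in this toy network the air gap dominates the total reluctance, which is why gap geometry, rather than core permeability, often sets the model's sensitivity.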

Abstract:

Objectives: To analyze the relationship between pharmacotherapeutic complexity and attainment of therapeutic objectives in HIV+ patients on antiretroviral treatment and concomitant dyslipidemia therapy. Materials and methods: A retrospective observational study including HIV patients on stable antiretroviral treatment during the previous 6 months and on dyslipidemia treatment between January and December 2013. The complexity index was calculated with the tool developed by McDonald et al. Other variables analyzed were: age, gender, HIV risk factor, smoking, alcoholism and drug use, psychiatric disorders, adherence to antiretroviral treatment and lipid-lowering drugs, and clinical parameters (HIV viral load, CD4 count, plasma levels of total cholesterol, LDL, HDL, and triglycerides). To determine the predictive factors associated with attainment of therapeutic objectives, univariate analysis was conducted through logistic regression, followed by multivariate analysis. Results: The study included 89 patients; 56.8% of them met the therapeutic objectives for dyslipidemia. The complexity index was significantly higher (p = 0.02) in those patients who did not reach the objective values (median 51.8 vs. 38.9). Adherence to lipid-lowering treatment was significantly associated with attainment of the therapeutic objectives established for dyslipidemia treatment. 67.0% of patients met the objectives for their antiretroviral treatment; however, the complexity index was not significantly higher (p = 0.06) in those patients who did not meet said objectives. Conclusions: Pharmacotherapeutic complexity is a key factor in achieving health objectives in HIV+ patients on treatment for dyslipidemia.
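A univariate logistic regression of goal attainment on a complexity index, of the kind described above, can be sketched as follows. The cohort is simulated (only the sample size mirrors the study), the assumed true relationship is invented, and the Newton-Raphson fit is a generic implementation, not the authors' analysis software:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 89  # matches the study's sample size; the data itself is simulated
cx = rng.normal(45, 15, n)                # hypothetical complexity index values
p = 1 / (1 + np.exp(-(4.0 - 0.12 * cx)))  # assumed: higher complexity, lower goal odds
goal = (rng.uniform(size=n) < p).astype(float)

# Univariate logistic regression fitted by Newton-Raphson
X = np.column_stack([np.ones(n), cx])
beta = np.zeros(2)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))   # predicted probabilities
    W = mu * (1 - mu)                  # IRLS weights
    grad = X.T @ (goal - mu)
    hess = X.T @ (X * W[:, None])
    beta += np.linalg.solve(hess, grad)

print(round(beta[1], 3))  # negative slope: higher complexity, lower odds of goal attainment
```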

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supported platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case with software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers.
Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with high likelihood of containing the root cause among the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace back the common subsequences from the end to the root cause. A debugging tool is created to enable developers to use the approach, and integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool suggestions, and their source code counterparts. Finally, a comparison between the developed approach and the state-of-the-art techniques shows that developers need only to inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
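The core idea of extracting commonalities among failing traces can be sketched with a classic longest-common-subsequence (LCS) reduction. The trace contents are hypothetical, and a greedy pairwise LCS reduction is a simplification of the dissertation's optimized algorithm:

```python
def lcs(a, b):
    # Classic dynamic-programming longest common subsequence
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = dp[i][j] + 1 if a[i] == b[j] else max(dp[i][j + 1], dp[i + 1][j])
    # Backtrack to recover one LCS
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1] and dp[i][j] == dp[i - 1][j - 1] + 1:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Hypothetical basic-block traces from three failing test cases
traces = [
    ["init", "parse", "lookup", "free", "use"],
    ["init", "lookup", "validate", "free", "use"],
    ["init", "parse", "free", "retry", "use"],
]

# Greedily reduce: keep only what every failing trace shares, in order
candidate = traces[0]
for t in traces[1:]:
    candidate = lcs(candidate, t)
print(candidate)  # the blocks shared by all failing runs
```

More failing traces shrink the candidate subsequence further, which is exactly the narrowing of the search space the hypothesis predicts.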

Abstract:

Turnip crinkle virus (TCV) and Pea enation mosaic virus (PEMV) are two positive (+)-strand RNA viruses that are used to investigate the regulation of translation and replication due to their small size and simple genomes. Both viruses contain cap-independent translation elements (CITEs) within their 3′ untranslated regions (UTRs) that fold into tRNA-shaped structures (TSS) according to nuclear magnetic resonance and small-angle X-ray scattering (SAXS) analysis (TCV) and computational prediction (PEMV). Specifically, the TCV TSS can directly associate with ribosomes and participates in RNA-dependent RNA polymerase (RdRp) binding. The PEMV kissing-loop TSS (kl-TSS) can simultaneously bind to ribosomes and associate with the 5′ UTR of the viral genome. Mutational analysis and chemical structure probing methods provide great insight into the function and secondary structure of the two 3′ CITEs. However, the lack of 3-D structural information has limited our understanding of their functional dynamics. Here, I report the folding dynamics of the TCV TSS using optical tweezers (OT), a single-molecule technique. My study of the unfolding/folding pathways of the TCV TSS has revealed an unexpected unfolding pathway, confirmed the presence of Ψ3 and hairpin elements, and suggested an interconnection between the hairpins and pseudoknots. In addition, this study has demonstrated the importance of the adjacent upstream adenylate-rich sequence for the formation of H4a/Ψ3, along with the contribution of magnesium to the stability of the TCV TSS. In my second project, I report on the structural analysis of the PEMV kl-TSS using NMR and SAXS. This study has re-confirmed the base-pair pattern of the PEMV kl-TSS and the proposed interaction of the PEMV kl-TSS with its interacting partner, hairpin 5H2. The molecular envelope of the kl-TSS built from SAXS analysis suggests the kl-TSS has two functional conformations, one of which differs in shape from the previously predicted tRNA-shaped form. Along with applying biophysical methods to study the structural folding dynamics of RNAs, I have also developed a technique that improves the production of large quantities of recombinant RNAs in vivo for NMR study. In this project, I report using wild-type and mutant E. coli strains to produce cost-effective, site-specifically labeled recombinant RNAs. This technique was validated with four representative RNAs of different sizes and complexities to produce milligram amounts of RNA. The benefit of using site-specifically labeled RNAs made in E. coli was demonstrated with several NMR techniques.

Abstract:

The farm-gate value of extensive beef production from the northern Gulf region of Queensland, Australia, is ~$150 million annually. Poor profitability and declining equity are common issues for most beef businesses in the region. The beef industry relies primarily on native pasture systems and studies continue to report a decline in the condition and productivity of important land types in the region. Governments and Natural Resource Management groups are investing significant resources to restore landscape health and productivity. Fundamental community expectations also include broader environmental outcomes such as reducing beef industry greenhouse gas emissions. Whole-of-business analysis results are presented from 18 extensive beef businesses (producers) to highlight the complex social and economic drivers of management decisions that impact on the natural resource and environment. Business analysis activities also focussed on improving enterprise performance. Profitability, herd performance and greenhouse emission benchmarks are documented and discussed.

Abstract:

Dynamically typed programming languages such as JavaScript and Python defer type checking until run time. To optimize the performance of these languages, virtual machine implementations for dynamic languages must try to eliminate redundant dynamic type tests. This is usually done with a type inference analysis. However, such analyses are often costly and involve trade-offs between compilation time and the precision of the results obtained. This has led to the design of increasingly complex VM architectures. We propose lazy basic block versioning, a simple just-in-time compilation technique that effectively eliminates redundant dynamic type tests on critical execution paths. This new approach lazily generates specialized versions of basic blocks while propagating contextualized type information. Our technique does not require costly program analyses, is not constrained by the precision limitations of traditional type inference analyses, and avoids the complexity of speculative optimization techniques. Three extensions give basic block versioning interprocedural optimization capabilities. The first allows type information to be attached to object properties and global variables. Entry-point specialization then passes type information from calling functions to callees. Finally, call-continuation specialization passes the types of callee return values back to callers at no dynamic cost. We demonstrate empirically that these extensions allow basic block versioning to eliminate more dynamic type tests than any static type inference analysis.
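The essence of lazy basic block versioning (generating a specialized version of a block the first time a given tuple of operand types is observed, then reusing it) can be mimicked in a toy dispatcher. No actual code generation happens here; the type-keyed cache is the point, and all names are illustrative:

```python
# Toy illustration of lazy basic block versioning: a block is "compiled" the
# first time it runs with a given tuple of operand types, and the specialized
# version (conceptually, with the dynamic type test removed) is reused after.
versions = {}   # (block name, type tuple) -> specialized implementation
compiled = []   # log of specialization events, for inspection

def specialize(name, generic):
    def dispatch(*args):
        key = (name, tuple(type(a) for a in args))
        if key not in versions:
            compiled.append(key)      # a new version is generated lazily
            versions[key] = generic   # stand-in for emitting specialized code
        return versions[key](*args)
    return dispatch

add = specialize("add", lambda a, b: a + b)

add(1, 2); add(3, 4)   # same type tuple (int, int): only one version compiled
add(1.5, 2.5)          # new type tuple (float, float): a second version appears
print(len(compiled))   # 2
```

Because versions are created only for type tuples that actually occur, no compile-time whole-program type inference is needed, which mirrors the laziness argument above.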

Abstract:

Nitrogen (N) is an essential plant nutrient in maize production and, considering only natural sources, is often the limiting factor world-wide in terms of a plant's grain yield. For this reason, many farmers around the world supplement available soil N with synthetic man-made forms. Years of over-application of N fertilizer have led to increased N in groundwater and streams due to leaching and run-off from agricultural sites. In the Midwest Corn Belt much of this excess N eventually makes its way to the Gulf of Mexico, leading to eutrophication (an increase of phytoplankton) and a hypoxic (reduced-oxygen) dead zone. Growing concerns about these problems and the desire for greater input use efficiency have led to demand for crops with improved N use efficiency (NUE), allowing reduced N fertilizer application rates and subsequently lower N pollution. It is well known that roots are responsible for N uptake by plants, but it is relatively unknown how root architecture affects this ability. This research was conducted to better understand the influence of root complexity (RC) in maize on a plant's response to N stress as well as the influence of RC on other above-ground plant traits. Thirty-one above-ground plant traits were measured for 64 recombinant inbred lines (RILs) from the intermated B73 & Mo17 (IBM) population and their backcrosses (BCs) to either parent, B73 or Mo17, under normal (182 kg N ha⁻¹) and N-deficient (0 kg N ha⁻¹) conditions. The RILs were selected based on results from an earlier experiment by Novais et al. (2011), which screened 232 RILs from the IBM population to obtain their root complexity measurements. The 64 selected RILs comprised 31 of the lowest-complexity RILs (RC1) and 33 of the highest-complexity RILs (RC2) in terms of root architecture (characterized as fractal dimensions).
The use of the parental BCs classifies the experiment as Design III, an experimental design developed by Comstock and Robinson (1952) which allows for estimation of dominance significance and level. Of the 31 traits measured, 12 were whole plant traits chosen due to their documented response to N stress. The other 19 traits were ear traits commonly measured for their influence on yield. Results showed that genotypes from RC1 and RC2 significantly differ for several above-ground phenotypes. We also observed a difference in the number and magnitude of N treatment responses between the two RC classes. Differences in phenotypic trait correlations and their change in response to N were also observed between the RC classes. RC did not seem to have a strong correlation with calculated NUE (ΔYield/ΔN). Quantitative genetic analysis utilizing the Design III experimental design revealed significant dominance effects acting on several traits as well as changes in significance and dominance level between N treatments. Several QTL were mapped for 26 of the 31 traits and significant N effects were observed across the majority of the genome for some N stress indicative traits (e.g. stay-green). This research and related projects are essential to a better understanding of plant N uptake and metabolism. Understanding these processes is a necessary step in the progress towards the goal of breeding for better NUE crops.

Abstract:

The current study is a post-hoc analysis of data from the original randomized control trial of the Play and Language for Autistic Youngsters (PLAY) Home Consultation program, a parent-mediated, DIR/Floortime based early intervention program for children with ASD (Solomon, Van Egeren, Mahone, Huber, & Zimmerman, 2014). We examined 22 children from the original RCT who received the PLAY program. Children were split into two groups (high and lower functioning) based on the ADOS module administered prior to intervention. Fifteen-minute parent-child video sessions were coded through the use of CHILDES transcription software. Child and maternal language, communicative behaviors, and communicative functions were assessed in the natural language samples both pre- and post-intervention. Results demonstrated significant improvements in both child and maternal behaviors following intervention. There was a significant increase in child verbal and non-verbal initiations and verbal responses in whole group analysis. Total number of utterances, word production, and grammatical complexity all significantly improved when viewed across the whole group of participants; however, lexical growth did not reach significance. Changes in child communicative function were especially noteworthy, and demonstrated a significant increase in social interaction and a significant decrease in non-interactive behaviors. Further, mothers demonstrated an increase in responsiveness to the child’s conversational bids, increased ability to follow the child’s lead, and a decrease in directiveness. When separated for analyses within groups, trends emerged for child and maternal variables, suggesting greater gains in use of communicative function in both high and low groups over changes in linguistic structure. 
Additional analysis also revealed a significant inverse relationship between maternal responsiveness and child non-interactive behaviors: as mothers became more responsive, children's non-engagement decreased. Such changes further suggest that changes in learned skills following PLAY parent training may result in improvements in child social interaction and language abilities.