920 results for: Sub-registry. Empirical Bayesian estimator. General equation. Balancing adjustment factor


Relevance: 30.00%

Abstract:

In recent years, radars have been used in many applications such as precision agriculture and advanced driver assistance systems. Optimal techniques for estimating the number of targets and their coordinates require solving multidimensional optimization problems that entail huge computational efforts. This has motivated the development of sub-optimal estimation techniques able to achieve good accuracy at a manageable computational cost. Another technical issue in advanced driver assistance systems is the tracking of multiple targets. Although various filtering techniques have been developed, new efficient and robust algorithms for target tracking can be devised by exploiting a probabilistic approach based on factor graphs and the sum-product algorithm. The two contributions of this dissertation are the investigation of the filtering and smoothing problems from a factor graph perspective and the development of efficient algorithms for two- and three-dimensional radar imaging. Concerning the first contribution, a new factor graph for filtering is derived and the sum-product rule is applied to this graphical model; this makes it possible to reinterpret known algorithms and to develop new filtering techniques. Then, a general method, based on graphical modelling, is proposed to derive filtering algorithms that involve a network of interconnected Bayesian filters. Finally, the proposed graphical approach is exploited to devise a new smoothing algorithm. Numerical results for dynamic systems show that our algorithms can achieve a better complexity-accuracy tradeoff and better tracking capability than other techniques in the literature. Regarding radar imaging, various algorithms are developed for frequency-modulated continuous-wave radars; these algorithms rely on novel and efficient methods for the detection and estimation of multiple superimposed tones in noise. The accuracy achieved in the presence of multiple closely spaced targets is assessed on the basis of both synthetically generated data and measurements acquired with two commercial multiple-input multiple-output radars.
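
As a concrete point of reference for the tone-estimation problem mentioned above, here is a minimal baseline sketch, not the dissertation's novel method: the frequencies of multiple superimposed tones in noise are estimated by picking the largest peaks of a zero-padded periodogram, the classical starting point for FMCW radar ranging. All signal parameters (sampling rate, tone frequencies, noise level) are illustrative.

```python
# Baseline tone estimation: periodogram peak picking (illustrative only).
import numpy as np

fs = 1000.0                          # sampling frequency [Hz]
n = 1024                             # number of samples
t = np.arange(n) / fs
true_freqs = [110.0, 130.0, 260.0]   # two closely spaced tones plus a third
x = sum(np.cos(2 * np.pi * f * t) for f in true_freqs)
x += 0.5 * np.random.default_rng(0).standard_normal(n)   # additive noise

# Zero-padded, windowed FFT periodogram; zero padding refines the grid.
nfft = 8 * n
spec = np.abs(np.fft.rfft(x * np.hanning(n), nfft)) ** 2
freqs = np.fft.rfftfreq(nfft, 1 / fs)

# Greedy peak picking: keep the K largest local maxima.
k = 3
peaks = [i for i in range(1, len(spec) - 1)
         if spec[i] > spec[i - 1] and spec[i] > spec[i + 1]]
top = sorted(peaks, key=lambda i: spec[i], reverse=True)[:k]
print("estimated tone frequencies [Hz]:", sorted(freqs[i] for i in top))
```

The dissertation's contribution lies precisely in doing better than this baseline when tones are closely spaced; the sketch only fixes the problem setting.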

Relevance: 30.00%

Abstract:

This thesis investigates the legal, ethical, technical, and psychological issues of general data processing and artificial intelligence practices, and the explainability of AI systems. It consists of two main parts. In the first part, we provide a comprehensive overview of the big data processing ecosystem and the main challenges we face today. We then evaluate the GDPR's data privacy framework in the European Union. The Trustworthy AI Framework proposed by the EU's High-Level Expert Group on AI (AI HLEG) is examined in detail. The ethical principles for the foundation and realization of Trustworthy AI are analyzed along with the assessment list prepared by the AI HLEG. We then list the main big data challenges identified by European researchers and institutions and provide a literature review of the technical and organizational measures to address them. A quantitative analysis is conducted on the identified challenges and measures, which leads to practical recommendations for better data processing and AI practices in the EU. In the second part, we concentrate on the explainability of AI systems. We clarify the terminology and list the goals pursued through the explainability of AI systems. We identify the reasons for the explainability-accuracy trade-off and how it can be addressed. We conduct a comparative cognitive analysis between human reasoning and machine-generated explanations, with the aim of understanding how explainable AI can contribute to human reasoning. We then focus on the technical and legal responses to the explainability problem. Here, the GDPR's right-to-explanation framework and safeguards are analyzed in depth, together with their contribution to the realization of Trustworthy AI. Finally, we analyze the explanation techniques applicable at different stages of machine learning and propose several recommendations, in chronological order, for developing GDPR-compliant and Trustworthy XAI systems.

Relevance: 30.00%

Abstract:

The cerebral cortex exhibits self-similarity over a proper interval of spatial scales, a property typical of natural objects with fractal geometry. Its complexity can therefore be characterized by the value of its fractal dimension (FD). In computing this metric, a frequentist approach to probability has usually been employed, with point-estimator methods yielding only the optimal values of the FD. In our study, we aimed at a more complete evaluation of the FD by using a Bayesian model for the linear regression analysis of the box-counting algorithm. We used T1-weighted MRI data of 86 healthy subjects (age 44.2 ± 17.1 years, mean ± standard deviation, 48% males) to gain insight into the confidence of our measure and to investigate the relationship between mean Bayesian FD and age. Our approach yielded a stronger and significant (P < .001) correlation between mean Bayesian FD and age compared to the previous implementation. Our results therefore suggest that the Bayesian FD is a more faithful estimate of the fractal dimension of the cerebral cortex than the frequentist FD.
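
A minimal sketch of the idea, not the authors' exact pipeline: box-count a binary image over dyadic scales, then treat log N(s) versus log(1/s) as a Bayesian linear regression so that the fractal dimension (the slope) receives a full posterior rather than a single point estimate. The synthetic image, the prior, and the noise level are illustrative assumptions.

```python
# Bayesian box-counting regression sketch (illustrative assumptions).
import numpy as np

def box_count(img, s):
    """Number of s-by-s boxes containing at least one foreground pixel."""
    h, w = img.shape
    view = img[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
    return np.count_nonzero(view.any(axis=(1, 3)))

rng = np.random.default_rng(1)
img = rng.random((256, 256)) < 0.4            # stand-in for a cortical mask

scales = np.array([2, 4, 8, 16, 32])
y = np.log([box_count(img, s) for s in scales])            # log N(s)
X = np.column_stack([np.ones_like(scales, float), np.log(1.0 / scales)])

# Conjugate Bayesian linear regression with a known (assumed) noise sd and
# a vague Gaussian prior N(0, 100 I) on intercept and slope.
sigma2, tau2 = 0.05 ** 2, 100.0
post_cov = np.linalg.inv(X.T @ X / sigma2 + np.eye(2) / tau2)
post_mean = post_cov @ (X.T @ y) / sigma2
print(f"FD posterior mean {post_mean[1]:.3f} ± {np.sqrt(post_cov[1, 1]):.3f}")
```

The posterior standard deviation of the slope is what a frequentist point estimator discards, and it is the "confidence of our measure" the abstract refers to.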

Relevance: 30.00%

Abstract:

Planning is an important sub-field of artificial intelligence (AI) focused on letting intelligent agents deliberate on the most adequate course of action to attain their goals. Thanks to the recent boost in the number of critical domains and systems that exploit planning for their internal procedures, there is an increasing need for planning systems to become more transparent and trustworthy. Along this line, planning systems are now required to produce not only plans but also explanations of those plans, or of the way they were attained. To address this need, a new research area is emerging in the AI panorama: eXplainable AI (XAI), within which explainable planning (XAIP) is a pivotal sub-field. As a recent domain, XAIP is far from mature. No consensus has been reached in the literature about what explanations are, how they should be computed, and what they should explain in the first place. Furthermore, existing contributions are mostly theoretical, and software implementations are rarely more than preliminary. To overcome these issues, in this thesis we design an explainable planning framework bridging the gap between theoretical contributions from the literature and software implementations. More precisely, taking inspiration from the state of the art, we develop a formal model for XAIP and a software tool enabling its practical exploitation. Accordingly, the contribution of this thesis is fourfold. First, we review the state of the art of XAIP, supplying an outline of its most significant contributions from the literature. We then generalise these contributions into a unified model for XAIP aimed at supporting model-based contrastive explanations. Next, we design and implement an algorithm-agnostic library for XAIP based on our model. Finally, we validate our library from a technological perspective via an extensive testing suite, and we assess its performance and usability through a set of benchmarks and end-to-end examples.
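
As a toy illustration of one common XAIP pattern, model-based contrastive explanation by re-planning (not the thesis's library or model), the question "why does the plan start with action A rather than B?" can be answered by forcing B first and comparing the resulting plan costs. The graph, action names, and costs below are invented.

```python
# Contrastive explanation by re-planning on a toy state space.
import heapq

GRAPH = {                     # state -> {action: (next_state, cost)}
    "start": {"A": ("mid1", 1), "B": ("mid2", 3)},
    "mid1": {"go": ("goal", 1)},
    "mid2": {"go": ("goal", 1)},
    "goal": {},
}

def plan(state, goal, forced_first=None):
    """Dijkstra search; optionally force the first action taken."""
    frontier = [(0, state, [])]
    seen = set()
    while frontier:
        cost, s, acts = heapq.heappop(frontier)
        if s == goal:
            return cost, acts
        if s in seen:
            continue
        seen.add(s)
        for a, (nxt, c) in GRAPH[s].items():
            if forced_first is not None and not acts and a != forced_first:
                continue
            heapq.heappush(frontier, (cost + c, nxt, acts + [a]))
    return None

best = plan("start", "goal")
alt = plan("start", "goal", forced_first="B")
print(f"plan {best[1]} costs {best[0]}; forcing 'B' first gives "
      f"{alt[1]} at cost {alt[0]} (+{alt[0] - best[0]})")
```

The cost difference between the two plans is the contrastive explanation: it quantifies what the user's suggested alternative would sacrifice.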

Relevance: 30.00%

Abstract:

An emerging technology that Smart Radio Environments rely on to improve wireless link quality is the Reconfigurable Intelligent Surface (RIS). A RIS can be understood as a thin layer of electromagnetic composite material, typically mounted on the walls or ceilings of buildings, which can be reconfigured even after its deployment in the network. RISs made by composing artificial materials in an engineered way, in order to obtain unconventional characteristics, are called metasurfaces. By programming the RIS, it is possible to control and/or modify the radio waves impinging on it, thus shaping the radio environment. The metaprism represents an alternative that overcomes some limitations of RISs: it is a passive, non-reconfigurable, frequency-selective metasurface that acts as a metamirror to improve the efficiency of the wireless link. In particular, using OFDM (Orthogonal Frequency-Division Multiplexing) signaling, it is possible to control the reflection of the signal by suitably selecting the sub-carrier assigned to each user, without having to interact with the metaprism or estimate the CSI. This thesis investigates how OFDM signaling and the metaprism can be used for localization purposes, especially to extend the coverage area at low cost, in a scenario where the user is in NLoS (non-line-of-sight) conditions with respect to the base station, both equipped with a single antenna. In particular, the work concerns the design of the analytical model, and the corresponding Matlab implementation, of a maximum likelihood (ML) estimator able to estimate the unknown position, behind an obstacle, from which a generic user transmits to the base station by exploiting the metaprism.
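
A generic sketch of grid-search maximum-likelihood localization, as a stand-in for the thesis's estimator (whose metaprism signal model is more elaborate): a user behind an obstacle is located from noisy total path lengths base station -> reflecting element -> user. The geometry, the range-only measurement model, and the noise level are assumptions for illustration.

```python
# Grid-search ML localization from reflected-path lengths (illustrative).
import numpy as np

rng = np.random.default_rng(2)
bs = np.array([0.0, 0.0])                                  # base station
elements = np.array([[5.0, y] for y in (-2.0, 0.0, 2.0)])  # reflectors
user_true = np.array([9.0, 1.5])                           # unknown NLoS user

def path_len(user):
    """Total path length via each reflecting element."""
    return (np.linalg.norm(elements - bs, axis=1)
            + np.linalg.norm(elements - user, axis=1))

sigma = 0.05
meas = path_len(user_true) + sigma * rng.standard_normal(len(elements))

# Under Gaussian noise, ML reduces to least squares over candidate positions.
xs, ys = np.meshgrid(np.linspace(6, 12, 241), np.linspace(-3, 4, 281))
cand = np.stack([xs.ravel(), ys.ravel()], axis=1)
resid = np.array([np.sum((path_len(p) - meas) ** 2) for p in cand])
print("ML position estimate:", cand[np.argmin(resid)])
```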

Relevance: 20.00%

Abstract:

Few studies have evaluated the profile of use of disease-modifying drugs (DMDs) in Brazilian patients with spondyloarthritis (SpA). A common research protocol was applied prospectively to 1505 patients classified as having SpA according to the criteria of the European Spondyloarthropathies Study Group (ESSG), followed at 29 referral centers in rheumatology in Brazil. Demographic and clinical variables were obtained and evaluated, analyzing their correlation with the use of the DMDs methotrexate (MTX) and sulfasalazine (SSZ). At least one DMD was used by 73.6% of patients: MTX by 29.2% and SSZ by 21.7%, while 22.7% used both drugs. The use of MTX was significantly associated with peripheral involvement and SSZ with axial involvement, and the two drugs were administered more often, separately or in combination, in mixed involvement (p < 0.001). The use of a DMD was significantly associated with Caucasian ethnicity (MTX, p = 0.014), inflammatory back pain (SSZ, p = 0.002), buttock pain (SSZ, p = 0.030), neck pain (MTX, p = 0.042), arthritis of the lower limbs (MTX, p < 0.001), arthritis of the upper limbs (MTX, p < 0.001), enthesitis (p = 0.007), dactylitis (MTX, p < 0.001), inflammatory bowel disease (SSZ, p < 0.001) and nail involvement (MTX, p < 0.001). The use of at least one DMD was reported by more than 70% of patients in this large cohort of Brazilian patients with SpA, with MTX use more associated with peripheral involvement and SSZ use more associated with axial involvement.

Relevance: 20.00%

Abstract:

This paper examines the spatial pattern of ill-defined causes of death across Brazilian regions and its relationship with the evolution of death-registration completeness and changes in the mortality age profile. We make use of the Brazilian Health Informatics Department mortality database and population censuses from 1980 to 2010. We applied demographic methods to evaluate the quality of mortality data for 137 small areas and to correct for under-registration of death counts where necessary. The second part of the analysis uses linear regression models to investigate the relationship between changes in death-count coverage and in the age profile of mortality, on the one hand, and changes in the reporting of ill-defined causes of death, on the other. The completeness of death-count coverage increased from about 80% in 1980-1991 to over 95% in 2000-2010, while the percentage of ill-defined causes of death fell by about 53% in the country. The analysis suggests that the government's efforts to improve data quality are proving successful and will allow for a better understanding of the dynamics of health and the mortality transition.
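
A stylized sketch of the growth-balance logic behind death-registration completeness estimates, here in a simplified stationary-population variant (the paper's demographic methods are richer): in a stationary population, the rate of entry into ages x+ equals the true death rate above x, so regressing the entry rate on the observed death rate recovers 1/completeness as the slope. The Gompertz mortality parameters and the true completeness are invented for the demo.

```python
# Simplified growth-balance completeness estimate (stationary population).
import numpy as np

ages = np.arange(0, 91)
hazard = 0.0003 * np.exp(0.09 * ages)          # Gompertz mortality (assumed)
surv = np.exp(-np.cumsum(np.concatenate([[0.0], hazard[:-1]])))   # l(x)

open_ages = np.arange(5, 76, 5)
N_above = np.array([surv[x:].sum() for x in open_ages])   # person-years x+
entry_rate = surv[open_ages] / N_above                    # b(x+)
true_death_rate = surv[open_ages] / N_above               # d(x+) = b(x+) here

completeness = 0.8                                        # assumed truth
obs_death_rate = completeness * true_death_rate           # under-registration

slope, intercept = np.polyfit(obs_death_rate, entry_rate, 1)
print(f"estimated completeness: {1 / slope:.3f}")         # recovers ~0.800
```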

Relevance: 20.00%

Abstract:

Health economic evaluations require estimates of expected survival from patients receiving different interventions, often over a lifetime. However, data on the patients of interest are typically only available for a much shorter follow-up time, from randomised trials or cohorts. Previous work showed how to use general population mortality to improve extrapolations of the short-term data, assuming a constant additive or multiplicative effect on the hazards for all-cause mortality for study patients relative to the general population. A more plausible assumption may be a constant effect on the hazard for the specific cause of death targeted by the treatments. To address this problem, we use independent parametric survival models for cause-specific mortality among the general population. Because causes of death are unobserved for the patients of interest, a polyhazard model is used to express their all-cause mortality as a sum of latent cause-specific hazards. Assuming proportional cause-specific hazards between the general and study populations then allows us to extrapolate mortality of the patients of interest to the long term. A Bayesian framework is used to jointly model all sources of data. By simulation, we show that ignoring cause-specific hazards leads to biased estimates of mean survival when the proportion of deaths due to the cause of interest changes through time. The methods are applied to an evaluation of implantable cardioverter defibrillators for the prevention of sudden cardiac death among patients with cardiac arrhythmia. After accounting for cause-specific mortality, substantial differences are seen in estimates of life years gained from implantable cardioverter defibrillators.
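
A toy numerical illustration of the point about cause-specific versus all-cause effects (the hazard shapes and hazard ratio below are invented, not the paper's fitted models): the same treatment effect is applied either to all-cause mortality or only to the cardiac cause, and mean survival, the area under the survival curve, is compared.

```python
# Mean survival under all-cause vs cause-specific hazard ratios (toy model).
import numpy as np

t = np.linspace(0, 60, 6001)                    # years
h_cardiac = 0.02 * np.exp(0.05 * t)             # cause of interest (assumed)
h_other = np.full_like(t, 0.03)                 # competing causes (assumed)
hr = 0.6                                        # treatment hazard ratio

def mean_survival(hazard):
    dt = t[1] - t[0]
    cumhaz = np.cumsum(hazard) * dt             # crude cumulative hazard
    return float(np.sum(np.exp(-cumhaz)) * dt)  # area under S(t)

base = mean_survival(h_cardiac + h_other)
all_cause = mean_survival(hr * (h_cardiac + h_other))
cause_spec = mean_survival(hr * h_cardiac + h_other)
print(f"baseline {base:.2f}y; HR on all causes {all_cause:.2f}y; "
      f"HR on cardiac cause only {cause_spec:.2f}y")
```

The gap between the last two numbers is the bias the paper warns about when the proportion of deaths due to the cause of interest changes over time.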

Relevance: 20.00%

Abstract:

Often in biomedical research we deal with continuous (clustered) proportion responses, ranging between zero and one, that quantify the disease status of the cluster units. The study population might also consist of relatively disease-free as well as highly diseased subjects, contributing proportion values in the closed interval [0, 1]. Regression on a variety of parametric densities with support in (0, 1), such as beta regression, can assess important covariate effects, but these models are inappropriate in the presence of zeros and/or ones. To circumvent this, we introduce a general proportion density class and further augment it with point probabilities at zero and one, controlling for the clustering. Our approach is Bayesian and presents a computationally convenient framework amenable to available freeware. Bayesian case-deletion influence diagnostics based on q-divergence measures are automatic from the Markov chain Monte Carlo output. The methodology is illustrated using both simulation studies and an application to a real dataset from a clinical periodontology study.
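
A minimal sketch of a zero-one-augmented likelihood for proportion data, using a plain beta density as a simplified stand-in for the paper's general proportion density and ignoring the clustering and the full Bayesian machinery: the response is 0 with probability p0, 1 with probability p1, and otherwise drawn from a beta density rescaled by 1 - p0 - p1.

```python
# Zero-one-augmented beta log-likelihood (simplified stand-in).
import numpy as np
from scipy.stats import beta

def loglik(y, p0, p1, a, b):
    y = np.asarray(y, float)
    ll = np.empty_like(y)
    ll[y == 0.0] = np.log(p0)                       # point mass at zero
    ll[y == 1.0] = np.log(p1)                       # point mass at one
    mid = (y > 0.0) & (y < 1.0)                     # continuous part
    ll[mid] = np.log(1.0 - p0 - p1) + beta.logpdf(y[mid], a, b)
    return float(ll.sum())

y = [0.0, 0.12, 0.4, 0.55, 1.0, 0.0, 0.9]
print(loglik(y, p0=0.2, p1=0.1, a=2.0, b=2.0))
```

In the paper's regression setting, p0, p1 and the density parameters would depend on covariates and cluster-level effects.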

Relevance: 20.00%

Abstract:

The aim of this study was to estimate barite mortar attenuation curves using X-ray spectra weighted by a workload distribution. A semi-empirical model was used to evaluate the transmission properties of this material. Since the ambient dose equivalent, H*(10), is the radiation quantity adopted by the IAEA for dose assessment, the variation of H*(10) as a function of barite mortar thickness was calculated using primary experimental spectra, measured with a CdTe detector. The resulting spectra were used to estimate the optimal thickness of the protective barrier needed to shield an area in an X-ray imaging facility.
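
A schematic sketch of how a spectrum-weighted transmission curve can be computed (all numbers are placeholders, not barite or workload data): each energy bin is attenuated by exp(-mu(E) * x), the transmitted spectrum is weighted by per-fluence dose-equivalent conversion coefficients, and the ratio to the unshielded value gives the transmission factor.

```python
# H*(10) transmission vs shielding thickness (placeholder numbers).
import numpy as np

energy = np.array([30.0, 50.0, 70.0, 90.0, 110.0])      # keV bins
fluence = np.array([1.0, 3.0, 4.0, 2.5, 1.0])           # relative spectrum
mu = np.array([3.0, 1.2, 0.6, 0.35, 0.25])              # 1/cm, placeholder
h10_coeff = np.array([0.5, 1.2, 1.6, 1.8, 1.9])         # pSv*cm^2, placeholder

def transmission(x_cm):
    dose0 = np.sum(fluence * h10_coeff)                     # unshielded
    dose_x = np.sum(fluence * np.exp(-mu * x_cm) * h10_coeff)
    return dose_x / dose0

for x in (0.5, 1.0, 2.0):
    print(f"{x:.1f} cm: H*(10) transmission = {transmission(x):.4f}")
```

Beam hardening emerges naturally from this formulation: low-energy bins are removed first, so the curve flattens with thickness.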

Relevance: 20.00%

Abstract:

Primary X-ray spectra were measured in the range of 80-150 kV in order to validate a computer program based on a semi-empirical model. The ratio between the characteristic and total air kerma was used to compare computed results with experimental data. Results show that the experimental spectra have a higher first HVL and mean energy than the calculated ones. The ratios between the characteristic and total air kerma for the calculated spectra are in good agreement with experimental results for all filtrations used.
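
A small sketch of how the two comparison quantities, mean energy and first HVL, can be computed from a tabulated spectrum. The spectrum is invented, the aluminium attenuation coefficients are rough approximate values, and the kerma weighting (proportional to energy times fluence) is deliberately crude.

```python
# Mean energy and first HVL from a tabulated spectrum (placeholder data).
import numpy as np

energy = np.array([40.0, 60.0, 80.0, 100.0, 120.0, 140.0])   # keV
fluence = np.array([0.5, 2.0, 3.0, 2.0, 1.0, 0.4])           # relative
mu_al = np.array([1.53, 0.75, 0.54, 0.46, 0.42, 0.39])       # 1/cm, approx.

mean_energy = np.sum(fluence * energy) / np.sum(fluence)

def kerma(x_cm):
    # Crude proxy: kerma ~ fluence * energy, attenuated by x cm of Al.
    return np.sum(fluence * np.exp(-mu_al * x_cm) * energy)

# Bisection for the thickness that halves the air kerma.
lo, hi = 0.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if kerma(mid) > 0.5 * kerma(0.0) else (lo, mid)
print(f"mean energy {mean_energy:.1f} keV, first HVL ~ {lo:.3f} cm Al")
```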

Relevance: 20.00%

Abstract:

Size distributions in woody plant populations have been used to assess their regeneration status, assuming that size structures with reverse-J shapes represent stable populations. We present an empirical approach to this issue using five woody species from the Cerrado. Considering count data for all plants of these five species over a 12-year period, we analyzed size distribution by (a) plotting frequency distributions and their adjustment to the negative exponential curve and (b) calculating the Gini coefficient. To look for a relationship between size structure and future trends, we considered the size structures from the first census year. We analyzed changes in numbers over time and performed a simple population viability analysis, which gives the mean population growth rate, its variance, and the probability of extinction within a given time period. Neither the frequency distributions nor the Gini coefficient were able to predict future trends in population numbers. We therefore recommend that managers not use measures of size structure as a basis for management decisions without more appropriate demographic studies.
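
A short sketch of the two descriptors used in the study, computed on synthetic plant sizes rather than the Cerrado data: the Gini coefficient of individual sizes, and a fit of the negative exponential curve to size-class counts.

```python
# Gini coefficient and negative exponential fit for a size distribution.
import numpy as np

rng = np.random.default_rng(3)
sizes = rng.exponential(scale=2.0, size=300)      # stand-in diameters

def gini(x):
    """Gini coefficient via the sorted-rank formula."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

# Negative exponential fit: log counts per size class vs class midpoint.
counts, edges = np.histogram(sizes, bins=8)
mids = 0.5 * (edges[:-1] + edges[1:])
ok = counts > 0
slope, intercept = np.polyfit(mids[ok], np.log(counts[ok]), 1)
print(f"Gini = {gini(sizes):.3f}; exponential decay rate = {-slope:.3f}")
```

A steep decay rate and a reverse-J histogram are exactly the patterns the study shows to be unreliable predictors of future population trends.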

Relevance: 20.00%

Abstract:

The relationship between aggressiveness and peer acceptance-rejection was analyzed in 1281 elementary school children. The sociometric measure was based on three positive and three negative classmate choices. The aggressiveness scales gathered information about family, school, and general situations. There were no statistically significant differences among the schools in the sociometric and family-situation aggressiveness measures. On the school and general aggressiveness scales, the schools formed two sub-groups: two schools showed less aggressiveness and the other two more. School aggressiveness showed significant correlations with sociometric status in all schools, and general aggressiveness in one of them, suggesting that the greater the social acceptance, the lower the student's aggressiveness. As the correlations were low, extreme groups in the acceptance-rejection conditions were studied, and the school and general aggressiveness scales differentiated these groups in only one school.

Relevance: 20.00%

Abstract:

Handball is a sport that demands endurance associated with fast and powerful actions such as jumps, blocks, sprints and throws. The aim of this study was to evaluate the effects of 38 weeks of systematic physical training applied to a women's under-21 handball team on upper- and lower-limb power, 30-m sprint speed and endurance. The periodization applied was an adaptation of the Verkhoshansky theory and aimed at two performance peaks during the season, with six data collections. The median and range values for the 3-kg medicine ball throw were: 2.98 m (2.15-3.50); 2.84 m (2.43-3.20); 2.90 m (2.60-3.38); 3.10 m (2.83-3.81); 2.84 m (2.55-3.57) and 3.34 m (2.93-3.83). For the three-pass running test: 5.60 m (4.93-6.58); 5.37 m (5.04-6.38); 5.36 m (4.93-6.12); 5.65 m (4.80-6.78); 5.63 m (5.00-6.40) and 5.83 m (5.14-6.05). For the 30-m sprint test: 5.8 m/s (5.45-6.44); 6.64 m/s (6.24-7.09); 5.65 m/s (5.17-5.95); (there was no fourth data collection for this test); 6.19 m/s (5.57-6.26) and 5.83 m/s (5.14-6.05). For the 30-m sprint endurance test, up to a 10% decrease in performance: 4 sprints (4-6); 5 sprints (4-9); 4.5 sprints (4-16); (there was no fourth data collection for this test); 6 sprints (4-12) and 5 sprints (4-5). Significant differences (p < 0.05) were observed in the 3-kg medicine ball throw and three-pass running tests in at least one of the planned performance peaks, with no significant differences in the 30-m sprint speed or endurance tests. The applied physical training was effective at improving specific physical fitness at the performance peaks, as well as providing support for better adjustment of physical training for the upcoming season.

Relevance: 20.00%

Abstract:

The purpose of this article is to introduce elements that allow building an interface between academic research and programs of basic education for youths and adults. It discusses contributions to these programs that can be found in the results of qualitative research studies. To this end, results of a five-year project on teacher education are used, whose aim was to analyze the interaction between teacher and student in youth and adult literacy classes. The research project was conducted in natural contexts, with the purpose of understanding a given social reality rather than establishing general laws. Therefore, the credibility of its results was built through the observation of multiple contexts, and data were gathered through various methods, from the perspective of several participants observed over a prolonged period of time. This empirical basis was used to evaluate recommendations contained in the report commissioned by UNESCO from the International Literacy Institute for presentation at the World Education Forum held in 2000 in Dakar. This report proposed that the continuous attendance of students in basic education programs is one of the great challenges of the new millennium. With respect to the problem of adult dropout from courses and programs, the article discusses the motivation and accessibility factors pointed out in official documents as relevant to the success or failure of the programs.