852 results for Initial data problem
Abstract:
This paper proposes a pose-based algorithm to solve the full SLAM problem for an autonomous underwater vehicle (AUV) navigating in an unknown and possibly unstructured environment. The technique incorporates probabilistic scan matching with range scans gathered from a mechanical scanning imaging sonar (MSIS) and the robot's dead-reckoning displacements estimated from a Doppler velocity log (DVL) and a motion reference unit (MRU). The proposed method uses two extended Kalman filters (EKF). The first estimates the local path travelled by the robot while grabbing the scan, together with its uncertainty, and provides position estimates for correcting the distortions that the vehicle motion produces in the acoustic images. The second is an augmented-state EKF that estimates and keeps the poses of the registered scans. The raw data from the sensors are processed and fused online. No prior structural information or initial pose is assumed. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, showing the viability of the proposed approach.
Abstract:
Majolica pottery was the most characteristic tableware produced in Europe during the Medieval and Renaissance periods. Because of the prestige and importance attributed to this ware, Spanish majolica was imported in vast quantities into the Americas during the Spanish Colonial period. A study of Spanish majolica was conducted on a set of 186 samples from the 10 primary majolica production centres on the Iberian Peninsula and 22 sherds from two early colonial archaeological sites on the Canary Islands. The samples were analysed by neutron activation analysis (NAA), and the resulting data were interpreted using an array of multivariate statistical approaches. Our results show a clear discrimination between different production centres, allowing a reliable provenance attribution of the sherds from the Canary Islands.
Abstract:
The Catalan Research Portal (Portal de la Recerca de Catalunya, or PRC) is an initiative carried out by the Consortium for University Services in Catalonia (CSUC) in coordination with nearly all universities in Catalonia. The Portal will provide an online, CERIF-compliant collection of all research outputs produced by Catalan HEIs, together with appropriate contextual information describing the specific environment in which each output was generated (such as researchers, research groups and research projects). The initial emphasis of the PRC approach to research outputs will be on publications, but other outputs such as patents, and eventually research data, will be addressed as well. These guidelines provide information for PRC data providers on how to expose and exchange their research information metadata in a CERIF-XML-compatible structure, thus allowing them not just to exchange validated CERIF XML data with the PRC platform, but also to improve their general interoperability by being able to deliver CERIF-compatible outputs.
Abstract:
This paper sets out to identify the initial positions of the different decision makers who intervene in a group decision-making process with a reduced number of actors, and to establish possible consensus paths between these actors. As methodological support, it employs one of the most widely known multicriteria decision techniques, namely the Analytic Hierarchy Process (AHP). Assuming that the judgements elicited by the decision makers follow the so-called multiplicative model (Crawford and Williams, 1985; Altuzarra et al., 1997; Laininen and Hämäläinen, 2003) with log-normal errors and unknown variance, a Bayesian approach is used in the estimation of the relative priorities of the alternatives being compared. These priorities, estimated by way of the median of the posterior distribution and normalised in a distributive manner (priorities add up to one), are a clear example of compositional data that will be used in the search for consensus between the actors involved in the resolution of the problem through the use of Multidimensional Scaling tools.
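Under the multiplicative model with log-normal errors, the classical point estimate of the priorities is the normalised row geometric mean of the pairwise comparison matrix. A minimal sketch of that estimator (the paper's Bayesian posterior-median estimator is more involved; the judgement matrix below is invented for illustration):

```python
import numpy as np

def gm_priorities(A):
    """Row geometric-mean priorities for a pairwise comparison matrix A,
    normalised distributively so that the priorities add up to one. This is
    the classical point estimate under the multiplicative (log-normal error)
    model; a Bayesian analysis would instead use the posterior median."""
    w = np.exp(np.log(A).mean(axis=1))   # geometric mean of each row
    return w / w.sum()                   # distributive normalisation

# Invented, perfectly consistent judgement matrix for three alternatives
A = np.array([[1.0, 2.0, 4.0],
              [0.5, 1.0, 2.0],
              [0.25, 0.5, 1.0]])
p = gm_priorities(A)                     # → [4/7, 2/7, 1/7]
```

For a consistent matrix, as here, the geometric-mean priorities reproduce the underlying ratio scale exactly; with noisy judgements they remain the maximum-likelihood estimate under the multiplicative model.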
Abstract:
One main assumption in the theory of rough sets applied to information tables is that elements that exhibit the same information are indiscernible (similar) and form blocks that can be understood as elementary granules of knowledge about the universe. We propose a variant of this concept, defining a measure of similarity between the elements of the universe in order to consider that two objects can be indiscernible even though they do not share all attribute values, because the knowledge is partial or uncertain. The set of similarities defines the matrix of a fuzzy relation satisfying reflexivity and symmetry but not transitivity, so a partition of the universe is not attained. This problem can be solved by computing the transitive closure of the relation, which ensures a partition for each level in the unit interval [0,1]. This procedure allows the theory of rough sets to be generalized according to the minimum level of similarity accepted. This new point of view increases the rough character of the data, because it enlarges the set of indiscernible objects. Finally, we apply our results to a synthetic example in order to highlight the differences and improvements between this methodology and the classical one.
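The closure step can be sketched directly: max-min composition is iterated until the relation stabilises, and each α-cut of the closed relation then yields a genuine partition. A minimal illustration (the similarity matrix is invented for the example):

```python
import numpy as np

def transitive_closure(R, tol=1e-9):
    """Max-min transitive closure of a reflexive, symmetric fuzzy relation."""
    T = R.copy()
    while True:
        # max-min composition: C[i, j] = max_k min(T[i, k], T[k, j])
        C = np.max(np.minimum(T[:, :, None], T[None, :, :]), axis=1)
        C = np.maximum(T, C)           # the closure only ever grows
        if np.allclose(C, T, atol=tol):
            return C
        T = C

def alpha_cut_partition(T, alpha):
    """Classes of the crisp relation T >= alpha; a partition is guaranteed
    because the closed relation is reflexive, symmetric and transitive."""
    n = T.shape[0]
    labels, cls = [-1] * n, 0
    for i in range(n):
        if labels[i] == -1:
            for j in range(n):
                if T[i, j] >= alpha:
                    labels[j] = cls
            cls += 1
    return labels

# Invented similarity matrix for three objects
R = np.array([[1.0, 0.8, 0.4],
              [0.8, 1.0, 0.5],
              [0.4, 0.5, 1.0]])
T = transitive_closure(R)              # T[0, 2] rises from 0.4 to 0.5
coarse = alpha_cut_partition(T, 0.5)   # one block: [0, 0, 0]
fine = alpha_cut_partition(T, 0.7)     # two blocks: [0, 0, 1]
```

Lowering α coarsens the partition, which is exactly how the minimum accepted similarity level controls the roughness of the resulting granules.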
Abstract:
Background: None of the HIV T-cell vaccine candidates that have reached advanced clinical testing have been able to induce protective T cell immunity. A major reason for these failures may have been suboptimal T cell immunogen designs. Methods: To overcome this problem, we used a novel immunogen design approach that is based on functional T cell response data from more than 1,000 HIV-1 clade B and C infected individuals and that aims to direct the T cell response to the most vulnerable sites of HIV-1. Results: Our approach identified 16 regions in Gag, Pol, Vif and Nef that were relatively conserved and predominantly targeted by individuals with reduced viral loads. These regions formed the basis of the HIVACAT T-cell Immunogen (HTI) sequence, which is 529 amino acids in length, includes more than 50 optimally defined CD4+ and CD8+ T-cell epitopes restricted by a wide range of HLA class I and II molecules, and covers viral sites where mutations led to a dramatic reduction in viral replicative fitness. In both C57BL/6 mice and Indian rhesus macaques, immunization with an HTI-expressing DNA plasmid (DNA.HTI) induced broad and balanced T-cell responses to several segments within Gag, Pol, and Vif. DNA.HTI induced robust CD4+ and CD8+ T cell responses that were increased by a booster vaccination using modified vaccinia Ankara (MVA.HTI), expanding the DNA.HTI-induced response to up to 3.2% IFN-γ+ T cells in macaques. HTI-specific T cells showed a central and effector memory phenotype, with a significant fraction of the IFN-γ+ CD8+ T cells being Granzyme B+ and able to degranulate (CD107a+). Conclusions: These data demonstrate the immunogenicity of a novel HIV-1 T cell vaccine concept that induced broadly balanced responses to vulnerable sites of HIV-1 while avoiding the induction of responses to potential decoy targets that may divert effective T-cell responses towards variable and less protective viral determinants.
Abstract:
Methane combustion was studied using the Westbrook and Dryer model. This well-established simplified mechanism is very useful in combustion science, since it notably reduces computational effort. In the inversion procedure studied here, rate constants are obtained from [CO] concentration data. However, when inherent experimental errors in the chemical concentrations are considered, an ill-conditioned inverse problem must be solved, for which appropriate mathematical algorithms are needed. A recurrent neural network was chosen due to its numerical stability and robustness. The proposed methodology was compared against Simplex and Levenberg-Marquardt, the methods most widely used for optimization problems.
Abstract:
From an organizational point of view, the implementation of an information system begins with the idea of a system and ends when its use no longer requires a conscious effort. Understood this way, implementation is a slow and complicated process that affects all parts of the organization. New information technology is considered to affect most work processes and the organization of daily work. There are many ways to adopt and exploit such a system. The thesis examines the implementation of a system for administering home care visits, in which home care workers used handheld computers to record information about the length and content of the visits. The thesis observes which changes occur in work practice because of the new system and how these changes affect care work. The research begins by structuring theories of work practice for the subsequent analysis. Work practice consists of established, routine ways of working within the sociomaterial environment of the work. In the thesis, work practice means the home care worker's praxis and lived experience, where activity is informed by shared ways of working, projects, identities and interests. The organization's authority also manifests itself in the realized work practice. The research was carried out as a longitudinal ethnographic study during the years 2001-2004. The study observed how the use of the handheld computers progressed from an organizational perspective. The home care workers' work and activities (work practice) were observed both during care visits and during breaks. In addition, the home care workers were interviewed to gain a better understanding of the rationalities that govern the work and of how the system was taken into use. Documents related to the project of introducing the new system, as well as administrative documents, were used as source material. The analysis of the source material was guided by the theoretical approach of studying work practice.
Problems identified in connection with the introduction of the system, and the changes it brought about, were analysed in detail. In parallel, organizational power, control and work identity were analysed. The study describes how the new system was gradually adapted to home care after initial resistance. During the implementation itself, earlier work practice and attitudes toward it were called into question, as the material environment of the work practice changed. The theoretical approach of studying work practice highlights the care worker's agency in the change process. The results of the research show the importance of realistic goals, of providing group support with feedback, and of the ability to adapt to the unexpected when introducing information systems.
Abstract:
Visual data mining (VDM) tools employ information visualization techniques in order to represent large amounts of high-dimensional data graphically and to involve the user in exploring the data at different levels of detail. Users look for outliers, patterns and models – in the form of clusters, classes, trends, and relationships – in different categories of data, e.g., financial or business information. The focus of this thesis is the evaluation of multidimensional visualization techniques, especially from the business user's perspective. We address three research problems. The first problem is the evaluation of projection-based visualizations with respect to their effectiveness in preserving the original distances between data points and the clustering structure of the data. In this respect, we propose the use of existing clustering validity measures. We illustrate their usefulness in evaluating five visualization techniques: Principal Components Analysis (PCA), Sammon's Mapping, the Self-Organizing Map (SOM), Radial Coordinate Visualization and Star Coordinates. The second problem is concerned with evaluating different visualization techniques as to their effectiveness in visual data mining of business data. For this purpose, we propose an inquiry evaluation technique and conduct the evaluation of nine visualization techniques. The visualizations under evaluation are Multiple Line Graphs, Permutation Matrix, Survey Plot, Scatter Plot Matrix, Parallel Coordinates, Treemap, PCA, Sammon's Mapping and the SOM. The third problem is the evaluation of the quality of use of VDM tools. We provide a conceptual framework for evaluating the quality of use of VDM tools and apply it to the evaluation of the SOM. In the evaluation, we use an inquiry technique for which we developed a questionnaire based on the proposed framework. The contributions of the thesis consist of three new evaluation techniques and the results obtained by applying these evaluation techniques.
The thesis provides a systematic approach to evaluation of various visualization techniques. In this respect, first, we performed and described the evaluations in a systematic way, highlighting the evaluation activities, and their inputs and outputs. Secondly, we integrated the evaluation studies in the broad framework of usability evaluation. The results of the evaluations are intended to help developers and researchers of visualization systems to select appropriate visualization techniques in specific situations. The results of the evaluations also contribute to the understanding of the strengths and limitations of the visualization techniques evaluated and further to the improvement of these techniques.
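One concrete way to score how well a projection preserves the original pairwise distances is Sammon's stress, which is 0 for perfect preservation and grows with distortion. This is only an illustrative sketch of the distance-preservation side of the evaluation (the thesis's clustering validity measures are a different family); the data below are invented:

```python
import numpy as np

def sammon_stress(X, Y):
    """Sammon's stress between the original data X and a low-dimensional
    embedding Y: a weighted sum of squared distance distortions, equal to
    0 when all pairwise distances are preserved exactly."""
    def pairwise(A):
        D = np.sqrt(((A[:, None, :] - A[None, :, :]) ** 2).sum(-1))
        return D[np.triu_indices(len(A), 1)]   # upper triangle, no diagonal
    dx, dy = pairwise(X), pairwise(Y)
    return ((dx - dy) ** 2 / dx).sum() / dx.sum()

# Invented 3-D data; projecting onto the first two coordinates discards the
# variation carried by the third coordinate, so distances are distorted.
X = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 2.0],
              [0.0, 1.0, 0.0],
              [1.0, 1.0, 2.0]])
perfect = sammon_stress(X, X)        # 0.0: identity preserves everything
lossy = sammon_stress(X, X[:, :2])   # > 0: the projection distorts distances
```

Ranking candidate projections (PCA, Sammon's Mapping, SOM, ...) by such a score, alongside clustering validity measures, is the kind of quantitative comparison the first research problem calls for.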
Abstract:
Eucalyptus offers several advantages compared to other forestry species, and irrigation can increase its productivity and decrease production time. The objective of the present study was to evaluate two eucalyptus hybrids (Grancam and Urograndis) under no irrigation, drip irrigation and micro-sprinkler irrigation at 90, 120, 150 and 180 days after transplanting (DAT). The experiment was conducted at the experimental irrigation area of the State University of Mato Grosso do Sul, in the municipality of Aquidauana, State of MS, Brazil. The experimental design was randomized blocks in a split-plot arrangement, with four blocks and two replications within each block; the plots were composed of the irrigation treatments (drip, micro-sprinkler and dry) and the subplots of the hybrids (Grancam and Urograndis). The total area of the experiment was 3 hectares, where each plot consisted of 1 ha. Plant height, stem diameter, canopy diameter, stem basal area, the ratio between height and stem diameter, the ratio between height and canopy diameter, and stem volume were evaluated. Data were subjected to analysis of variance and compared by Tukey's test at 5% probability. The drip and micro-sprinkler irrigation systems provided greater plant height, stem diameter, canopy diameter, stem basal area and stem volume.
Abstract:
The aim of this study was to investigate the diagnosis delay and its impact on the stage of disease. The study also evaluated nuclear DNA content and the immunohistochemical expression of Ki-67 and bcl-2, and the correlation of these biological features with the clinicopathological features and patient outcome. 200 Libyan women, diagnosed during 2008–2009, were interviewed about the period from the first symptoms to the final histological diagnosis of breast cancer. Retrospective preclinical and clinical data were also collected from medical records on a form (questionnaire) in association with the interview. Tumour material of the patients was collected and nuclear DNA content analysed using DNA image cytometry. The expression of Ki-67 and bcl-2 was assessed using immunohistochemistry (IHC). The studies described in this thesis show that the median diagnosis time for women with breast cancer was 7.5 months, and that 56% of patients were diagnosed within a period longer than 6 months. Inappropriate reassurance that the lump was benign was an important reason for prolongation of the diagnosis time. Diagnosis delay was also associated with initial breast symptom(s) that did not include a lump, old age, illiteracy, and a history of benign fibrocystic disease. The patients who showed diagnosis delay had larger tumour size (p<0.0001), positive lymph nodes (p<0.0001), and a high incidence of late clinical stages (p<0.0001). Biologically, 82.7% of tumours were aneuploid and 17.3% were diploid. The median SPF of tumours was 11%, while the median positivity of Ki-67 was 27.5%. High Ki-67 expression was found in 76% of patients, and high SPF values in 56% of patients. Positive bcl-2 expression was found in 62.4% of tumours; 72.2% of the bcl-2 positive samples were ER-positive. Tumours with DNA aneuploidy, high proliferative activity and negative bcl-2 expression were associated with a high grade of malignancy and short survival.
The SPF value is a useful cell proliferation marker in assessing prognosis, and the decision cut point of 11% for SPF in the Libyan material was clearly significant (p<0.0001). Bcl-2 is a powerful prognosticator and an independent predictor of breast cancer outcome in the Libyan material (p<0.0001). Libyan breast cancer was investigated in these studies from two different aspects: health services and biology. The results show that diagnosis delay is a very serious problem in Libya and is associated with complex interactions between many factors, leading to advanced stages and potentially to high mortality. Cytometric DNA variables, proliferative markers (Ki-67 and SPF), and bcl-2 oncoprotein negativity reflect the aggressive behaviour of Libyan breast cancer and could be used together with traditional factors to predict the outcome of individual patients and to select appropriate therapy.
Abstract:
Longitudinal surveys are increasingly used to collect event history data on person-specific processes, such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis. These challenges include survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility to use combined longitudinal survey-register data. The Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the missingness mechanism type and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data.
Neither the Missing At Random (MAR) assumption about non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Both measurement errors in spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of baseline hazard most. The design-based estimates based on data from respondents to all waves of interest and weighted by the last wave weights displayed the largest bias. Using all the available data, including the spells by attriters until the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazard model estimators. The study discusses implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
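The IPCW idea behind the simulation study can be illustrated with a weighted Kaplan-Meier estimator: each subject contributes its inverse-probability-of-censoring weight to the weighted event and at-risk counts. A minimal sketch with invented spell data (with unit weights it reduces to the ordinary Kaplan-Meier estimator):

```python
import numpy as np

def weighted_km(time, event, w):
    """Kaplan-Meier survival curve with subject-level weights w; with IPCW
    weights this corrects for censoring that depends on observed covariates.
    Returns (event time, survival estimate) pairs."""
    surv, S = [], 1.0
    for t in np.unique(time[event == 1]):
        at_risk = w[time >= t].sum()                  # weighted risk set at t
        deaths = w[(time == t) & (event == 1)].sum()  # weighted events at t
        S *= 1.0 - deaths / at_risk
        surv.append((t, S))
    return surv

# Invented spell data: durations and event indicator (1 = exit, 0 = censored)
time = np.array([1.0, 2.0, 3.0, 4.0])
event = np.array([1, 1, 0, 1])
km = weighted_km(time, event, np.ones(4))   # ordinary KM: 0.75, 0.5, 0.0
```

Replacing the unit weights with estimated inverse probabilities of remaining uncensored is the correction whose bias-reducing effect the simulation study documents for Kaplan-Meier and Cox estimators.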
Abstract:
Prediction means estimating the future value of an observable quantity. Characteristic of the Bayesian paradigm is that uncertainty about unknown quantities is expressed in the form of probabilities. A Bayesian predictive model is thus a probability distribution over the possible values that an observable, but not yet observed, quantity can take. The articles included in the thesis develop methods that are applied, among other things, to the analysis of chromatographic data in criminal investigations. With the exception of the first article, all the methods build on Bayesian predictive modelling. The articles mainly address three types of problems related to chromatographic data: quantification, pairwise matching and clustering. The first article develops a non-parametric model for the measurement error of chromatographic analyses of blood alcohol content. The second article develops a predictive inference method for comparing two samples. The method is applied in the third article to the comparison of oil samples, with the aim of identifying the polluting source in connection with oil spills. In the fourth article, a predictive model is derived for clustering data of mixed discrete and continuous type, which is applied, among other things, to the classification of amphetamine samples with respect to production batches.
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Teaching, research, and herd breeding applications may require calculation of breed additive contributions for direct and maternal genetic effects, and of the fractions of heterozygosity associated with breed-specific direct and maternal heterosis effects. These coefficients can be obtained from the first NB rows of a pseudo numerator relationship matrix, where the first NB rows represent fractional contributions by breed to each animal or group representing a specific breed cross. The table begins with an NB x NB identity matrix representing the pure breeds. Initial animals or representative crosses must be purebreds or two-breed crosses. Parents of initial purebreds are represented by the corresponding column, and initial two-breed cross progeny by the two corresponding columns, of the identity matrix. After that, the usual rules are used to calculate the NB column entries corresponding to breeds for each animal. The NB entries are the fractions of genes expected to be contributed by each of the pure breeds and correspond to the breed additive direct fractions. Entries in the column corresponding to the dam represent the breed additive maternal fractions. Breed-specific direct heterozygosity coefficients are the entries of an NB x NB matrix formed by the outer product of the two NB by 1 columns associated with the sire and dam of the animal. One minus the sum of the diagonal entries represents total direct heterozygosity. Similarly, the NB x NB matrix formed by the outer product of the columns associated with the sire of the dam and the dam of the dam contains the breed-specific maternal heterozygosity coefficients. These steps can be programmed to create covariates to merge with data. If X represents these coefficients for all unique breed crosses, then the reduced row echelon form function of MATLAB or SAS can be used on X to determine estimable functions of additive breed direct and maternal effects and breed-specific direct and maternal heterosis effects.
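The tabular rules above can be sketched for a hypothetical two-breed pedigree (the abstract mentions MATLAB or SAS; Python is used here, and the breeds and matings are invented for illustration):

```python
import numpy as np

# Hypothetical two-breed illustration (breeds A and B): each animal carries
# an NB-vector of expected breed gene fractions, starting from the rows of
# the NB x NB identity matrix for the pure breeds.
A = np.array([1.0, 0.0])   # purebred A row
B = np.array([0.0, 1.0])   # purebred B row

def additive_fractions(sire, dam):
    """Breed additive direct fractions: the usual averaging rule applied to
    the parents' breed-fraction rows."""
    return 0.5 * (sire + dam)

def direct_heterozygosity(sire, dam):
    """NB x NB breed-specific direct heterozygosity coefficients (outer
    product of the parental rows); total heterozygosity is one minus the
    sum of the diagonal entries."""
    H = np.outer(sire, dam)
    return H, 1.0 - np.trace(H)

f1 = additive_fractions(A, B)            # A x B cross: [0.5, 0.5]
_, h_f1 = direct_heterozygosity(A, B)    # fully heterozygous: 1.0
bc = additive_fractions(A, f1)           # backcross to A: [0.75, 0.25]
_, h_bc = direct_heterozygosity(A, f1)   # total heterozygosity: 0.5
```

The maternal analogues follow the same pattern: the dam's row gives the breed additive maternal fractions, and the outer product of the rows for the sire of the dam and the dam of the dam gives the breed-specific maternal heterozygosity coefficients.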