978 results for Irregular Polygon


Relevance:

10.00%

Publisher:

Abstract:

As the technologies for the fabrication of high-quality microarrays advance rapidly, quantification of microarray data becomes a major task. Gridding is the first step in the analysis of microarray images: it locates the subarrays and the individual spots within each subarray. For accurate gridding of high-density microarray images in the presence of contamination and background noise, precise calculation of parameters is essential. This paper presents an accurate, fully automatic gridding method for locating subarrays and individual spots using the intensity projection profile of the most suitable subimage. The method is capable of processing the image without any user intervention and, unlike many other commercial and academic packages, does not demand any input parameters. According to the results obtained, the accuracy of our algorithm is between 95% and 100% for microarray images with a coefficient of variation of less than two. Experimental results show that the method is capable of gridding microarray images with irregular spots, varying surface intensity distribution and more than 50% contamination.
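The core of the gridding step is the intensity projection profile: rows and columns of spots appear as ridges in the summed intensities, and the gaps between them as valleys. The following minimal Python sketch (not the authors' implementation; the helper name and the fixed min_spacing parameter are illustrative assumptions) locates candidate grid lines as valleys of the profile along one axis.

import numpy as np
from scipy.signal import find_peaks

def grid_lines_from_projection(img, axis=0, min_spacing=8):
    # Estimate grid-line positions along one axis of a microarray image.
    # Spots are assumed brighter than background, so grid lines fall in the
    # low-intensity valleys of the summed projection profile.
    profile = img.sum(axis=axis).astype(float)
    profile = (profile - profile.min()) / (profile.max() - profile.min() + 1e-9)
    valleys, _ = find_peaks(1.0 - profile, distance=min_spacing)   # valleys = peaks of the inverted profile
    return valleys

# Illustrative usage: column boundaries, then row boundaries.
# cols = grid_lines_from_projection(image, axis=0)
# rows = grid_lines_from_projection(image, axis=1)

In the paper the profile is taken from the most suitable subimage and the spacing information is derived from the data itself rather than supplied by the user.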

Relevance:

10.00%

Publisher:

Abstract:

Marine product export plays a pivotal role in the fish export economy of Kerala. The post-WTO period has witnessed a strengthening of the food safety and quality standards applied to food products in the developed countries. In the case of the primary importers, such as the EU, the US and Japan, market actions have far-reaching reverberations and implications for marine product exports from developing nations. The article focuses on Kerala's marine product exports, which had been targeting the markets of the EU, the US and Japan, and the concomitant shift in markets owing to the stringent stipulations under the WTO regime. Despite the overwhelming importance of the EU in the marine product exports of the state, the pronounced influence of irregular components on the quantity and value of marine product exports to the EU in the post-WTO period raises concern. However, the tendencies of market diversification, validated by the forecasts generated for the emerging markets of the SEA, the MEA and others, to an extent allay the pressures on the marine product export sector of the state, which had hitherto relied heavily on the markets of the EU, the US and Japan.
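For readers unfamiliar with the terminology, the "irregular component" is what remains of a time series after its trend and seasonal components are removed. The snippet below only illustrates such a decomposition on an invented quarterly series using statsmodels; it is not the article's data or method.

import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical quarterly export-value series (arbitrary units); the article
# works with Kerala's actual marine product export statistics.
exports = pd.Series(
    [120, 95, 110, 140, 118, 90, 105, 150, 100, 80, 95, 160],
    index=pd.period_range("2001Q1", periods=12, freq="Q").to_timestamp(),
)

decomposition = seasonal_decompose(exports, model="additive", period=4)
irregular = decomposition.resid      # the "irregular component" of the series
print(irregular.dropna())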

Relevance:

10.00%

Publisher:

Abstract:

The study of variable stars is an important topic in modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. Such huge amounts of data require automated methods as well as human experts. This thesis is devoted to the analysis of variable star astronomical time series data and hence belongs to the interdisciplinary field of Astrostatistics. For an observer on earth, stars that show a change in apparent brightness over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermo-nuclear processes, and these stars are generally known as intrinsic variables; in other cases it is due to external processes, such as eclipse or rotation, and these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can be further classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data, which contain time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star. One way to identify the type of a variable star and to classify it is visual inspection of the phased light curve by an expert. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages such as observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters like period, amplitude and phase, as well as some other derived parameters. Of these, the period is the most important parameter, since wrong periods lead to sparse light curves and misleading information. Time series analysis is a method of applying mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. This is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic ray particles.
Many large-scale astronomical surveys such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS provide variable star time series data, even though their primary intention is not variable star observation. The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of the methods can be automated, none of the methods stated above can fully recover the true periods. Wrong period detection can be due to several reasons, such as power leakage to other frequencies caused by the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem, especially when huge databases are subjected to automation. As Matthew Templeton (AAVSO) states, “Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial”. Derekas et al. (2007) and Deb et al. (2010) state, “The processing of huge amounts of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification”. It will be beneficial for the variable star community if basic parameters such as period, amplitude and phase are obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the “General Catalogue of Variable Stars” or other databases like the “Variable Star Index”, the characteristics of the variability have to be quantified in terms of variable star parameters.
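As an illustration of the period-search step discussed above, the sketch below applies the generalised Lomb-Scargle periodogram (one of the parametric methods cited) to a simulated, unevenly sampled light curve and folds the data on the recovered period. It uses astropy's LombScargle implementation; the toy light curve and all numerical settings are assumptions made for the example, not results from the thesis.

import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(0)
# Simulated, unevenly sampled light curve; the thesis works on survey data
# such as MACHO/OGLE/ASAS, this toy series is only for illustration.
t = np.sort(rng.uniform(0.0, 100.0, 300))        # observation times in days
true_period = 2.37
mag = 12.0 + 0.3 * np.sin(2.0 * np.pi * t / true_period) + rng.normal(0.0, 0.02, t.size)

# Generalised Lomb-Scargle periodogram (one of the parametric methods cited).
frequency, power = LombScargle(t, mag).autopower(minimum_frequency=0.01,
                                                 maximum_frequency=5.0)
best_period = 1.0 / frequency[np.argmax(power)]

# Folding on the recovered period gives the phased light curve.
phase = (t / best_period) % 1.0
print(f"recovered period: {best_period:.3f} d (true: {true_period} d)")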

Relevance:

10.00%

Publisher:

Abstract:

Heterochromatin Protein 1 (HP1) is an evolutionarily conserved protein required for the formation of higher-order chromatin structures and epigenetic gene silencing. The objective of the present work was to functionally characterise HP1-like proteins in Dictyostelium discoideum and to investigate their function in heterochromatin formation and transcriptional gene silencing. The Dictyostelium genome encodes three HP1-like proteins (hcpA, hcpB, hcpC), of which only two, hcpA and hcpB, but not hcpC, were found to be expressed during vegetative growth and under developmental conditions. Therefore hcpC, albeit not an obvious pseudogene, was excluded from this study. Both HcpA and HcpB show the characteristic conserved domain structure of HP1 proteins, consisting of an N-terminal chromo domain and a C-terminal chromo shadow domain separated by a hinge. Both proteins show all biochemical activities characteristic of HP1 proteins, such as homo- and heterodimerisation in vitro and in vivo, and DNA binding activity. HcpA furthermore appears to bind K9-methylated histone H3 in vitro. The proteins thus appear to be structurally and functionally conserved in Dictyostelium. The proteins display a largely identical subnuclear distribution in several minor foci and a concentration in one major cluster at the nuclear periphery. The localisation of this cluster adjacent to the nucleus-associated centrosome and its mitotic behaviour strongly suggest that it represents centromeric heterochromatin. Furthermore, it is characterised by histone H3 lysine-9 dimethylation (H3K9me2), which is another hallmark of Dictyostelium heterochromatin. Therefore, one important aspect of the work was to characterise the so-far largely unknown structural organisation of centromeric heterochromatin. The Dictyostelium homologue of the inner centromere protein INCENP (DdINCENP) co-localised with both HcpA and H3K9me2 during metaphase, providing further evidence that H3K9me2 and HcpA/B localisation represent centromeric heterochromatin. Chromatin immunoprecipitation (ChIP) showed that two types of high-copy-number retrotransposons (DIRS-1 and skipper), which form large irregular arrays at the chromosome ends that are thought to contain the Dictyostelium centromeres, are characterised by H3K9me2. Neither overexpression of full-length HcpA or HcpB nor deletion of single Hcp isoforms resulted in changes in retrotransposon transcript levels. However, overexpression of a C-terminally truncated HcpA protein, assumed to exert a dominant-negative effect, led to an increase in skipper retrotransposon transcript levels. Furthermore, overexpression of this protein led to severe growth defects in axenic suspension culture and reduced cell viability. In order to elucidate the proteins' functions in centromeric heterochromatin formation, gene knock-outs for both hcpA and hcpB were generated. Both genes could be successfully targeted and disrupted by homologous recombination. The degree of functional redundancy of the two isoforms was, although not unexpected, very high. Neither single knock-out mutant showed any obvious phenotype under standard laboratory conditions, and only deletion of hcpA resulted in subtle growth phenotypes when grown at low temperature. All attempts to generate a double null mutant failed. However, both endogenous genes could be disrupted in cells into which a rescue construct ectopically expressing one of the isoforms with either an N-terminal 6xHis- or GFP-tag had been introduced.
The data imply that the presence of at least one Hcp isoform is essential in Dictyostelium. The lethality of the hcpA/hcpB double mutant thus greatly hampered functional analysis of the two genes. However, the experiment provided genetic evidence that the GFP-HcpA fusion protein is functional, because of its ability to compensate for the loss of the endogenous HcpA protein. The proteins displayed quantitative differences in dimerisation behaviour, which are conferred by the slightly different hinge and chromo shadow domains at the C-termini. Dimerisation preferences in increasing order were HcpA-HcpA << HcpA-HcpB << HcpB-HcpB. Overexpression of GFP-HcpA or of a chimeric protein containing the HcpA C-terminus (GFP-HcpBNAC), but not overexpression of GFP-HcpB or GFP-HcpANBC, led to increased frequencies of anaphase bridges in late mitotic cells, which are thought to be caused by telomere-telomere fusions. Chromatin targeting of the two proteins is achieved by at least two distinct mechanisms. The N-terminal chromo domain and hinge of the proteins are required for targeting to centromeric heterochromatin, while the C-terminal portion encoding the CSD is required for targeting to several other chromatin regions at the nuclear periphery that are characterised by H3K9me2. Targeting to centromeric heterochromatin likely involves direct binding to DNA. The Dictyostelium genome encodes all subunits of the origin recognition complex (ORC), which is a possible upstream component of HP1 targeting to chromatin. Overexpression of GFP-tagged OrcB, the Dictyostelium Orc2 homologue, showed a distinct nuclear localisation that partially overlapped with the HcpA distribution. Furthermore, GFP-OrcB localised to the centrosome during the entire cell cycle, indicating an involvement in centrosome function. DnmA is the sole DNA methyltransferase in Dictyostelium and is required for all DNA (cytosine) methylation. To test its in vivo activity, two different cell lines were established that ectopically expressed DnmA-myc or DnmA-GFP. It was assumed that overexpression of these proteins might increase the 5-methylcytosine (5-mC) levels in the genomic DNA due to genomic hypermethylation. Although DnmA-GFP showed preferential localisation in the nucleus, no changes in the 5-mC levels of the genomic DNA could be detected by capillary electrophoresis.

Relevance:

10.00%

Publisher:

Abstract:

This thesis compiles the theoretical relationships required for the analysis of rotating elastic structures such as turbine and compressor blades, impellers, discs, etc., and presents the development of a corresponding FEM program system. The system computes the eigenfrequencies and mode shapes of individual structures and of rotationally periodic structures as functions of the rotational speed, taking all essential effects into account. Furthermore, for an arbitrary speed-time history approximated by a polygon, the forced vibrations and the resulting stress histories over time can be calculated. Building on this, the service life of the structure can be estimated and parameter studies can be carried out.
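As a toy illustration of the underlying eigenvalue problem, the sketch below solves the speed-dependent generalized eigenproblem K_eff(Omega) x = w^2 M x for a two-degree-of-freedom system. The matrices and the centrifugal-stiffening term are invented for the example and are not taken from the FEM system described in the thesis.

import numpy as np
from scipy.linalg import eigh

# Toy 2-DOF illustration of the speed-dependent eigenproblem
# K_eff(Omega) x = w^2 M x.  Matrices and the stiffening term are invented
# for the example, not taken from the FEM system described in the abstract.
M = np.diag([1.0, 1.0])                            # mass matrix
K = np.array([[200.0, -50.0], [-50.0, 120.0]])     # elastic stiffness matrix
K_geo = np.diag([4.0, 4.0])                        # assumed geometric stiffening per (rad/s)^2

for omega_rot in (0.0, 50.0, 100.0):               # rotational speed in rad/s
    K_eff = K + 1e-3 * omega_rot ** 2 * K_geo      # centrifugal stiffening (illustrative scaling)
    eigvals, modes = eigh(K_eff, M)                # symmetric generalized eigenproblem
    natural_freqs_hz = np.sqrt(eigvals) / (2.0 * np.pi)
    print(f"Omega = {omega_rot:6.1f} rad/s -> eigenfrequencies {np.round(natural_freqs_hz, 3)} Hz")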

Relevance:

10.00%

Publisher:

Abstract:

The process of developing software that takes advantage of multiple processors is commonly referred to as parallel programming. For various reasons, this process is much harder than the sequential case. For decades, parallel programming was a problem for a small niche only: engineers parallelizing mostly numerical applications in High Performance Computing. This has changed with the advent of multi-core processors in mainstream computer architectures. Nowadays, parallel programming is a problem for a much larger group of developers. The main objective of this thesis was to find ways to make parallel programming easier for them. Different aims were identified in order to reach the objective: research the state of the art of parallel programming today, improve the education of software developers about the topic, and provide programmers with powerful abstractions to make their work easier. To reach these aims, several key steps were taken. To start with, a survey was conducted among parallel programmers to find out about the state of the art. More than 250 people participated, yielding results about the parallel programming systems and languages in use, as well as about common problems with these systems. Furthermore, a study was conducted in university classes on parallel programming. It resulted in a list of frequently made mistakes that were analyzed and used to create a programmers' checklist to help avoid them in the future. For programmers' education, an online resource was set up to collect experiences and knowledge in the field of parallel programming, called the Parawiki. Another key step in this direction was the creation of the Thinking Parallel weblog, where more than 50,000 readers to date have read essays on the topic. For the third aim (powerful abstractions), it was decided to concentrate on one parallel programming system: OpenMP. Its ease of use and high level of abstraction were the most important reasons for this decision. Two different research directions were pursued. The first one resulted in a parallel library called AthenaMP. It contains so-called generic components, derived from design patterns for parallel programming. These include functionality to enhance the locks provided by OpenMP, to perform operations on large amounts of data (data-parallel programming), and to enable the implementation of irregular algorithms using task pools. AthenaMP itself serves a triple role: the components are well-documented and can be used directly in programs, it enables developers to study the source code and learn from it, and it is possible for compiler writers to use it as a testing ground for their OpenMP compilers. The second research direction was targeted at changing the OpenMP specification to make the system more powerful. The main contributions here were a proposal to enable thread cancellation and a proposal to avoid busy waiting. Both were implemented in a research compiler, shown to be useful in example applications, and proposed to the OpenMP Language Committee.
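Since OpenMP itself is a C/C++/Fortran API and the other sketches in this section use Python, the following snippet only mimics the task-pool pattern that the AthenaMP component described above provides for irregular algorithms: work items of unknown cost are scheduled dynamically from a shared pool. It is an analogy built on Python's concurrent.futures, not AthenaMP or OpenMP code.

from concurrent.futures import ThreadPoolExecutor
import os

def process(item):
    # Placeholder for irregular per-item work whose cost is not known in advance.
    return item * item

work_items = list(range(32))

# Dynamic scheduling from a shared pool of tasks: the same pattern the
# AthenaMP task-pool component provides on top of OpenMP, here only mimicked
# with Python threads.
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    results = list(pool.map(process, work_items))

print(sum(results))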

Relevance:

10.00%

Publisher:

Abstract:

The thesis first reviews the essential facts about skew polynomials, with the focus on shift and q-shift operators in characteristic zero. All concepts and algorithms needed for arithmetic with these objects are collected in the first chapter. Some of the data needed to determine solutions can be read off the Newton polygon, a geometric figure associated with the operators. The derivation of these relationships is the subject of the second chapter of the thesis, and in this form it is new in particular for the q-shift case. The third chapter deals with the computation of polynomial and rational solutions of these operators, essentially following the presentation of Mark van Hoeij. The most interesting case for the factorisation of (q-)shift operators are the so-called (q-)hypergeometric solutions, which correspond directly to right factors of first order. In the fourth chapter the van Hoeij algorithm is carried over from the shift case to the q-shift case. In addition, a substantial improvement of the q-Petkovsek algorithm is derived with the help of the Newton polygon data. The fifth chapter is devoted to the computation of general factors; for this purpose the adjoint operator is first introduced, which allows the computation of left factors. An algorithm for computing right factors of arbitrary order is then presented, although for practical use it is impractical at higher orders. In almost all of the algorithms presented, solving linear systems over rational function fields arises as an intermediate step, and this is not handled satisfactorily in most computer algebra systems. For this reason, the last chapter presents an algorithm for this problem based on evaluation and interpolation, which clearly outperforms the standard algorithms in all systems tested. All algorithms in the thesis are implemented in a MuPAD package which accompanies the thesis and allows convenient handling of the objects involved. With this package, many problems can now be solved in MuPAD for which no functions existed before.
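As a small illustration of the Newton polygon mentioned above, the sketch below computes the lower convex hull of the points (i, v_i) attached to an operator sum_i a_i(x) S^i, e.g. with v_i the degree of a_i. The choice of v_i (degree versus valuation) and of lower versus upper hull depends on the convention used in the thesis; the code is a generic hull computation, not part of the MuPAD package.

def newton_polygon(points):
    # Lower convex hull of the points (i, v_i) attached to an operator
    # sum_i a_i(x) * S^i, e.g. v_i = deg(a_i).  The slopes of the hull edges
    # restrict the candidate solutions; the exact choice of v_i (degree vs.
    # valuation) depends on the convention used in the thesis.
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    hull = []
    for p in sorted(points):
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) <= 0:
            hull.pop()
        hull.append(p)
    return hull

# Example: an operator whose coefficients a_0..a_3 have degrees 3, 1, 0, 2.
print(newton_polygon([(0, 3), (1, 1), (2, 0), (3, 2)]))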

Relevance:

10.00%

Publisher:

Abstract:

Flying duty on short-haul routes in civil aviation is subject to work-specific stress factors that differ in essential respects from those on long-haul routes. A high workload on short-haul routes involves many take-offs and landings per day. In addition to the number of flight legs, long flight duty periods and/or irregular working hours, as well as time pressure during short-haul operations, can become a burden for cockpit crew members and lead to fatigue. So far, aeromedical and aviation-psychological data had mainly been collected on long-haul routes with respect to the effects of jet lag, and hardly at all on short-haul routes. Therefore, within the DLR project "Investigations into cumulative psychological and physiological effects on flight crews on short-haul routes", a long-term study of the workload/strain, fatigue and recovery of cockpit crews on short-haul routes was carried out over 56 days per participant. In cooperation with Deutsche Lufthansa AG, the study of the effects of work-specific stress factors on the cockpit crew members of the Boeing 737 fleet ran from 2003 to 2006. AIM: Based on theoretically grounded concepts from work psychology, the aim of the study was to identify cumulative and acute effects on sleep-wake behaviour, workload/strain and fatigue that may arise from consecutive short-haul duties within a period of eight weeks. Data from 29 pilots (N=13 captains; N=16 first officers) were recorded. The mean age was 33.8 ± 7.9 years (captains: 42.0 ± 3.8 years; first officers: 27.4 ± 2.2 years). METHODS: A handheld PC was used to complete questionnaires efficiently and to keep the sleep log and the flight log. Subjective fatigue and workload were operationalised by standardised questionnaires (e.g. the fatigue scale of Samn & Perelli (1982), NASA-TLX). Sleep-wake behaviour and flight-specific data (e.g. start and end of duty, flight legs, destinations, etc.) were documented in the sleep log and flight log. The sleep-wake cycle was recorded by actigraphy throughout the entire measurement period. Objective performance was measured every morning and evening using a computer-based Psychomotor Vigilance Task (PVT) according to Dinges & Powell (1985). Performance in the PVT served as an indicator of a pilot's fatigue. Additional paper-and-pencil questionnaires were intended to provide information on relevant psychosocial conditions not covered by the daily surveys (e.g. job satisfaction, eating habits, relationships with colleagues). RESULTS: With regard to cumulative effects, no change in sleep quality or sleep need was found over the course of the study. Fatigue, however, increased during the eight-week investigation. Reaction time in the PVT deteriorated over time on flight duty days. Overall, no critical longer-term effects were found. Acute significant effects were found for fatigue, total workload and performance on flight duty days. Fatigue and total workload increased with increasing flight duty time and number of legs, and performance in the PVT decreased.
The "time on task" effect on fatigue from flying duties became particularly apparent for flight duty periods of more than 10 hours and more than 4 legs per day. CONCLUSION: These results provide a scientific data basis from which recommendations can be derived on how rostering for short-haul cockpit crews can be optimised from aeromedical and aviation-psychological points of view. In addition, they can make a substantive contribution to the discussion of flight duty and rest time regulations at the European level.

Relevance:

10.00%

Publisher:

Abstract:

Pastoralism and ranching are two different rangeland-based livestock systems in dryland areas of East Africa. Both usually operate under low and irregular rainfall and consequently low overall primary biomass production of high spatial and temporal heterogeneity. Both are usually located far from town centres, market outlets and communication, medical, educational, banking, insurance and other infrastructure. Whereas pastoralists can be regarded as self-employed, gaining their livelihood from managing their individually owned livestock on communal land, ranches mostly employ herders as wage labourers to manage the livestock owned by the ranch on the ranch's own land. Both production systems can be similarly labour-intensive and, with regard to livestock management, require the same type of work, whether carried out as a self-employed pastoralist or as an employed herder on a work contract. Given this similarity, the aim of this study was to comparatively assess how pastoralists and employed herders in northern Kenya view their working conditions, and which criteria they use to assess hardship and rewards in their daily work and their working life. Their own perception is compared with the concept of Decent Work developed by the International Labour Organisation (ILO). Samburu pastoralists in Marsabit and Samburu Districts as well as herders on ranches in Laikipia District were interviewed. A qualitative analysis of 47 semi-structured interviews yielded information about daily activities, income, free time, education and social security. Five out of 22 open interviews with pastoralists and seven out of 13 open interviews with employed herders were fully transcribed and subjected to qualitative content analysis to yield the life stories of 12 informants. Pastoralists consider it important to have healthy and satisfied animals. The ability to provide food for their family, especially for the children, has a high priority. Hardships for the pastoralists are activities that are exhausting, challenging and dangerous. For employed herders, decent conditions mean wages high enough to provide food for their family and formal education for their children. Furthermore, it is most important for them to do work they are experienced and skilled in. Most employed herders were former pastoralists who had lost their animals due to drought or raids. There are parallels between the ILO Decent Work concept and the perception of working conditions of pastoralists and employed herders. These are, for example, that remuneration is important and that appreciation by either the employer or the community is desired. Some aspects that are seen as important by the ILO, such as safety at work and healthy working conditions, play only a secondary role for the pastoralists, who see risky and dangerous tasks as inherent characteristics of their efforts to gain a livelihood in their living environment.

Relevance:

10.00%

Publisher:

Abstract:

The R package “compositions” is a tool for advanced compositional analysis. Its basic functionality has seen some conceptual improvement, now containing facilities to work with and represent ilr bases built from balances, and an elaborated subsystem for dealing with several kinds of irregular data: (rounded or structural) zeroes, incomplete observations and outliers. The general approach to these irregularities is based on subcompositions: for an irregular datum, one can distinguish a “regular” subcomposition (where all parts are actually observed and the datum behaves typically) and a “problematic” subcomposition (with those unobserved, zero or rounded parts, or else where the datum shows an erratic or atypical behaviour). Systematic classification schemes are proposed for both outliers and missing values (including zeros), focusing on the nature of the irregularities in the datum subcomposition(s). To compute statistics with values missing at random and structural zeros, a projection approach is implemented: a given datum contributes to the estimation of the desired parameters only on the subcomposition where it was observed. For data sets with values below the detection limit, two different approaches are provided: the well-known imputation technique, and also the projection approach. To compute statistics in the presence of outliers, robust statistics are adapted to the characteristics of compositional data, based on the minimum covariance determinant approach. The outlier classification is based on four different models of outlier occurrence and Monte-Carlo-based tests for their characterisation. Furthermore, the package provides special plots that help to understand the nature of outliers in the dataset. Keywords: coda-dendrogram, lost values, MAR, missing data, MCD estimator, robustness, rounded zeros
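To make the ilr idea concrete, here is a small Python sketch of the standard pivot-coordinate ilr transform for a strictly positive composition. The R package additionally supports bases built from user-defined balances and the projection/imputation handling of zeros and missing parts described above; none of that is reproduced here.

import numpy as np

def ilr_pivot(x):
    # Pivot-coordinate ilr transform of a strictly positive composition.
    # The R package 'compositions' also supports ilr bases built from
    # user-defined balances; this sketch uses one fixed orthonormal basis.
    x = np.asarray(x, dtype=float)
    D = x.size
    z = np.empty(D - 1)
    for i in range(D - 1):
        gm_rest = np.exp(np.mean(np.log(x[i + 1:])))   # geometric mean of the remaining parts
        z[i] = np.sqrt((D - i - 1) / (D - i)) * np.log(x[i] / gm_rest)
    return z

# Example composition; zeros or rounded values would first require the
# imputation or projection treatment described in the abstract.
print(ilr_pivot([0.6, 0.3, 0.1]))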

Relevance:

10.00%

Publisher:

Abstract:

To give an account of the Sociedad de Amigos del País of Málaga and the activities it carried out during the years 1906-1926, and their repercussions on social and cultural life. To build a historical-pedagogical frame of reference and to find out whether this society made education the instrument of reform for remedying the country's ills. The Spanish society of the eighteenth century, which gave rise to the Sociedades Económicas de Amigos del País, is addressed and analysed. A historical-pedagogical framework of Málaga society of that period and the need to create an economic society are set out. Sources: archival documents, legajos, bulletins, minutes and other written documents. Method: deduction and analysis of the structures and organisation of the subject under study. The study of the Sociedades Económicas de Amigos del País has established a series of facts: they were born as a mouthpiece for the ideas of the government; their purpose was the development of agriculture, trade and industry, as well as the promotion of Enlightenment ideas; their trajectory was irregular, as they disappeared several times only to reappear later; and they were financed through members' dues. In Málaga the society arose as a government initiative with the characteristics already described. Free classes were given and a schedule of subjects, teaching staff, etc. was established. Cultural activities were carried out for the benefit of the youth of Málaga, and the figure of D. Pedro Gómez Chaix stood out as the driving force behind the construction of the workers' district América, as well as of the Ateneo Comercial, the Biblioteca Popular, etc.

Relevance:

10.00%

Publisher:

Abstract:

The study covers the historical-political and socio-cultural context in which the centres for training Physical Education teachers were created and developed; a description of the institutions regarded as their antecedents, and a study of those devoted to this training since 1805; an analysis of the incorporation of the curricula into schools; an analysis and assessment of the legislation concerning the training of Physical Education teachers; and a consideration of future prospects together with the proposal of a model. The historical research used the analytical and dialectical methods; descriptive research and document-analysis methods were also employed, drawing on primary sources, secondary sources and archives, with content-analysis techniques based on non-grammatical units and on whole documents. Teacher training was first addressed in Spain in the first third of the nineteenth century. From the beginning, teacher training has been organised as follows: primary school teachers in the Escuelas Normales, and secondary and higher education teachers in higher faculties. The training of Physical Education teachers was institutionalised in 1883 with the opening of the 'Escuela Central de Gimnástica'. Physical Education appears irregularly and intermittently in the curricula of the nineteenth century, and always in secondary education; it was not until well into the twentieth century that it became established at the secondary level, while in the university it was given priority. The academic, professional and employment situation of Physical Education teachers began to be resolved with the 1985 call for competitive examinations (oposiciones) for the corps of secondary school 'profesores agregados' and tenured teachers of vocational training. The trend should be towards these teachers completing higher education studies, and Physical Education teachers should receive thorough psycho-pedagogical training.

Relevance:

10.00%

Publisher:

Abstract:

In order to develop applications for visual interpretation of medical images, the early detection and evaluation of microcalcifications in digital mammograms is very important, since their presence is often associated with a high incidence of breast cancers. Accurate classification into benign and malignant groups would help improve diagnostic sensitivity as well as reduce the number of unnecessary biopsies. The challenge here is the selection of useful features to distinguish benign from malignant microcalcifications. Our purpose in this work is to analyse a microcalcification evaluation method based on a set of shape-based features extracted from the digitised mammography. The segmentation of the microcalcifications is performed using a fixed-tolerance region growing method to extract the boundaries of calcifications from manually selected seed pixels. Taking into account that the shapes and sizes of clustered microcalcifications have been associated with a high risk of carcinoma based on different subjective measures, such as whether or not the calcifications are irregular, linear, vermiform, branched, rounded or ring-like, our efforts were directed at obtaining a feature set related to shape. The identification of the parameters concerning the malignant character of the microcalcifications was performed on a set of 146 mammograms whose real diagnosis was known in advance from biopsies. This allowed identifying the following shape-based parameters as the relevant ones: number of clusters, number of holes, area, Feret elongation, roughness, and elongation. Further experiments on a set of 70 new mammograms showed that the performance of the classification scheme is close to the mean performance of three expert radiologists, which allows the proposed method to be considered for assisting diagnosis and encourages continuing the investigation by adding new features not only related to shape.
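The segmentation step named in the abstract, fixed-tolerance region growing from a manually selected seed, can be sketched as follows in Python. The neighbourhood, tolerance value and feature computation are illustrative assumptions rather than the paper's exact settings.

import numpy as np
from collections import deque

def region_grow(img, seed, tol=15.0):
    # Fixed-tolerance region growing from a manually selected seed pixel:
    # a pixel joins the region if its intensity differs from the seed value
    # by at most `tol`.  Generic sketch, not the paper's exact algorithm.
    h, w = img.shape
    seed_val = float(img[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # 4-connected neighbours
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc] \
                    and abs(float(img[rr, cc]) - seed_val) <= tol:
                mask[rr, cc] = True
                queue.append((rr, cc))
    return mask

# Shape features such as area (mask.sum()) or elongation are then computed
# from the segmented region and its boundary.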

Relevance:

10.00%

Publisher:

Abstract:

Author's abstract. Abstracts in Spanish and English.

Relevance:

10.00%

Publisher:

Abstract:

Since 2004 the Organization of American States, through its Mission to Support the Peace Process in Colombia (MAPP/OEA), has provided technical cooperation to follow up, accompany and monitor the Disarmament, Demobilisation and Reintegration (DDR) process of members of the country's illegal armed groups, in particular the Autodefensas Unidas de Colombia (paramilitaries). Its work has made the process with the irregular group publicly known and is also serving as a learning tool for other current and future DDR cases.