903 results for Data-driven Methods
Abstract:
Multisensor image fusion is a widely used procedure owing to the complementary nature of the various data sets. This article compares the performance of four different methods for fusing Landsat-7 ETM+ and RADARSAT-1 W1 images. The comparison was based on the spectral characteristics of the images, using statistical and visual analysis of the generated products. Four methods were used to fuse the Landsat-7 ETM+ and RADARSAT-1 W1 images: i) fusion of the SAR (synthetic aperture radar) image with the band triplet selected by the OIF (Optimum Index Factor); ii) decorrelation stretch of the OIF-selected triplet, followed by fusion with the SAR image; iii) PCA (Principal Component Analysis) of the six ETM+ bands of the reflected spectrum (1, 2, 3, 4, 5 and 7) and subsequent fusion of the first three principal components (PC1, PC2, PC3) with the SAR image; iv) SPC-SAR (Selective Principal Component - SAR). The SPC-SAR product showed the best performance in identifying coastal features and provided the most effective enhancement of the different environments.
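As a sketch of the triplet-selection step described above, the OIF of a band combination is the sum of the band standard deviations divided by the sum of the absolute pairwise correlation coefficients; the highest-scoring triplet carries the most information with the least redundancy. The function names below are illustrative, not from the article:

```python
import numpy as np
from itertools import combinations

def oif(bands):
    """Optimum Index Factor of a 3-band combination: sum of band
    standard deviations over sum of absolute pairwise correlations."""
    stds = [b.std() for b in bands]
    corrs = [abs(np.corrcoef(a.ravel(), b.ravel())[0, 1])
             for a, b in combinations(bands, 2)]
    return sum(stds) / sum(corrs)

def best_triplet(all_bands):
    """Indices of the 3-band subset with the highest OIF."""
    return max(combinations(range(len(all_bands)), 3),
               key=lambda idx: oif([all_bands[i] for i in idx]))
```

The selected triplet would then be composed with the SAR band by whichever fusion scheme (IHS, PCA substitution, etc.) the study applies.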
Abstract:
Graduate Program in Civil Engineering - FEIS
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
This research analyzes excerpts from Petrobras' commemorative 60th-anniversary campaign, entitled Gente, é o que inspira a gente., launched in 2013, using concepts of institutional advertising from the perspective of semiotics. The company was chosen for its tradition and strength inside and outside the country, being one of the 20 largest in the world. The study covers topics such as organizational communication and institutional advertising, and analyzes three pieces of the campaign, which aims to influence public opinion about the organization by associating with it concepts and images that benefit the way Petrobras is seen. To assess the effectiveness and efficiency of the campaign, the semiotic studies of Charles Peirce, with his concepts of firstness, secondness and thirdness, were chosen as the methodological approach. The campaign's intention to approach its audience is clear not only in the participation of employees and in the signs that compose it, but also in the dissemination of information about the company, such as its history, data, policies, methods and services, effectively transmitting its essence as a company through its mission, vision, and the values made explicit and implicit in its messages.
Abstract:
A measurement of differential cross sections for the production of a pair of isolated photons in proton-proton collisions at root s = 7 TeV is presented. The data sample corresponds to an integrated luminosity of 5.0 fb(-1) collected with the CMS detector. A data-driven isolation template method is used to extract the prompt diphoton yield. The measured cross section for two isolated photons, with transverse energy above 40 and 25 GeV respectively, in the pseudorapidity range vertical bar eta vertical bar < 2.5, vertical bar eta vertical bar (sic) [1.44, 1.57] and with an angular separation Delta R > 0.45, is 17.2 +/-0.2 (stat) +/-1.9 (syst) +/- 0.4 (lumi) pb. Differential cross sections are measured as a function of the diphoton invariant mass, the diphoton transverse momentum, the azimuthal angle difference between the two photons, and the cosine of the polar angle in the Collins-Soper reference frame of the diphoton system. The results are compared to theoretical predictions at leading, next-to-leading, and next-to-next-to-leading order in quantum chromodynamics.
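The fiducial cuts quoted in the abstract are straightforward to express in code. The sketch below (illustrative helper names, not CMS software) computes the angular separation ΔR = √(Δη² + Δφ²), with Δφ wrapped into [-π, π], and the pseudorapidity acceptance excluding the barrel-endcap transition region:

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Angular separation Delta R = sqrt(d_eta^2 + d_phi^2),
    with d_phi wrapped into [-pi, pi]."""
    d_eta = eta1 - eta2
    d_phi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(d_eta, d_phi)

def in_acceptance(eta):
    """|eta| < 2.5, excluding the transition band 1.44 <= |eta| <= 1.57."""
    a = abs(eta)
    return a < 2.5 and not (1.44 <= a <= 1.57)
```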
Abstract:
Disease transmission between wildlife and livestock is a worldwide issue. Society needs better methods to prevent interspecies transmission to reduce disease risks. Producers have successfully used livestock protection dogs (LPDs) for thousands of years to reduce predation. We theorized that LPDs raised and bonded with cattle could also be used to reduce the risk of bovine tuberculosis (Mycobacterium bovis; TB) transmission between white-tailed deer (Odocoileus virginianus) and cattle by minimizing contact between the 2 species and use of cattle feed by deer. We evaluated 4 LPDs over 5 months, utilizing 2 data collection methods (direct observation and motion-activated video) on deer farms that supported higher densities than wild populations. Dogs were highly effective in preventing deer from using concentrated cattle feed (hay bales), likely the greatest risk factor of TB transmission on farms. Dogs also prevented deer from approaching cattle in core areas of pastures (near hay bales) and were very effective throughout pastures. Our research supports the theory that LPDs, specifically trained to remain with cattle, may be a practical tool to minimize the potential for livestock to contract TB from infected deer in small-scale cattle operations. Where disease is present in deer, it may be possible to reduce the potential for disease transmission by employing LPDs.
Abstract:
The main aim of this thesis is strongly interdisciplinary: it involves and presumes knowledge of Neurophysiology, to understand the mechanisms underlying the studied phenomena; knowledge and experience in Electronics, necessary during the hardware experimental set-up to acquire neuronal data; and knowledge of Informatics and programming, to write the code that controls the behaviour of the subjects during experiments and the visual presentation of stimuli. Finally, neuronal and statistical models must be well understood to help interpret the data. The project started with a thorough bibliographic survey: the mechanisms of heading perception (the perceived direction of self-motion) are still poorly understood. The main interest is to understand how visual information about our own motion is integrated with eye position information. To investigate the cortical response to visual motion stimuli and its integration with eye position, we decided to study an animal model, using optic flow expansions and contractions as visual stimuli. The first chapter of the thesis presents the basic aims of the research project, together with the reasons why it is interesting and important to study motion perception. Moreover, this chapter describes the methods my research group considered most adequate to contribute to the scientific community and underlines my personal contribution to the project. The second chapter gives an overview of the background needed to follow the main part of the thesis: it starts with a brief introduction to the central nervous system and cortical functions, then presents in more depth the association areas, which are the main target of our study. Furthermore, it explains why studies on animal models are necessary to understand mechanisms at the cellular level that could not be addressed in any other way.
In the second part of the chapter, the basics of electrophysiology and cellular communication are presented, together with traditional neuronal data analysis methods. The third chapter is intended as a helpful resource for future work in the laboratory: it presents the hardware used for experimental sessions, how animal behaviour is controlled during the experiments by means of C routines and dedicated software, and how visual stimuli are presented on a screen. The fourth chapter is the core of the research project and of the thesis. The methods section presents the experimental paradigms, the visual stimuli and the data analysis. The results show the responses of cells in area PEc to visual motion stimuli combined with different eye positions. In brief, this study led to the identification of different cellular behaviours in relation to the focus of expansion (the direction of motion given by the optic flow pattern) and eye position. The originality and importance of the results are pointed out in the conclusions: this is the first study aimed at investigating motion perception in this particular cortical area. The last paragraph presents a neuronal network model whose aim is to simulate the pre-saccadic and post-saccadic responses of neurons in area PEc during eye movement tasks. The data presented in chapter four are further analysed in chapter five. This analysis started from the observation of the neuronal responses during a 1 s period in which the visual stimulation was constant. Cell activity clearly showed oscillations in time that had been neglected by the previous analysis based on mean firing frequency. The results distinguished two cellular behaviours by their response characteristics: some neurons showed oscillations that changed depending on eye and optic flow position, while others kept the same oscillation characteristics independent of the stimulus.
The last chapter discusses the results of the research project, comments on the originality and interdisciplinarity of the study, and proposes some future developments.
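The contrast drawn above between a mean-firing-frequency analysis and a time-resolved one can be sketched simply: a single rate over the 1 s window hides any oscillation, whereas a binned rate exposes it. This is a generic illustration, not the thesis' actual analysis code:

```python
import numpy as np

def mean_firing_rate(spike_times, t_start, t_stop):
    """Mean firing rate (spikes/s) over [t_start, t_stop)."""
    spikes = np.asarray(spike_times)
    n = np.count_nonzero((spikes >= t_start) & (spikes < t_stop))
    return n / (t_stop - t_start)

def binned_rate(spike_times, t_start, t_stop, bin_width=0.05):
    """Time-resolved rate: spike counts in fixed bins divided by the
    bin width, revealing oscillations that a single mean hides."""
    edges = np.arange(t_start, t_stop + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts / bin_width
```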
Abstract:
In this study, conditions of deposition and stratigraphical architecture of Neogene (Tortonian, 11 to 6.7 Ma) sediments of southern central Crete were analysed. In order to improve the resolution of paleoclimatic data, new methods were applied to quantify environmental parameters and to increase the chronostratigraphic resolution in shallow-water sediments. A relationship between paleoenvironmental change observed on Crete and global processes was established, and a depositional model was developed. Based on a detailed analysis of the distribution of non-geniculate coralline red algae, index values for water temperature and water depth were established and tested against the distribution patterns of benthic foraminifera and symbiont-bearing corals. Calcite-shelled bivalves were sampled from the Algarve coast (southern Portugal) and central Crete, and their 87Sr/86Sr ratios were measured. A high-resolution chronostratigraphy was developed based on the correlation between fluctuations in Sr ratios in the measured sections and in a late Miocene global seawater Sr isotope reference curve. Applying this method, a time frame was established to compare paleoenvironmental data from southern central Crete with global information on climate change reflected in oxygen isotope data. The comparison between paleotemperature data based on red algae and global oxygen isotope data showed that the employed index values reflect global change in temperature. Data indicate a warm interval during the earliest Tortonian, a second short warm interval between 10 and 9.5 Ma, a longer climatic optimum between 9 and 8 Ma, and an interval of increasing temperatures in the latest Tortonian. The distribution of coral reefs and carpets shows that during the warm intervals the depositional environment became tropical, while temperate climates prevailed during the cold interval.
Since relative tectonic movements after initial half-graben formation in the early Tortonian were low in southern central Crete, sedimentary successions strongly respond to global sea-level fluctuation. A characteristic sedimentary succession formed during a 3rd-order sea-level cycle: it comprises mixed siliciclastic limestone deposited during sea-level fall and lowstand, homogeneous red algal deposits formed during sea-level rise, and coral carpets formed during late rise and highstand. Individual beds in the succession reflect glacioeustatic fluctuations that are most prominent in the mixed siliciclastic limestone interval. These results confirm that sedimentary successions deposited at the critical threshold between temperate and tropical environments develop characteristic changes in depositional systems and biotic associations that can be used to assemble paleoclimatic datasets.
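The Sr-isotope dating step described above reduces to interpolating a measured 87Sr/86Sr ratio against a monotonic reference curve. The reference points below are hypothetical placeholders for illustration; the actual study would use a published late Miocene seawater curve:

```python
import numpy as np

# Hypothetical late-Miocene reference points (age in Ma, 87Sr/86Sr),
# invented for illustration; seawater 87Sr/86Sr rises toward the present,
# so the ratio maps monotonically onto age.
ref_age = np.array([11.0, 10.0, 9.0, 8.0, 7.0])     # Ma, old -> young
ref_sr = np.array([0.708880, 0.708920, 0.708960, 0.708990, 0.709020])

def sr_age(measured_sr):
    """Estimate numerical age (Ma) by linear interpolation of a
    measured 87Sr/86Sr ratio against the reference curve."""
    # np.interp needs ascending x, so interpolate ratio -> age.
    return float(np.interp(measured_sr, ref_sr, ref_age))
```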
Abstract:
Over the last three decades, remote sensing and GIS have become increasingly important in the geosciences, improving on conventional methods of data collection and map production. The present work deals with the application of remote sensing and geographic information systems (GIS) to geomorphological investigations. Above all, combining the two techniques has made it possible to record geomorphological forms both synoptically and in detail. Topographic and geological maps, satellite images and climate data serve as the basis of this work. The thesis consists of 6 chapters. The first chapter gives a general overview of the study area, describing its morphological units, its climatic conditions (in particular the aridity indices of the coastal and mountain landscapes) and its settlement pattern. Chapter 2 deals with the regional geology and stratigraphy of the study area. An attempt is made to identify the main formations using ETM satellite imagery, applying the following methods: colour band composites, image ratioing and supervised classification. Chapter 3 describes the structurally controlled surface forms in order to clarify the interaction between tectonics and geomorphological processes. It covers the various methods, for example image processing, used to reliably interpret the lineaments present in the mountain body; special filtering methods are applied to map the most important lineaments. Chapter 4 attempts an automated extraction of the drainage network with the help of processed SRTM data. It is discussed in detail to what extent the quality of small-scale SRTM data is comparable to large-scale topographic maps in these processing steps.
Furthermore, hydrological parameters are derived through a qualitative and quantitative analysis of the discharge regime of individual wadis. The origin of the drainage systems is interpreted on the basis of geomorphological and geological evidence. Chapter 5 deals with the assessment of the hazard posed by episodic wadi floods. The probability of their annual occurrence, and of strong floods recurring at intervals of several years, is traced back historically to 1921. The role of rain-bearing low-pressure systems that develop over the Red Sea and can generate runoff is examined using the IDW (Inverse Distance Weighted) method. Further rain-bearing weather situations are analysed with the help of Meteosat infrared images. Particular attention is paid to the period 1990-1997, during which heavy rainfall events triggered wadi floods. Flood events and flood levels are determined from hydrographic data (gauge measurements). Land use and settlement structure in a wadi's catchment area are also taken into account. Chapter 6 deals with the different coastal forms on the western side of the Red Sea, for example erosional forms, constructional forms and submerged forms. The final part addresses the stratigraphy and chronology of submarine terraces on coral reefs and compares them with similar terraces on the Egyptian Red Sea coast west and east of the Sinai Peninsula.
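The IDW method named above is a standard spatial interpolation: the value at a target point is a weighted mean of station values, with weights proportional to 1/distance^p. A minimal numpy sketch (illustrative names, not the thesis code):

```python
import numpy as np

def idw(stations, values, targets, power=2.0):
    """Inverse Distance Weighted interpolation of station values
    (e.g. rainfall) onto target points."""
    stations = np.asarray(stations, float)   # (n, 2) coordinates
    values = np.asarray(values, float)       # (n,) observations
    out = []
    for t in np.asarray(targets, float):
        d = np.linalg.norm(stations - t, axis=1)
        if np.any(d == 0):                   # exact hit on a station
            out.append(values[np.argmin(d)])
            continue
        w = 1.0 / d ** power
        out.append(float(np.sum(w * values) / np.sum(w)))
    return np.array(out)
```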
Abstract:
Procedures for quantitative walking analysis include the assessment of body segment movements within defined gait cycles. Recently, methods to track human body motion using inertial measurement units have been suggested. It is not known if these techniques can be readily transferred to clinical measurement situations. This work investigates the aspects necessary for one inertial measurement unit mounted on the lower back to track orientation, and determine spatio-temporal features of gait outside the confines of a conventional gait laboratory. Apparent limitations of different inertial sensors can be overcome by fusing data using methods such as a Kalman filter. The benefits of optimizing such a filter for the type of motion are unknown. 3D accelerations and 3D angular velocities were collected for 18 healthy subjects while treadmill walking. Optimization of Kalman filter parameters improved pitch and roll angle estimates when compared to angles derived using stereophotogrammetry. A Weighted Fourier Linear Combiner method for estimating 3D orientation angles by constructing an analytical representation of angular velocities and allowing drift free integration is also presented. When tested this method provided accurate estimates of 3D orientation when compared to stereophotogrammetry. Methods to determine spatio-temporal features from lower trunk accelerations generally require knowledge of sensor alignment. A method was developed to estimate the instants of initial and final ground contact from accelerations measured by a waist mounted inertial device without rigorous alignment. A continuous wavelet transform method was used to filter and differentiate the signal and derive estimates of initial and final contact times. The technique was tested with data recorded for both healthy and pathologic (hemiplegia and Parkinson’s disease) subjects and validated using an instrumented mat. 
The results show that a single inertial measurement unit can assist whole-body gait assessment; however, further investigation is required to understand altered gait timing in some pathological subjects.
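The contact-detection idea above (smooth and differentiate the trunk acceleration, then pick extrema as event candidates) can be sketched with plain numpy. This is a deliberately simplified stand-in for the continuous wavelet transform pipeline the thesis validates; the Gaussian-convolution smoothing and the event criterion are assumptions for illustration:

```python
import numpy as np

def gaussian_smooth(x, sigma_samples):
    """Gaussian smoothing by convolution (simplified stand-in for the
    Gaussian CWT filtering used in the validated method)."""
    n = int(4 * sigma_samples)
    t = np.arange(-n, n + 1)
    k = np.exp(-0.5 * (t / sigma_samples) ** 2)
    k /= k.sum()
    return np.convolve(x, k, mode="same")

def detect_events(acc, fs, sigma_s=0.05):
    """Candidate gait events: local minima of the smoothed and
    differentiated acceleration signal (illustrative criterion)."""
    d = np.gradient(gaussian_smooth(acc, sigma_s * fs))
    minima = (d[1:-1] < d[:-2]) & (d[1:-1] < d[2:])
    return np.flatnonzero(minima) + 1
```

Because only extrema of a filtered signal are used, the method tolerates imprecise sensor alignment, which is the practical advantage the abstract emphasizes.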
Abstract:
A key element in the long-term success of liver transplantation is compliance with the lifelong immunosuppressive therapy. In the present work, compliance was investigated for the first time using MEMS® in liver transplant patients whose transplantation lay several years in the past. Patients were recruited who had received a liver transplant 2, 5, 7 or 10 years earlier (groups 2 y.p.t., 5 y.p.t., 7 y.p.t., 10 y.p.t.) at the Mainz University Medical Center. 39 patients took part in the prospective observational study. Compliance was recorded by MEMS® over an observation period of 6 months. The MEMS® evaluation suggested that 10 patients had not used the device as intended; consequently, the MEMS®-measured compliance parameters could be determined validly for only 29 patients. Mean dosing compliance was 81 ± 21 %, with the 2 y.p.t. group recording better values (86 ± 14 %) than the 5 y.p.t. (75 ± 27 %) and 7 y.p.t. (74 ± 28 %) groups. The differences were, however, not significant (p = 0.335, Kruskal-Wallis test). Taking all MEMS®-measured compliance parameters into account, 19 of 29 patients (66 %) were classified as compliant. In the analysis of overall compliance based on the subjective compliance measures (Morisky questionnaire, MESI questionnaire, self-assessment), drug blood levels and the number of rejection episodes, in which all 39 patients could be included, 35 patients (90 %) were classified as compliant. The second part of the work examined the establishment and evaluation of a cross-sector pharmaceutical care concept for liver transplant patients. For the first time, community pharmacists were integrated into the pharmaceutical care of liver transplant patients on the basis of a newly developed, integrated care concept spanning the interface between hospital and community care.
20 patients were recruited and received pharmaceutical care during their inpatient stay after transplantation. The care comprised intensive patient education with three to four counselling sessions conducted by a hospital pharmacist. During the inpatient stay, drug-related problems were identified, resolved and documented. At discharge, the hospital pharmacist issued a medication plan for the general practitioner and the community pharmacist and held a detailed discharge interview with the patient. In addition, patients received medication schedules and patient information on their immunosuppressive drug. 15 patients could then continue to receive pharmaceutical care from community pharmacists in the outpatient setting. The cooperating pharmacy staff were trained and supported by a manual on the pharmaceutical care of liver transplant patients created specifically for the study. The community pharmacists were to accompany the patients in their drug therapy by conducting counselling sessions and identifying and resolving drug-related problems. The benefits of the intensive pharmaceutical care were demonstrated using various survey instruments. The result was a high level of satisfaction among patients and pharmacists with the care concept, determined by means of self-assessment questionnaires. Patient compliance was assessed using the Morisky and MESI questionnaires, patient self-assessment, blood level measurements and the community pharmacist's judgement; 86 % of the patients were classified as compliant. The patients' knowledge of their immunosuppressive therapy, surveyed in interviews, was at a very high level.
In conclusion, pharmaceutical care of liver transplant patients is feasible in community pharmacies. On the basis of the documentation protocols, however, it is very difficult to judge to what extent care actually took place: an actual lack of care could not be distinguished from incomplete documentation of the care provided. One limiting factor for intensified care is certainly the considerable effort required for a single patient with a rare condition. The identification and resolution of 48 drug-related problems (DRPs) by the hospital pharmacist and 32 DRPs by the community pharmacists, i.e. 4.5 DRPs per patient in total, shows that pharmaceutical care makes an important contribution to quality-assured drug therapy. Cross-sector pharmaceutical care provides substantial help and support to patients in the safe handling of their drug therapy.
Abstract:
Thermal effects are rapidly gaining importance in nanometer heterogeneous integrated systems. Increased power density, coupled with spatio-temporal variability of chip workload, causes lateral and vertical temperature non-uniformities (variations) in the chip structure. The assumption of a uniform temperature for a large circuit leads to inaccurate determination of key design parameters. To improve design quality, we need precise estimation of temperature at detailed spatial resolution, which is very computationally intensive. Consequently, thermal analysis of the designs needs to be done at multiple levels of granularity. To further investigate the flow of chip/package thermal analysis, we exploit the Intel Single Chip Cloud Computer (SCC) and propose a methodology for calibration of SCC on-die temperature sensors. We also develop an infrastructure for online monitoring of SCC temperature sensor readings and SCC power consumption. Having the thermal simulation tool in hand, we propose MiMAPT, an approach for analyzing delay, power and temperature in digital integrated circuits. MiMAPT integrates seamlessly into industrial front-end and back-end chip design flows. It accounts for temperature non-uniformities and self-heating while performing analysis. Furthermore, we extend the temperature-variation-aware analysis of designs to 3D MPSoCs with Wide-I/O DRAM. We improve the DRAM refresh power by considering the lateral and vertical temperature variations in the 3D structure and adapting the per-DRAM-bank refresh period accordingly. We develop an advanced virtual platform which models the performance, power, and thermal behavior of a 3D-integrated MPSoC with Wide-I/O DRAMs in detail. Moving towards real-world multi-core heterogeneous SoC designs, a reconfigurable heterogeneous platform (ZYNQ) is exploited to further study the performance and energy efficiency of various CPU-accelerator data sharing methods in heterogeneous hardware architectures.
A complete hardware accelerator featuring clusters of OpenRISC CPUs with dynamic address remapping capability is built and verified on real hardware.
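The per-bank refresh adaptation mentioned above can be sketched as a simple policy: DRAM retention degrades with temperature, so (loosely following JEDEC-style extended-temperature derating, where the nominal refresh period is halved above 85 °C) hot banks are refreshed more often while cool banks keep the longer period, saving refresh power. The threshold and periods below are illustrative assumptions, not the thesis' actual controller:

```python
def refresh_period_ms(bank_temp_c):
    """Illustrative refresh policy: the nominal 64 ms retention-driven
    period is halved in the extended temperature range above 85 C
    (JEDEC-style derating; values are assumptions for illustration)."""
    return 32.0 if bank_temp_c > 85.0 else 64.0

def per_bank_periods(bank_temps):
    """Adapt the refresh period bank-by-bank to the lateral and
    vertical temperature variation across a 3D DRAM stack."""
    return [refresh_period_ms(t) for t in bank_temps]
```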
Abstract:
Digital evidence requires the same precautions as any other scientific examination. This work provides an overview of the methodological and applied aspects of digital forensics in light of the recent ISO/IEC 27037:2012 standard on the handling of digital evidence in the phases of identification, collection, acquisition and preservation. These methodologies scrupulously comply with the integrity and authenticity requirements imposed by the laws governing digital forensics, in particular Law 48/2008 ratifying the Budapest Convention on Cybercrime. Regarding the crime of child pornography, a review of EU and national legislation is offered, with emphasis on the aspects relevant to forensic analysis. Since file sharing over peer-to-peer networks is the channel on which the exchange of illicit material is most concentrated, an overview of the most widespread protocols and systems is given, with emphasis on the eDonkey network and the eMule software, which are widely used among Italian users. The problems encountered in the investigation and repression of the phenomenon, which fall within the remit of the police forces, are touched upon before focusing on the main contribution: the forensic analysis of computer systems seized from suspects (or defendants) in child pornography cases. The design and implementation of eMuleForensic makes it possible to analyse, extremely precisely and quickly, the events that occur when the eMule file-sharing software is used; the software is available both online at http://www.emuleforensic.com and as a tool within the DEFT forensic distribution. Finally, an operational protocol is proposed for the forensic analysis of computer systems involved in child pornography investigations.
Abstract:
Falls are common and burdensome accidents among the elderly. About one third of the population aged 65 years or more experiences at least one fall each year. Fall risk assessment is believed to be beneficial for fall prevention. This thesis is about prognostic tools for falls for community-dwelling older adults. We provide an overview of the state of the art. We then take different approaches: we propose a theoretical probabilistic model to investigate some properties of prognostic tools for falls; we present a tool whose parameters were derived from data in the literature; we train and test a data-driven prognostic tool. Finally, we present some preliminary results on the prediction of falls through features extracted from wearable inertial sensors. Heterogeneity in validation results is expected from theoretical considerations and is observed in empirical data. Differences in study design hinder comparability and collaborative research. Given the multifactorial etiology of falls, assessment of multiple risk factors is needed in order to achieve good predictive accuracy.
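A multifactorial prognostic tool of the kind described above is often a logistic model over several risk-factor indicators. The factors, weights and intercept below are hypothetical placeholders for illustration; a real tool would derive its coefficients from cohort data or published odds ratios:

```python
import math

# Hypothetical risk factors and log-odds weights, for illustration only.
WEIGHTS = {"previous_falls": 1.1, "gait_impairment": 0.8,
           "polypharmacy": 0.5, "visual_deficit": 0.4}
INTERCEPT = -2.0

def fall_probability(factors):
    """Logistic model over multiple 0/1 risk-factor indicators,
    reflecting the multifactorial etiology of falls."""
    z = INTERCEPT + sum(WEIGHTS[k] * v for k, v in factors.items())
    return 1.0 / (1.0 + math.exp(-z))
```

A subject is then flagged as high-risk when the probability exceeds a threshold chosen from the desired sensitivity/specificity trade-off.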
Abstract:
The Standard Model of particle physics is a very successful theory which describes nearly all known processes of particle physics very precisely. Nevertheless, there are several observations which cannot be explained within the existing theory. In this thesis, two analyses with high-energy electrons and positrons using data of the ATLAS detector are presented: one probing the Standard Model of particle physics, and another searching for phenomena beyond the Standard Model. The production of an electron-positron pair via the Drell-Yan process leads to a very clean signature in the detector with low background contributions. This allows for a very precise measurement of the cross-section, which can be used as a precision test of perturbative quantum chromodynamics (pQCD), where this process has been calculated at next-to-next-to-leading order (NNLO). The invariant mass spectrum mee is sensitive to parton distribution functions (PDFs), in particular to the poorly known distribution of antiquarks at large momentum fraction (Bjorken x). The measurement of the high-mass Drell-Yan cross-section in proton-proton collisions at a center-of-mass energy of √s = 7 TeV is performed on a dataset collected with the ATLAS detector, corresponding to an integrated luminosity of 4.7 fb⁻¹. The differential cross-section of pp -> Z/gamma + X -> e+e- + X is measured as a function of the invariant mass in the range 116 GeV < mee < 1500 GeV. The background is estimated using a data-driven method and Monte Carlo simulations. The final cross-section is corrected for detector effects and different levels of final-state radiation corrections. A comparison is made to various event generators and to predictions of pQCD calculations at NNLO.
Good agreement within the uncertainties between measured cross-sections and Standard Model predictions is observed. Examples of observed phenomena which cannot be explained by the Standard Model are the amount of dark matter in the universe and neutrino oscillations. To explain these phenomena, several extensions of the Standard Model have been proposed, some of them leading to new processes with a high multiplicity of electrons and/or positrons in the final state. A model-independent search in multi-object final states, with objects defined as electrons and positrons, is performed to search for these phenomena. The dataset collected at a center-of-mass energy of √s = 8 TeV, corresponding to an integrated luminosity of 20.3 fb⁻¹, is used. The events are separated into different categories by object multiplicity. The data-driven background method already used for the cross-section measurement was developed further for up to five objects to obtain an estimate of the number of events including fake contributions. Within the uncertainties, the comparison between data and Standard Model predictions shows no significant deviations.
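Data-driven fake estimation of the kind used above typically reweights a background-enriched control region into the signal region. As a single-object illustration (a deliberate simplification of the multi-object extension the thesis develops; names and the one-factor form are assumptions), events passing a loose but failing the tight electron selection are weighted by f/(1 - f), where f is the measured misidentification rate:

```python
def fake_estimate(n_loose_not_tight, fake_rate):
    """Estimated fake-background yield in the tight signal region:
    control-region count times f / (1 - f), with f the measured
    misidentification ("fake") rate. Single-object illustration."""
    f = fake_rate
    return n_loose_not_tight * f / (1.0 - f)
```

For example, 900 loose-not-tight events with f = 0.1 predict about 100 fake events in the tight selection; the multi-object generalisation applies such weights per object for up to five electrons/positrons.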