58 results for OPNET Modeler


Relevance:

10.00%

Publisher:

Abstract:

This research aims to shed light on the nature of the social representations held by elementary school teachers within a Québec school board regarding the disciplines of the social studies domain ("univers social"), and regarding the teaching of those disciplines. The research grew out of numerous interventions in elementary classrooms as a student-teaching supervisor and out of discussions with teachers about the teaching of the social sciences. The project rests on the following observation: elementary-level learning of social studies knowledge and competencies is deficient and does not fully meet the expectations prescribed by the Ministère de l'Éducation, du Loisir et du Sport. Elementary teachers in Québec teach very little social studies content; as a result, students arrive in secondary school poorly equipped. Previous research has identified certain factors that help explain this state of affairs. We supposed, however, that other reasons could explain the phenomenon, and we believed that an analysis of teachers' social representations could contribute important information to the analysis of this problem. This analysis of social representations is based on the work and theories of the central core (Abric, 1994a). It was built around an exploratory study within a regional school board in which 21 teachers were interviewed. Using a qualitative methodology with an approach geared specifically to the education sciences (Merriam, 1998), the results of the research allow us to identify three determining factors in the formation of teachers' social representations of history, geography, and citizenship education. These factors also lead teachers to shape their pedagogical and didactic approaches to teaching social studies at the elementary level. The research has, moreover, improved our understanding of how teachers' social representations of the disciplines associated with the social sciences are formed, and has identified several factors underlying teachers' reluctance to teach this subject to their students.

Relevance:

10.00%

Publisher:

Abstract:

The Epicurean advice to avoid political participation has received many interpretations, often obscure and ill-founded. The apolitical attitude cannot be defined as a mere lack of interest in, or concern for, politics; indeed, in Pierre Hadot's view, ancient philosophy is deeply rooted in existence, and philosophical doctrines acquire importance only insofar as they assist praxis. Epicurus's attitude is thus rooted in a refusal to live according to the norms prescribed by the political establishment. In his view, traditional politics is doomed to failure because it blindly pursues power and wealth. In response to this situation, Epicurus created a community that established new values and within which it was possible to live in accordance with them. Standing in complete opposition to the city's most fundamental ways of life, Epicurus's followers, had they taken part in political life, would have provoked great hostility from the partisans of traditional values. For this reason, the Epicurean attitude may first be seen as a way of avoiding political persecution. Moreover, if it is granted that politics involves the pursuit of power, Epicureans could not engage in it, since doing so would contradict their quest for ataraxia. In any case, and independently of these two motives for withdrawing from political life, it is clear that if Epicurus's attitude had not rested on a political consciousness, then his criticisms of political life, his desire to withdraw from it, and the creation of a distinct community would never have come to be. Politics has the power to shape people's lives profoundly. Considering that this conditioning rests on unhealthy values, the Epicurean project therefore sets out to reshape, in the light of new values, the lives of those who find no satisfaction in pursuing the life of the city.

Relevance:

10.00%

Publisher:

Abstract:

The popularity of wireless local area networks (WLANs) has resulted in their dense deployment around the world. While this increases capacity and coverage, the accompanying increase in interference can severely degrade the performance of WLANs. However, the impact of interference on throughput in dense WLANs with multiple access points (APs) has received very little prior research attention. This is believed to be due to 1) the inaccurate assumption that throughput is always a monotonically decreasing function of interference and 2) the prohibitively high complexity of an accurate analytical model. In this work, we first provide a useful classification of commonly found interference scenarios. Second, we investigate the impact of interference on throughput for each class based on an approach that determines the possibility of parallel transmissions. Extensive packet-level simulations using OPNET have been performed to support the observations made. Interestingly, the results show that in some topologies increased interference can lead to higher throughput, and vice versa.
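The parallel-transmission reasoning can be sketched with a toy distance-based model (a hedged illustration: the ranges, geometry, and function names are invented for this sketch; the paper's actual classification rests on OPNET packet-level simulation, not this model):

```python
# Toy check of whether two AP-client links can transmit in parallel under a
# simple distance-based carrier-sense/interference model (illustrative only).
from math import dist

def parallel_possible(tx1, rx1, tx2, rx2, carrier_sense_range, interference_range):
    """Two transmissions can proceed in parallel only if the transmitters do
    not sense each other AND neither receiver lies inside the other
    transmission's interference range."""
    if dist(tx1, tx2) <= carrier_sense_range:
        return False  # transmitters defer to each other (CSMA/CA)
    if dist(tx1, rx2) <= interference_range:
        return False  # link 1 corrupts link 2's reception
    if dist(tx2, rx1) <= interference_range:
        return False  # link 2 corrupts link 1's reception
    return True

# Two well-separated links: parallel transmissions are possible
print(parallel_possible((0, 0), (10, 0), (100, 0), (110, 0), 50, 30))  # True
```

Under such a model, adding an interfering node can sometimes *enable* spatial reuse elsewhere, which is consistent with the counterintuitive throughput results the abstract reports.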

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study was to investigate the creation of a pedagogical "culture," with and without the influence of the teacher, a subject whose status is institutionally defined. To this end, an experimental setting was arranged with subjects assigned to Control and Experimental Groups, the latter including the participation of a teacher, previously instructed, to exert influence. Following the strategy used, the subjects carried out the Consensus Decision Exercise through interpretations and classifications, which were taken as the "cultures" created by the groups with and without the teacher's influence. The empirical data, collected and analyzed under a Phenomenological Approach, led to the conclusion that the teacher's action is decisive in influencing students' behavior and that this influence can be attributed predominantly to the status conferred on the teacher. The literature reviewed also made possible a critical analysis of the modeling role played by the teacher (Modernizing Approach), as well as of the process of reproduction of the dominant culture in this task (Alternative Approach). Thus, the empirical data and the theories analyzed led to the conclusion that the pedagogical "culture," whether created in the presence or in the absence of the teacher, is neither neutral nor apolitical.

Relevance:

10.00%

Publisher:

Abstract:

Measuring the impact of interactive digital TV application traffic on fourth-generation wireless networks, WiMAX and femtocells in particular, has been a major challenge for researchers worldwide; testing these technologies shows promise for improving the quality of service delivered by operators. First, the network traffic pattern is identified by measuring and characterizing the traffic of an interactive digital TV application of the Brazilian Digital TV System (SBTVD). From there, simulations are run over a wireless network. For this study, a WiMAX network was chosen as one of our case studies: a study of the impact of using this application on a WMAN (Wireless Metropolitan Area Network) with WiMAX, also employing femtocells. Wireless technologies naturally exhibit large variations in signal quality, so a solution is needed to reduce this signal degradation. Among the possible solutions, femtocells emerge as a viable alternative for these improvements, bearing in mind that femtocell deployment is aimed at areas where the signal is absent or very weak. Discrete-event simulation with appropriate tools such as OPNET proves very useful for assessing the viability of the existing technologies, exposing them to more adverse conditions of traffic flow, load, number of users, and distances, all of which certainly influence the performance of each technology.
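The measurement-and-characterization step can be sketched as follows (a hedged toy example: the timestamps, packet sizes, and derived statistics are invented, not the SBTVD application's measured trace):

```python
# From captured packet timestamps and sizes, derive the mean inter-arrival
# time and mean bit rate used to parameterize a traffic source in a simulator.
timestamps = [0.00, 0.02, 0.05, 0.09, 0.10]  # seconds (toy capture)
sizes      = [1500, 1500, 800, 1500, 400]    # bytes

interarrivals = [b - a for a, b in zip(timestamps, timestamps[1:])]
mean_iat = sum(interarrivals) / len(interarrivals)
bit_rate = 8 * sum(sizes) / (timestamps[-1] - timestamps[0])

print(f"mean inter-arrival: {mean_iat * 1000:.0f} ms")  # 25 ms
print(f"mean bit rate: {bit_rate / 1e3:.0f} kbit/s")    # 456 kbit/s
```

Statistics like these (often together with fitted distributions) are what a simulated traffic generator in a tool such as OPNET is configured to reproduce.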

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

10.00%

Publisher:

Abstract:

The frequent use of predictive models for the analysis of complex systems, natural or artificial, is changing the traditional approach to environmental and risk problems. The steady improvement in computer processing power facilitates the use and solution of numerical methods based on a space-time discretization, allowing predictive modeling of complex real systems that reproduces the evolution of their spatial patterns and computes the degree of precision of the simulation. In this thesis we present an application of different predictive methods (Geomatics, Neural Networks, Land Cover Modeler, and Dinamica EGO) in a test area of the Petén, Guatemala. Over recent decades this region, part of the Maya Biosphere Reserve, has experienced rapid population growth and uncontrolled pressure on its natural resources. The test area can be divided into sub-regions characterized by different land-use dynamics. Understanding and quantifying these differences yields a better approximation of the real system; it is also necessary to integrate all the physical and socio-economic parameters for a more complete representation of the complexity of human impact. Given the absence of detailed information about the study area, almost all the data were derived from the processing of 11 ETM+, TM, and SPOT images; we then performed a multi-temporal analysis of past land-use changes and built the input to feed the predictive models. The 1998 and 2000 data were used for the calibration phase to simulate the land-cover changes of 2003, chosen as the reference date for validating the results. This validation highlights the strengths and limits of each model in the different sub-regions.
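The calibration/validation logic can be sketched with toy data (a hedged illustration: the 3x3 maps, class codes, and the simple percent-agreement metric are assumptions for the sketch, not the thesis's actual validation procedure):

```python
# A model calibrated on earlier dates predicts the 2003 land-cover map;
# validation measures cell-by-cell agreement with the observed 2003 map.
predicted_2003 = [[1, 1, 2],
                  [1, 2, 2],
                  [3, 3, 2]]
observed_2003  = [[1, 1, 2],
                  [1, 1, 2],
                  [3, 3, 3]]

cells = [(p, o) for prow, orow in zip(predicted_2003, observed_2003)
                for p, o in zip(prow, orow)]
agreement = sum(p == o for p, o in cells) / len(cells)
print(f"overall agreement: {agreement:.0%}")  # 7 of 9 cells match -> 78%
```

Computing such a score per sub-region, rather than globally, is one way to expose the differences in model quality across sub-regions that the abstract emphasizes.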

Relevance:

10.00%

Publisher:

Abstract:

The research started from three fundamental hypotheses: 1) there is a link between low- and high-level cognitive processes; 2) sensorimotor space is a subjective perception; 3) sensorimotor space varies as a function of different modes of social interaction. The thesis argues that sensorimotor space is modulated by the mere co-presence of another human agent and by cooperative and non-cooperative interactions. Chapters I, II, and III break down and explain the meaning of the first, second, and third hypotheses, arriving at the central thesis, which is then demonstrated experimentally in Chapter IV. Chapter V introduces future lines of research in ethics, proposing a new hypothesis on the link that might exist between the perception of space during social interaction and moral judgments. The work brings together several disciplines that make up the cognitive sciences: the history of philosophy, contemporary philosophy of mind, experimental neuropsychology, and some themes of social psychology.

Relevance:

10.00%

Publisher:

Abstract:

This thesis is focused on Smart Grid applications in medium-voltage distribution networks. For the development of new applications, simulation tools able to model the dynamic behavior of both the power system and the communication network are useful. Such a co-simulation environment would allow assessing the feasibility of using a given network technology to support communication-based Smart Grid control schemes on an existing segment of the electrical grid, and determining the range of control schemes that different communication technologies can support. For this reason, a co-simulation platform is presented that has been built by linking the Electromagnetic Transients Program Simulator (EMTP v3.0) with a Telecommunication Network Simulator (OPNET-Riverbed v18.0). The simulator is used to design and analyze the coordinated use of Distributed Energy Resources (DERs) for voltage/var control (VVC) in distribution networks. The thesis focuses on a control structure based on the use of phasor measurement units (PMUs). In order to limit the required reinforcements of the communication infrastructures currently adopted by Distribution Network Operators (DNOs), the study concentrates on leaderless multi-agent system (MAS) schemes that do not assign special coordinating roles to specific agents. Leaderless MAS are expected to produce more uniform communication traffic than centralized approaches that include a moderator agent; moreover, they are expected to be less affected by the limitations and constraints of individual communication links. The developed co-simulator has allowed the definition of specific countermeasures against the limitations of the communication network, with particular reference to latency and loss of information, for both wired and wireless communication networks. Moreover, the co-simulation platform has also been coupled with a mobility simulator in order to study specific countermeasures against the negative effects on voltages and currents in the medium-voltage distribution network caused by the concurrent connection of electric vehicles.
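Why latency and packet loss matter for communication-based control can be illustrated with a toy model (a hedged sketch: the regulator gain, step size, delay, and loss probability are invented for illustration and bear no relation to the platform's actual EMTP/OPNET models):

```python
# Toy discrete-time voltage regulator whose measurements arrive with a fixed
# network delay and may be dropped: the controller always acts on stale data.
import random

def simulate(steps, delay, loss_prob, gain=0.5, target=1.0, seed=1):
    random.seed(seed)
    voltage, control = 1.05, 0.0
    in_flight = []  # measurements in the network: (arrival_step, value)
    history = []
    for t in range(steps):
        if random.random() > loss_prob:              # packet survives the network
            in_flight.append((t + delay, voltage))   # delivered `delay` steps later
        arrived = [v for (due, v) in in_flight if due == t]
        if arrived:
            control = gain * (target - arrived[-1])  # act on a stale measurement
        voltage += control * 0.1
        history.append(voltage)
    return history

print(simulate(steps=50, delay=5, loss_prob=0.2)[-1])  # settles near the 1.0 target
```

With a small gain the loop still converges despite delay and loss; raising the gain or the delay in this toy model makes it oscillate, which is the kind of sensitivity a co-simulator quantifies for realistic network conditions.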

Relevance:

10.00%

Publisher:

Abstract:

The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU" lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables. Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model, and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is in defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature. In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. 
The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data is represented by the standard one value per variable paradigm and is widely employed in a host of clinical models and tools. These are often represented by a number present in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The second two classes are unique to the time series data elements. The first of these is the raw data elements. These are represented by multiple values per variable, and constitute the measured observations that are typically available to end users when they review time series data. These are often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood that a representation of the time series data elements is produced that is able to distinguish between two or more classes of outcomes. The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU" provides a detailed description, start to finish, of the methods required to prepare the data, build, and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time series based models are infeasible due to the relatively large number of data elements and the complexity of preprocessing that must occur before data can be presented to the model.
Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies of each of the steps, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances. The final manuscript, entitled: "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit" presents the results that were obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the Receiver Operating Characteristic curve increased from a baseline of 87% to 98% by including the trend analysis. 
In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy as compared to the baseline multivariate model, but diminished classification accuracy as compared to when just the trend analysis features were added (i.e., without adding the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve the performance beyond that which was achieved by exclusion of the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
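The kind of trend-analysis operation described above can be sketched as follows (a hedged illustration: the window contents and the choice of a least-squares slope are assumptions for the sketch, not the authors' exact preprocessing):

```python
# Derive a single trend-analysis latent feature from a raw time series window:
# the least-squares slope of equally spaced observations, summarizing
# deterioration over the window.
def trend_slope(values):
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# A steadily falling oxygen-saturation window yields a negative slope
print(trend_slope([98, 97, 96, 94, 93]))  # -1.3
```

A feature like this collapses a multi-valued raw series into one value per window, which is what lets trend information enter a model alongside traditional single-value multivariate features.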

Relevance:

10.00%

Publisher:

Abstract:

Over recent years, more and more companies have demanded technology for monitoring and analyzing vast volumes of data in real time. In this regard, CEP (Complex Event Processing) technology has emerged as a powerful approach to that end, and its use has increased dramatically in certain domains, such as the management and automation of business processes, finance, monitoring of networks and applications, and smart sensor networks, the case study on which we focus. CEP is based on an Event Processing Language (EPL), which can be rather difficult to use for inexperienced users. This complexity is a handicap, and therefore a problem, when it comes to extending its use. This final-year project (Proyecto Fin de Grado, PFG) aims to provide a solution to this problem by bringing CEP technology closer to users through abstraction and modeling techniques. To that end, the project defines a simple, intuitive domain-specific modeling language for inexperienced users, supported by a graphical modeling tool (CEP Modeler) in which CEP queries can be modeled graphically, simply, and in a way that is more accessible to the user.
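As a rough illustration of the kind of pattern an EPL query captures (a hedged sketch: the event schema, window size, and threshold are invented here, and real CEP engines express this declaratively rather than in imperative code):

```python
# A minimal imperative rendering of a common CEP pattern: "fire an alert when
# the average reading over the last N events exceeds a threshold."
from collections import deque

class AvgWindowRule:
    def __init__(self, size, threshold):
        self.window = deque(maxlen=size)  # sliding window of recent events
        self.threshold = threshold

    def on_event(self, reading):
        """Feed one event; return True when the windowed average fires."""
        self.window.append(reading)
        return (len(self.window) == self.window.maxlen
                and sum(self.window) / len(self.window) > self.threshold)

rule = AvgWindowRule(size=3, threshold=30.0)
print([rule.on_event(t) for t in [25, 31, 33, 35]])  # [False, False, False, True]
```

A graphical modeling tool such as the one described can let users compose this kind of window-plus-condition pattern visually and generate the corresponding EPL query from it.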

Relevance:

10.00%

Publisher:

Abstract:

Ecological models have become a key element of ecology. Knowledge is generated largely through more or less complex analytical processes applied to diverse datasets. But much of the knowledge needed to design and implement those models is not accessible to the scientific community. We propose the creation of software tools to document, store, and run ecological models and workflows. Such tools (model repositories) are being developed in other disciplines, such as molecular biology and the Earth sciences. We present a model repository (ModeleR) developed in the context of the Sierra Nevada Global Change Monitoring Observatory (Granada-Almería, Spain). We believe that model repositories will foster cooperation among scientists, improving the creation of relevant knowledge that can be transferred to decision makers.

Relevance:

10.00%

Publisher:

Abstract:

We performed a quantitative comparison of brittle thrust wedge experiments to evaluate the variability among analogue models and to appraise the reproducibility and limits of model interpretation. Fifteen analogue modeling laboratories participated in this benchmark initiative. Each laboratory received a shipment of the same type of quartz and corundum sand and all laboratories adhered to a stringent model building protocol and used the same type of foil to cover base and sidewalls of the sandbox. Sieve structure, sifting height, filling rate, and details on off-scraping of excess sand followed prescribed procedures. Our analogue benchmark shows that even for simple plane-strain experiments with prescribed stringent model construction techniques, quantitative model results show variability, most notably for surface slope, thrust spacing and number of forward and backthrusts. One of the sources of the variability in model results is related to slight variations in how sand is deposited in the sandbox. Small changes in sifting height, sifting rate, and scraping will result in slightly heterogeneous material bulk densities, which will affect the mechanical properties of the sand, and will result in lateral and vertical differences in peak and boundary friction angles, as well as cohesion values once the model is constructed. Initial variations in basal friction are inferred to play the most important role in causing model variability. Our comparison shows that the human factor plays a decisive role, and even when one modeler repeats the same experiment, quantitative model results still show variability. Our observations highlight the limits of up-scaling quantitative analogue model results to nature or for making comparisons with numerical models. 
The frictional behavior of sand is highly sensitive to small variations in material state or experimental set-up, and hence, it will remain difficult to scale quantitative results such as number of thrusts, thrust spacing, and pop-up width from model to nature.