960 results for Ghost reflection analysis


Relevance: 30.00%

Abstract:

The delay caused by reflected rays has a strong influence on broadband communication in subway tunnels. This paper presents measurements taken in subway tunnels at 2.4 GHz with 5 MHz bandwidth. In accordance with the propagation characteristics of tunnels, the measurements were carried out with a frequency-domain channel sounding technique in three typical scenarios: line of sight (LOS), non-line-of-sight (NLOS) and far line of sight (FLOS), which lead to different delay distributions. First, the IFFT was used to obtain the channel impulse response (CIR) h(t) from the measured three-dimensional transfer functions. The power delay profile (PDP) was then investigated to give an overview of the broadband channel model. Thereafter, a long delay caused by the obturation of the tunnel is observed and investigated in all scenarios. The measurements show that reflections are strongly retained by the tunnel, which leads to a long-delay cluster in which the reflections, rather than the direct ray, make the main contribution to radio wave propagation. Four important parameters were studied to give a detailed description of the long-delay characteristics in tunnels: the distribution of the whole PDP power, the first-peak arrival time, the reflection-cluster duration, and the PDP power distribution of the reflection cluster. These results can be used to ensure high-capacity communication in tunnels.
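For orientation, here is a minimal sketch (not the authors' code) of the processing chain described above: an IFFT turns a frequency-domain transfer function into the channel impulse response h(t), from which the power delay profile and quantities such as the strongest-arrival delay and total PDP power are computed. The synthetic two-path channel, the number of frequency samples and the path delays are illustrative assumptions, not the measured 2.4 GHz data.

```python
# Sketch: frequency-domain transfer function -> CIR via IFFT -> power delay profile.
import numpy as np

bandwidth_hz = 5e6                        # sounding bandwidth, as in the paper
n_points = 401                            # number of frequency samples (assumed)
freqs = np.linspace(-bandwidth_hz / 2, bandwidth_hz / 2, n_points)

# Synthetic two-path channel: a direct ray plus a delayed, attenuated reflection.
delays_s = [0.2e-6, 1.5e-6]               # hypothetical path delays [s]
gains = [1.0, 0.6]                        # hypothetical path amplitudes
H = sum(g * np.exp(-2j * np.pi * freqs * d) for g, d in zip(gains, delays_s))

# Channel impulse response h(t): IFFT of the (re-ordered) transfer function.
h = np.fft.ifft(np.fft.ifftshift(H))
delay_axis = np.arange(n_points) / bandwidth_hz   # delay bin width ~ 1 / bandwidth

# Power delay profile and two of the parameters discussed in the abstract.
pdp = np.abs(h) ** 2
strongest_delay = delay_axis[np.argmax(pdp)]
print(f"strongest arrival at {strongest_delay * 1e6:.2f} us, total PDP power {pdp.sum():.3f}")
```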

Relevance: 30.00%

Abstract:

El Análisis de Consumo de Recursos o Análisis de Coste trata de aproximar el coste de ejecutar un programa como una función dependiente de sus datos de entrada. A pesar de que existen trabajos previos a esta tesis doctoral que desarrollan potentes marcos para el análisis de coste de programas orientados a objetos, algunos aspectos avanzados, como la eficiencia, la precisión y la fiabilidad de los resultados, todavía deben ser estudiados en profundidad. Esta tesis aborda estos aspectos desde cuatro perspectivas diferentes: (1) Las estructuras de datos compartidas en la memoria del programa son una pesadilla para el análisis estático de programas. Trabajos recientes proponen una serie de condiciones de localidad para poder mantener de forma consistente información sobre los atributos de los objetos almacenados en memoria compartida, reemplazando éstos por variables locales no almacenadas en la memoria compartida. En esta tesis presentamos dos extensiones a estos trabajos: la primera es considerar, no sólo los accesos a los atributos, sino también los accesos a los elementos almacenados en arrays; la segunda se centra en los casos en los que las condiciones de localidad no se cumplen de forma incondicional, para lo cual, proponemos una técnica para encontrar las precondiciones necesarias para garantizar la consistencia de la información acerca de los datos almacenados en memoria. (2) El objetivo del análisis incremental es, dado un programa, los resultados de su análisis y una serie de cambios sobre el programa, obtener los nuevos resultados del análisis de la forma más eficiente posible, evitando reanalizar aquellos fragmentos de código que no se hayan visto afectados por los cambios. Los analizadores actuales todavía leen y analizan el programa completo de forma no incremental. Esta tesis presenta un análisis de coste incremental, que, dado un cambio en el programa, reconstruye la información sobre el coste del programa de todos los métodos afectados por el cambio de forma incremental. Para esto, proponemos (i) un algoritmo multi-dominio y de punto fijo que puede ser utilizado en todos los análisis globales necesarios para inferir el coste, y (ii) una novedosa forma de almacenar las expresiones de coste que nos permite reconstruir de forma incremental únicamente las funciones de coste de aquellos componentes afectados por el cambio. (3) Las garantías de coste obtenidas de forma automática por herramientas de análisis estático no son consideradas totalmente fiables salvo que la implementación de la herramienta o los resultados obtenidos sean verificados formalmente. Llevar a cabo el análisis de estas herramientas es una tarea titánica, ya que se trata de herramientas de gran tamaño y complejidad. En esta tesis nos centramos en el desarrollo de un marco formal para la verificación de las garantías de coste obtenidas por los analizadores en lugar de analizar las herramientas. Hemos implementado esta idea mediante la herramienta COSTA, un analizador de coste para programas Java y KeY, una herramienta de verificación de programas Java. De esta forma, COSTA genera las garantías de coste, mientras que KeY prueba la validez formal de los resultados obtenidos, generando de esta forma garantías de coste verificadas. (4) Hoy en día la concurrencia y los programas distribuidos son clave en el desarrollo de software. Los objetos concurrentes son un modelo de concurrencia asentado para el desarrollo de sistemas concurrentes. 
En este modelo, los objetos son las unidades de concurrencia y se comunican entre ellos mediante llamadas asíncronas a sus métodos. La distribución de las tareas sugiere que el análisis de coste debe inferir el coste de los diferentes componentes distribuidos por separado. En esta tesis proponemos un análisis de coste sensible a objetos que, utilizando los resultados obtenidos mediante un análisis de apunta-a, mantiene el coste de los diferentes componentes de forma independiente.

Abstract: Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions of their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects concerning the efficiency, the accuracy and the reliability of the results of the analysis still need to be investigated further. This thesis tackles this need from the following four perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first extension is to consider array accesses (in addition to object fields), while the second extension focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to (re-)analyze fragments of code that are not affected by the changes. During software development, programs are modified constantly, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods in an incremental way. For this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all the global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of the cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA is able to derive upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed cooperation between the tools can be used to automatically produce verified resource guarantees. (4) Distribution and concurrency are mainstream today. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units that communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, can keep the cost of the diverse distributed components separate.
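As a rough illustration of the incremental fixed-point idea in point (2), the sketch below re-analyzes only the methods affected by a change, propagating updated summaries to callers until a fixed point is reached. The toy call graph, the single-number "cost summary" domain and all names are hypothetical simplifications; COSTA's actual abstract domains and cost expressions are far richer.

```python
# Hypothetical sketch of incremental re-analysis over a toy call graph: only methods
# whose results may change are pushed onto a worklist and recomputed to a fixed point.
from collections import defaultdict

call_graph = {"main": ["f", "g"], "f": ["g"], "g": []}   # caller -> callees (toy program)
local_cost = {"main": 1, "f": 2, "g": 3}                  # cost of each method's own body
summary = defaultdict(int)                                # cost summaries (toy domain: a number)

callers = defaultdict(set)                                # reverse edges: who must be revisited
for caller, callees in call_graph.items():
    for callee in callees:
        callers[callee].add(caller)

def reanalyze(changed):
    """Recompute summaries starting only from the changed methods."""
    worklist = set(changed)
    while worklist:
        m = worklist.pop()
        new = local_cost[m] + sum(summary[c] for c in call_graph[m])
        if new != summary[m]:            # result changed: propagate to the callers of m
            summary[m] = new
            worklist |= callers[m]

reanalyze(call_graph)        # initial whole-program analysis
local_cost["g"] = 10         # a change is made to method g ...
reanalyze({"g"})             # ... so only g and its transitive callers are revisited
print(dict(summary))         # main: 23, f: 12, g: 10
```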

Relevance: 30.00%

Abstract:

The Negative Refractive Lens (NRL) has shown that an optical system can produce images with details below the classic Abbe diffraction limit using materials with negative dielectric and magnetic constants. Recently, two devices with positive refraction, the Maxwell Fish Eye lens (MFE) (Leonhardt et al 2000) and the Spherical Geodesic Waveguide (SGW) (Minano et al 2011), have been claimed to break the diffraction limit using positive refraction, although in a different sense. In these cases, what has been considered is the power transmission from a point source to a point receptor, which falls drastically when the receptor is displaced from the focus by a distance much smaller than the wavelength. Moreover, a recent analysis of the SGW with defined object and image surfaces, both conical sections of the sphere, has shown that the system transmits images below the diffraction limit. The key assumption is the use of a perfectly absorbing receptor called a perfect drain, which is capable of absorbing all the radiation without reflection or scattering. Here, a COMSOL analysis of the SGW is presented using a perfect drain that perfectly absorbs two modes. A design procedure for perfect drains capable of absorbing k modes is proposed as well.

Relevance: 30.00%

Abstract:

Analysis of wave attenuation by a bulk carrier acting as a floating breakwater, with application to two cases of harbour and coastal protection. The effectiveness of a bulk carrier working as a detached floating breakwater to protect a stretch of coast and form salients or tombolos is assessed in this paper. Experiments were conducted in the Madrid CEDEX facilities in a 30 m long, 3 m wide, 1/150 scale flume. The bulk carrier is 205 m long, 29 m wide and 18 m in height with a draught of 13 m, and was subjected to irregular waves with significant heights from 2 m to 4 m and peak periods from 6 s to 12 s at a depth of 15 m, all prototype dimensions. Three probes were placed between the wave paddle and the ship to record the incident and reflected waves, and four probes were placed between the ship and the coastline to measure the transmitted waves. Transmission, reflection and dissipation coefficients (Ct, Cr, Cd) were calculated to determine wave attenuation. Results show good shelter in the lee of the ship, with values of Ct under 0.5 for peak periods from 6 s to 11 s. In addition, forces on the mooring chains were measured, showing maximum values of about 2000 tons at a 10 s peak period. Finally, two analytical models were used to determine the shoreline's response to the ship's protection and to assess the possible formation of salients or tombolos. According to the results, salients, but not tombolos, are formed in all tests.
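As a pointer to how the reported coefficients are typically derived, the sketch below computes Ct, Cr and Cd from incident, reflected and transmitted significant wave heights, with dissipation obtained from the energy balance Cd = sqrt(1 - Ct^2 - Cr^2). The wave heights are made-up values, not the CEDEX measurements.

```python
# Sketch: wave attenuation coefficients from significant wave heights (illustrative values).
import math

Hi, Hr, Ht = 3.0, 1.2, 1.3       # hypothetical incident, reflected and transmitted heights [m]

Ct = Ht / Hi                     # transmission coefficient
Cr = Hr / Hi                     # reflection coefficient
Cd = math.sqrt(max(0.0, 1.0 - Ct**2 - Cr**2))   # dissipation coefficient, from the energy balance

print(f"Ct = {Ct:.2f}, Cr = {Cr:.2f}, Cd = {Cd:.2f}")   # Ct below 0.5 indicates good shelter
```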

Relevance: 30.00%

Abstract:

Se ha utilizado un programa de modelización de ondas sísmicas por métodos finitos en dos dimensiones para analizar el efecto Source Ghost en profundidades de 4, 14, 24 y 34 metros. Este efecto se produce cuando se dispara una fuente enterrada y, debido al contacto suelo-aire, se genera una onda reflejada que, en cierto momento, se superpone con la onda principal, produciéndose una disminución de la amplitud de la onda (Source Ghost). Los resultados teóricos del efecto se han comparado con los resultados prácticos del programa de modelización, concluyéndose que es posible determinar el rango de frecuencias afectado por el efecto. Sin embargo, la distancia entre receptor y fuente es una nueva variable que desplaza el efecto hacia frecuencias más altas, impidiendo su predicción. La utilización de una técnica de procesamiento básica, como la corrección del Normal Move-Out (NMO) en el apilado de las trazas, contrarresta la variable distancia receptor-fuente, y por tanto es posible calcular el rango de frecuencias del efecto Source Ghost.

Abstract: A two-dimensional finite-difference forward modeling program for seismic waves has been used to analyze the Source Ghost effect at depths of 4, 14, 24 and 34 meters. A shot from a buried source generates a downgoing reflection due to the free-surface boundary which, at some point, interferes with the main wave propagation, causing a reduction of wave amplitude in a certain frequency range (the Source Ghost). Theoretical results and the experimental results provided by the forward modeling are compared, leading to the conclusion that the forward modeling is able to identify the frequency range affected by the source ghost. Nevertheless, it has been found that the receiver-source distance (offset) is an additional variable that shifts the affected frequency range, making it unpredictable. A basic seismic processing technique, the Normal Move-Out (NMO) correction, has been applied to a single twenty-fold CMP gather. The final stack shows that this processing step neutralizes the offset effect, and therefore the forward modeling is still capable of determining the frequency range affected by the source ghost regardless of the distance between receiver and source.
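The sketch below illustrates the standard ghost-notch model underlying this analysis: a buried source and its polarity-reversed free-surface reflection interfere destructively at multiples of v/(2d) for vertical propagation, and a non-zero offset shortens the ghost delay, pushing the notches to higher frequencies. The near-surface velocity is an assumed value, not taken from the study.

```python
# Sketch: ghost-notch frequencies for a buried source below a free surface.
v = 2000.0                             # assumed near-surface P-wave velocity [m/s]
depths_m = [4.0, 14.0, 24.0, 34.0]     # source depths analysed in the study

for d in depths_m:
    tau = 2.0 * d / v                  # two-way ghost delay for vertical propagation [s]
    first_notch_hz = 1.0 / tau         # notches at multiples of v / (2 d)
    print(f"depth {d:>4.0f} m: ghost delay {tau * 1e3:.1f} ms, first notch {first_notch_hz:.0f} Hz")

# A non-zero source-receiver offset shortens the ghost delay (2 * d * cos(theta) / v),
# shifting the notches to higher frequencies; this offset dependence is what the NMO
# correction and stacking counteract in the study above.
```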

Relevance: 30.00%

Abstract:

Two objects with homologous landmarks are said to be of the same shape if the configuration of landmarks of one object can be exactly matched with that of the other by translation, rotation/reflection, and scaling. In an earlier paper, the authors proposed statistical analysis of shape by considering logarithmic differences of all possible Euclidean distances between landmarks. Tests of significance for differences in the shape of objects and methods of discrimination between populations were developed with such data. In the present paper, the corresponding statistical methodology is developed by triangulation of the landmarks and by considering the angles as natural measurements of shape. This method is applied to the study of sexual dimorphism in hominids.
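As a small illustration of the two representations mentioned above, the sketch below computes the logarithms of all pairwise Euclidean distances and the interior angles of a triangulated three-landmark configuration; the landmark coordinates are invented for the example and are not from the hominid data.

```python
# Sketch: log pairwise distances and triangulated angles for a toy landmark configuration.
import itertools
import numpy as np

landmarks = np.array([[0.0, 0.0],
                      [4.0, 0.0],
                      [1.0, 3.0]])     # hypothetical homologous landmarks (2-D)

# Logarithms of all pairwise Euclidean distances (invariant to translation, rotation
# and reflection; a change of scale only adds a common constant).
log_dists = {
    (i, j): float(np.log(np.linalg.norm(landmarks[i] - landmarks[j])))
    for i, j in itertools.combinations(range(len(landmarks)), 2)
}

# Interior angles of the triangle formed by the landmarks (also invariant to scale).
def angle_deg(k, i, j):
    u = landmarks[i] - landmarks[k]
    w = landmarks[j] - landmarks[k]
    return float(np.degrees(np.arccos(np.dot(u, w) / (np.linalg.norm(u) * np.linalg.norm(w)))))

angles = []
for k in range(3):
    i, j = [m for m in range(3) if m != k]
    angles.append(angle_deg(k, i, j))

print(log_dists)
print(angles, "sum:", sum(angles))     # the three interior angles sum to 180 degrees
```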

Relevance: 30.00%

Abstract:

Two objects with homologous landmarks are said to be of the same shape if the configuration of landmarks of one object can be exactly matched with that of the other by translation, rotation/reflection, and scaling. The observations on an object are the coordinates of its landmarks with reference to a set of orthogonal coordinate axes in a space of appropriate dimension. The origin, choice of units, and orientation of the coordinate axes with respect to an object may differ from object to object. In such a case, how do we quantify the shape of an object, find the mean and variation of shape in a population of objects, compare the mean shapes in two or more different populations, and discriminate between objects belonging to two or more different shape distributions? We develop methods that are invariant to translation, rotation, and scaling of the observations on each object and thereby provide generalizations of multivariate methods for shape analysis.
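The sketch below shows one standard way to obtain a comparison that is invariant to translation, rotation/reflection and scaling: center each configuration, scale it to unit centroid size and align it with the best orthogonal transformation (an ordinary Procrustes superimposition). This is an illustration of the invariance requirement, not the methods developed in the paper.

```python
# Sketch: ordinary Procrustes superimposition as one way to compare landmark
# configurations independently of translation, rotation/reflection and scaling.
import numpy as np

def normalize(config):
    """Center the landmarks and scale the configuration to unit centroid size."""
    centered = config - config.mean(axis=0)
    return centered / np.linalg.norm(centered)

def procrustes_distance(a, b):
    """Residual distance after the best orthogonal (rotation/reflection) alignment."""
    a, b = normalize(a), normalize(b)
    u, _, vt = np.linalg.svd(b.T @ a)   # optimal alignment from the cross-covariance matrix
    rotation = u @ vt
    return float(np.linalg.norm(a - b @ rotation))

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
same_shape = 3.0 * square @ np.array([[0.0, -1.0], [1.0, 0.0]]) + 5.0   # scaled, rotated, shifted
print(procrustes_distance(square, same_shape))   # ~0: the two configurations have the same shape
```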

Relevance: 30.00%

Abstract:

In this work we present an analysis of non-slanted reflection gratings by using the exact solution, in terms of Mathieu functions, of the second-order differential equation derived from Maxwell's equations. The results obtained with this method are compared to those obtained with the well-known Kogelnik Coupled Wave Theory, which predicts with great accuracy the efficiency of the zero and first orders for volume phase gratings, both in reflection and in transmission.
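For reference, the sketch below evaluates the Kogelnik coupled-wave prediction against which the Mathieu-function solution is compared: for a lossless, non-slanted reflection volume grating replayed at the Bragg condition, the first-order efficiency is eta = tanh^2(pi * n1 * d / (lambda * cos(theta))). The grating parameters are illustrative assumptions, not values from the paper.

```python
# Sketch: Kogelnik's efficiency for a lossless, non-slanted reflection volume grating
# at the Bragg condition (illustrative parameter values).
import math

n1 = 0.015              # assumed refractive-index modulation
d = 10e-6               # assumed grating thickness [m]
wavelength = 633e-9     # assumed replay wavelength [m]
theta = 0.0             # propagation angle inside the medium at Bragg incidence [rad]

nu = math.pi * n1 * d / (wavelength * math.cos(theta))
efficiency = math.tanh(nu) ** 2         # eta = tanh^2(nu) for the reflected first order
print(f"Kogelnik first-order efficiency at Bragg: {efficiency:.3f}")
```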

Relevance: 30.00%

Abstract:

The extent to which the surface parameters of Progressive Addition Lenses (PALs) affect successful patient tolerance was investigated. Several optico-physical evaluation techniques were employed, including a newly constructed surface reflection device which was shown to be of value for assessing semi-finished PAL blanks. Detailed physical analysis was undertaken using a computer-controlled focimeter, and from these data iso-cylindrical and mean spherical plots were produced for each PAL studied. Base curve power was shown to have little impact upon the distribution of PAL astigmatism. An increase in reading addition power primarily caused a lengthening and narrowing of the lens progression channel. Empirical measurements also indicated a marginal steepening of the progression power gradient with an increase in reading addition power. A sample of the PAL-wearing population was studied using patient records and questionnaire analysis (90% of questionnaires were returned). This subjective analysis revealed the reading portion to be the most troublesome lens zone and showed that patients with high astigmatism (> 2.00D) adapt more readily to PALs than those with spherical or low cylindrical (≤ 2.00D) corrections. The psychophysical features of PALs were then investigated. Both grating visual acuity (VA) and contrast sensitivity (CS) were shown to be reduced with an increase in eccentricity from the central umbilical line. Two sample populations (N = 20) of successful and unsuccessful PAL wearers were assessed for differences in their visual performance and their adaptation to optically induced distortion. The possibility of dispensing errors being the cause of poor patient tolerance amongst the unsuccessful wearer group was investigated and discounted. The contrast sensitivity of the successful group was significantly greater than that of the unsuccessful group. No differences in adaptation to or detection of curvature distortion were found between these presbyopic groups.

Relevance: 30.00%

Abstract:

Although there has been increased interest in the use of electronic portfolios in higher education over the last five years, relatively little is known about the potential of such tools to support the development of higher-order abilities for students, such as reflection, in a structured way that is suitable for assessment. This paper reports the findings from a small-scale study which set out to compare the outcomes of reflective assignments in two cohorts of participants in a Postgraduate Certificate in Professional Practice in Higher Education in the UK. Participants in the programme were asked to submit reflective accounts using an e-portfolio system as part of their formal assessment. One cohort completed the assessment using generic guidelines on how to reflect and construct an e-portfolio page, without a given template or structure, whereas the other cohort was given a specific template with clear assessment criteria to gauge the assembly of their reflections. The authors, who are also tutors in the programme, analysed the submitted reflections following open coding procedures. The analysis found a tendency for the reflections in the first cohort to be merely descriptive, without progressing to objective speculation about answers to relevant analytical questions about the process involved in the ability under scrutiny. In contrast, the assignments of the second cohort were found to be more insightful in terms of assimilating random bits of material, thoughts and self-questions into complete reflective accounts. These findings provide some evidence to support, and indeed promote, a more structured approach to reflective practice, which can be further enhanced through a carefully created e-portfolio template and associated assessment criteria.

Relevance: 30.00%

Abstract:

Purpose - It is important to advance operations management (OM) knowledge while being mindful of the theoretical developments of the discipline. The purpose of this paper is to explore which theoretical perspectives have dominated the OM field. This analysis allows the authors to identify theory trends and gaps in the literature and to identify fruitful areas for future research. A reflection on theory is also practical, given that it guides research toward important questions and enlightens OM practitioners. Design/methodology/approach - The authors provide an analysis of OM theory developments in the last 30 years. The study encompasses three decades of OM publications across three OM journals and contains an analysis of over 3,000 articles, so as to identify which theories, over time, have been adopted by authors in order to understand OM topics. Findings - The authors find that the majority of studies are atheoretical, empirical, and focussed upon theory testing rather than on theory development. Some theories, such as the resource-based view and contingency theory, have an enduring relevance within OM. The authors also identify theories from psychology, economics, sociology, and organizational behavior that may, in the future, have salience to explain burgeoning OM research areas such as servitization and sustainability. Research limitations/implications - The study makes a novel contribution by exploring which main theories have been adopted or developed in OM, doing so by systematically analyzing articles from the three main journals in the field (the Journal of Operations Management, Production and Operations Management, and the International Journal of Operations and Production Management), which encompass three decades of OM publications. Because the study was focused on these three journals, the authors may have missed important OM articles published elsewhere. Practical implications - A reflection on theories is important because theories inform how a researcher or practicing manager interprets and solves OM problems. This study allows the authors to reflect on the collective OM journey to date, to spot trends and gaps in the literature, and to identify fruitful areas for future research. Originality/value - As far as the authors are aware, there has not previously been an assessment of the main theoretical perspectives in OM. The research also identifies which topics are published in OM journals, and which theories are adopted to investigate them. The authors also reflect on whether the most cited papers and those winning best paper awards are theoretical. This gives the authors a richer understanding of the current state of OM research.

Relevance: 30.00%

Abstract:

Science professional development, which is fundamental to science education improvement, has been described as being weak and fragmentary. The purpose of this study was to investigate teachers' perceptions of informal science professional development to gain an in-depth understanding of the essence of the phenomenon and related science-teaching dispositions. Based on the frameworks of phenomenology, constructivism, and adult learning theory, the focus was on understanding how the phenomenon was experienced within the context of teachers' everyday world. Data were collected from eight middle-school teachers purposefully selected because they had participated in informal programs during Project TRIPS (Teaching Revitalized Through Informal Programs in Science), a collaboration between the Miami-Dade school district, government agencies (including NASA), and non-profit organizations (including Audubon of Florida). In addition, the teachers experienced hands-on labs offered through universities (including the University of Arizona), field sites, and other agencies. The study employed Seidman's (1991) three-interview series to collect the data. Several methods were used to enhance the credibility of the research, including triangulation of the data. The interviews were transcribed, color-coded and organized into six themes that emerged from the data. The themes included: (a) internalized content knowledge, (b) correlated hands-on activities, (c) enhanced science-teaching disposition, (d) networking/camaraderie, (e) change of context, and (f) acknowledgment as professionals. The teachers identified supportive elements and constraints related to each theme. The results indicated that informal programs offering experiential learning opportunities strengthened understanding of content knowledge. Teachers implemented hands-on activities that were explicitly correlated to their curriculum. Programs that were conducted in a relaxed context enhanced teachers' science-teaching dispositions. However, a lack of financial and administrative support, perceived safety risks, insufficient reflection time, and unclear itineraries impeded program implementation. The results illustrated how informal educators can use this cohesive model as they develop programs that address the supports and constraints related to teachers' science instruction needs. This, in turn, can aid teachers as they strive to provide effective science instruction to students, a notion embedded in reforms. Ultimately, this can affect how learners develop the ability to make informed science decisions that impact the quality of life on a global scale.

Relevance: 30.00%

Abstract:

The purpose of the study was to examine the relationship between teacher beliefs and actual classroom practice in early literacy instruction. Conjoint analysis was used to measure teachers' beliefs on four early literacy factors: phonological awareness, print awareness, graphophonic awareness, and structural awareness. A collective case study format was then used to measure the correspondence of teachers' beliefs with their actual classroom practice. Ninety Project READS participants were given twelve cards in an orthogonal experimental design describing students who either met or did not meet criteria on the four early literacy factors. Conjoint measurements of whether the student is an efficient reader were taken. These measurements provided relative importance scores for each respondent. Based on the relative importance scores, four teachers were chosen to participate in a collective case study. The conjoint results enabled the clustering of teachers into four distinct groups, each aligned with one of the four early literacy factors. K-means cluster analysis of the relative importance measurements showed commonalities among the ninety respondents' beliefs. The collective case study results were mixed. Implications for researchers and practitioners include the use of conjoint analysis in measuring teacher beliefs on the four early literacy factors. Further, understanding teacher preferences regarding these beliefs may assist in the development of curriculum design and therefore increase educational effectiveness. Finally, comparisons between teachers' beliefs on the four early literacy factors and their actual instructional practices may facilitate teacher self-reflection, thus encouraging positive teacher change.
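As a hypothetical illustration of the clustering step described above, the sketch below applies k-means (k = 4) to synthetic conjoint relative-importance scores for the four early literacy factors; the data are randomly generated, not the Project READS responses, and scikit-learn is assumed to be available.

```python
# Sketch: k-means clustering of synthetic conjoint relative-importance scores
# (90 respondents x 4 early literacy factors).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
factors = ["phonological", "print", "graphophonic", "structural"]
scores = 100 * rng.dirichlet([2, 2, 2, 2], size=90)   # rows sum to 100 (percent importances)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scores)
for label in range(4):
    centroid = kmeans.cluster_centers_[label]
    size = int((kmeans.labels_ == label).sum())
    print(f"cluster {label} (n={size}): dominant factor = {factors[centroid.argmax()]}")
```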

Relevance: 30.00%

Abstract:

The purpose of this mixed-methods study was to understand physics Learning Assistants' (LAs) views on reflective teaching, expertise in teaching, and the LA program teaching experience, and to determine whether those views predicted the level of reflection evident in their writing. Interviews were conducted in Phase One, Q methodology was used in Phase Two, and the level of reflection in participants' writing was assessed in Phase Three using a rubric based on Hatton and Smith's (1995) "Criteria for the Recognition of Evidence for Different Types of Reflective Writing". Interview analysis revealed varying perspectives on content knowledge, pedagogical knowledge, and experience in relation to expertise in teaching. Participants revealed that they engaged in reflection on their teaching, believed reflection helps teachers improve, and found peer reflection beneficial. Participants believed teaching experience in the LA program provided preparation for teaching, but that more preparation was needed to teach. Three typologies emerged in Phase Two. Type One LAs found participation in the LA program rewarding and believed expertise in teaching does not require expertise in content or pedagogy but develops over time from reflection. Type Two LAs valued reflection but not writing reflections; they felt the LA program teaching experience helped them decide on non-teaching careers and helped them confront gaps in their physics knowledge. Type Three LAs valued reflection, believed expertise in content and pedagogy is necessary for expert teaching, and felt the LA program teaching experience increased their likelihood of becoming teachers but did not prepare them for teaching. Writing assignments submitted in Phase Three were categorized as 19% descriptive writing, 60% descriptive reflections, and 21% dialogic reflections. No assignments were categorized as critical reflection. Using ordinal logistic regression, the typologies that emerged in Phase Two were not found to be predictors of the level of reflection evident in the writing assignments. In conclusion, the viewpoints of physics LAs were revealed, typologies among them were discovered, and their writing gave evidence of their ability to reflect on teaching. These findings may benefit faculty and staff in the LA program by helping them better understand the views of physics LAs and how to assess their various forms of reflection.

Relevance: 30.00%

Abstract:

This article considers the opportunities for civilians to peacefully resist violent conflicts or civil wars. The argument developed here is based on field research on the peace community of San José de Apartadó in Colombia. The analytical and theoretical framework, which delimits the use of the term ‘resistance’ in this article, builds on the conceptual considerations of Hollander and Einwohner (2004) and on the theoretical concept of ‘rightful resistance’ developed by O’Brien (1996). Beginning with a conflict-analytical classification of the case study, we describe the long-term socio-historical processes and the organizational experiences of the civilian population which favoured the emergence of this resistance initiative. The analytical approach to the dimensions and aims of the resistance of this peace community leads to a differentiation of O’Brien’s concept of ‘rightful resistance’.