Abstract:
Experimental software engineering includes several processes, the most representative being running experiments, running replications and synthesizing the results of multiple replications. Of these processes, only the first is relatively well established in software engineering. Problems of information management and communication among researchers are among the obstacles to progress in the replication and synthesis processes. Software engineering experimentation has expanded considerably over the last few years, bringing with it a number of proposals for supporting the experimental process. However, few of these proposals provide integral support, including the replication and synthesis processes; most focus on experiment execution. This paper proposes an infrastructure providing integral support for the experimental research process, specializing in the replication and synthesis of a family of experiments. The research has been divided into stages or phases, whose transition milestones are marked by the attainment of their goals. Each goal exactly matches an artifact or product. Within each stage, we adopt cycles of successive approximations (generate-and-test cycles), where each approximation includes a different viewpoint or input. Each cycle ends with the approval of the product.
Abstract:
A structure vibrates according to its vibration modes, defined by their modal parameters (natural frequencies, damping ratios and mode shapes). By measuring the vibration at key points of the structure, the modal parameters can be estimated. In civil engineering structures it is difficult to excite the structure in a controlled manner, so techniques involving output-only modal estimation are of vital importance for these structures. These techniques are known as Operational Modal Analysis (OMA). The OMA technique does not need to excite the structure artificially; it considers only its in-service behavior. The motivation for carrying out OMA tests arises in civil engineering because successfully exciting large structures artificially is not only difficult and expensive, but may even damage the structure. The key point is that the global behavior of a structure is directly related to its modal parameters, and any variation of stiffness, mass or support conditions, even if local, is reflected in the modal parameters. Therefore, this identification may be integrated within a Structural Health Monitoring system. The main difficulty in using modal parameters estimated by OMA is the uncertainty associated with the estimation process. There are uncertainties in the value of the modal parameters associated with the computing process (internal) and with the influence of environmental factors (external), such as temperature. This Master's Thesis analyzes these two sources of uncertainty. Firstly, for a laboratory structure, the uncertainties associated with the OMA program used are studied and quantified. Secondly, for an in-service structure (a stress-ribbon footbridge), both the effect of the OMA program and the influence of an environmental factor on the estimation of the modal parameters are studied. More specifically, a method to track the natural frequencies of the same mode has been proposed. This method includes a multiple linear regression model that allows the influence of these external agents to be removed.
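A minimal sketch of the kind of correction described above, assuming a single tracked mode and temperature as the only environmental regressor (all data and coefficients below are synthetic, not taken from the thesis):

```python
import numpy as np

# Simulated daily observations: temperature (degC) and the tracked natural
# frequency (Hz) of one mode. Values are illustrative only.
rng = np.random.default_rng(0)
temperature = rng.uniform(-5, 35, 200)
frequency = 3.20 - 0.002 * temperature + rng.normal(0, 0.005, 200)

# Fit a linear regression f = b0 + b1*T by least squares.
X = np.column_stack([np.ones_like(temperature), temperature])
beta, *_ = np.linalg.lstsq(X, frequency, rcond=None)

# Remove the environmental trend: residual frequency around the intercept.
corrected = frequency - X @ beta + beta[0]
print(beta[1])                            # estimated temperature slope (Hz/degC)
print(corrected.std() < frequency.std())  # variability is reduced after correction
```

With more regressors (humidity, pedestrian load, etc.) the same least-squares fit extends by adding columns to `X`, which is presumably how a multiple linear regression model of this kind would be applied in practice.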
Abstract:
The Bologna Declaration and the implementation of the European Higher Education Area are promoting the use of active learning methodologies. The aim of this study is to evaluate the effects of applying active learning methodologies on the achievement of generic competences as well as on academic performance. This study has been carried out at the Universidad Politécnica de Madrid, where these methodologies have been applied to the Operating Systems I subject of the degree in Technical Engineering in Computer Systems. The fundamental hypothesis tested was whether the implementation of active learning methodologies (cooperative learning and problem-based learning) favours the achievement of certain generic competences ('teamwork' and 'planning and time management') and also whether this improved the academic performance of our students. The original approach of this work consists in using psychometric tests to measure the degree of students' acquisition of generic competences, instead of the usual opinion surveys. Results indicated that active learning methodologies improve academic performance when compared to the traditional lecture/discussion method, according to the success rate obtained. These methods also seem to have an effect on the teamwork competence (the perception of the behaviour of the other members of the group) but not on the perception of each student's own behaviour. Active learning does not produce any significant change in the generic competence 'planning and time management'.
Abstract:
The new degrees in Spanish universities generated as a result of the Bologna process stress a new dimension: the generic competencies to be acquired by university students (leadership, problem solving, respect for the environment, etc.). At Universidad Politécnica de Madrid a teaching model was defined for two degrees: Graduate in Computer Engineering and Graduate in Software Engineering. This model incorporates the training, development and assessment of the generic competencies planned in these curricula. The aim of this paper is to describe how this model was implemented in both degrees. The model has three components. The first refers to a set of seven activities for introducing mechanisms for training, development and assessment of generic competencies. The second component aims to coordinate the actions that implement the competencies across courses (in space and time). The third component consists of a series of activities to perform quality control. The implementation of generic competencies was carried out in first-year courses (first and second semesters), together with the planning for second-year courses (third and fourth semesters). We managed to involve a high percentage of first-year courses (80%), and the contacts that have been initiated suggest a high percentage in the second year as well.
Abstract:
It is essential to remotely and continuously monitor the movements of individuals in many social areas, for example, care of the elderly, physical therapy and athletic training. Many methods have been used, such as video recording, motion analysis and sensor-based methods. Due to limitations in remote communication, power consumption, portability and so on, most of them are not able to fulfill the requirements. The development of wearable technology and cloud computing provides a new, efficient way to achieve this goal. This paper presents an intelligent human movement monitoring system based on a smartwatch, an Android smartphone and a distributed data management engine. This system offers wide adaptability, remote and long-term monitoring capacity, and high portability and flexibility. The structure of the system and its principle are introduced. Four experiments are designed to prove the feasibility of the system. The results of the experiments demonstrate that the system is able to detect different actions of individuals with adequate accuracy.
Abstract:
Dynamic weighing systems based on load cells are commonly used to estimate crop yields in the field. There is a lack of data, however, regarding the accuracy of such weighing systems mounted on harvesting machinery, especially on machinery used to collect high-value crops such as fruits and vegetables. Certainly, dynamic weighing systems mounted on the bins of grape harvesters are affected by the displacement of the load inside the bin when moving over terrain of changing topography. In this work, the load that would be registered in a grape harvester bin by a dynamic weighing system based on a load cell was inferred by using the discrete element method (DEM). DEM is a numerical technique capable of accurately describing the behaviour of granular materials under dynamic situations, and it has been proven to provide successful predictions in many different scenarios. In this work, different DEM models of a grape harvester bin were developed, considering different influencing factors. Results obtained from these models were used to infer the output given by the load cell of a real bin. The mass detected by the load cell when the bin was inclined depended strongly on the distribution of the load within the bin, but was underestimated in all scenarios. The distribution of the load was found to depend on the inclination of the bin caused by the topography of the terrain, but also on the history of inclination (inclination rate, presence of static periods, etc.), since the effect of the inertia of the particles (i.e., representing the grapes) was not negligible. Some recommendations are given to improve the accuracy of crop load measurement in the field.
Abstract:
There are many open issues that must be addressed before the replication process can be successfully formalized in empirical software engineering research. We define replication as the deliberate repetition of the same empirical study for the purpose of determining whether the results of the first experiment can be reproduced. This definition would appear at first glance to be good. However, it requires several clarifications that have not yet been forthcoming in software engineering: What is the exact meaning of "the same empirical study"? Namely, how similar must an experiment be to the baseline study for it to be considered a replication? What is the exact meaning of a result being "reproduced"? Namely, how similar must a result be to the result of the baseline study for it to be considered reproduced? These and other methodological questions need to be researched and tailored for empirical software engineering.
Abstract:
In memoriam: Professor John Munro was born in Glasgow on 8 July 1925 and died on Wednesday, 27 February 1985. He graduated in Civil Engineering from Glasgow University in 1951.
Abstract:
Context: Measurement is crucial to empirical software engineering. Although reliability and validity are two important properties warranting consideration in measurement processes, they may be influenced by random or systematic error (bias) depending on which metric is used. Aim: To check whether the simple subjective metrics used in empirical software engineering studies are prone to bias. Method: Comparison of the reliability of a family of empirical studies on requirements elicitation that explore the same phenomenon using different design types and objective and subjective metrics. Results: The objectively measured variables (experience and knowledge) tend to achieve more reliable results, whereas subjective metrics using Likert scales (expertise and familiarity) tend to be influenced by systematic error or bias. Conclusions: Studies that predominantly use subjectively measured variables, like opinion polls or expert opinion acquisition, are liable to be affected by this bias.
Abstract:
This dissertation analyses the cause of the collapse of the fourth compartment of the Third Reservoir of Canal de Isabel II in Madrid on 8 April 1905, one of the most disastrous accidents in the history of Spanish construction: 30 people died and 60 were injured. The design and construction supervision were carried out by D. José Eugenio Ribera, one of the main figures in civil engineering in our country, whose career could have been destroyed as a result of this accident. Since it occurred more than 100 years ago, the investigation started by compiling information about the structure's design and construction, followed by a review of the available information about the collapse. With regard to the construction, it is interesting to point out its daring structural configuration: a huge area of 74,000 m² was covered by a series of reinforced concrete vaults no more than 5 cm thick, with a 6 m span and a rise of 1/10. In turn, these vaults were supported by frames of the same material, with very slender columns: 0.25 m × 0.25 m in section for a height of 8 m. It is noteworthy that this took place at a time when the technology and knowledge of structures built with this "new" material were largely based on the development of patents. As regards the information about the collapse, what first stands out is the prominence of the experts and lawyers involved in the trial and the subsequent administrative procedure, which reveals the importance the accident had at the time and which, nevertheless, has not come down to the present day. For example, Echegaray, the leading intellectual figure of the time, acted as an expert in Ribera's defence; Melquiades Álvarez, the future president of the Congress, was his defence lawyer; and General Marvá, one of the greatest exponents of the role of military engineers in the introduction of reinforced concrete in our country, chaired the commission appointed by the court to carry out the expert investigation. In addition, the matter caught the interest of renowned international authorities on the "new" material, such as Dr. von Emperger and Hennebique, whose opinions carried great weight. Nonetheless, this structural failure is unknown to most of today's engineers. What is most striking, however, is the lack of agreement on what could have caused the collapse: material defects, construction flaws, errors in the design of the structure, the load tests performed after the structure was finished, etc. The cause that prevailed during the trial and in the subsequent reports was the dilatation of the roof as a consequence of the high temperatures that spring, even though the collapse occurred at 7 in the morning... Based on this information, the structural behaviour of the roof has been analysed, making it possible to assess the role that various factors may have played in the initiation of the collapse and in its extension to the entire built surface, and thus to conclude what the causes of the disaster were. From the results obtained, special attention is paid to the lessons that emerge from the collapse, emphasising the relevance of history, and in particular of historical cases of failure, to the continuous education that should exist in engineering. In the case of the Third Reservoir, some of these lessons are still fully relevant today, such as the importance of construction details for the "robustness" of structures, the design of "integral" structures, and the supervision of the construction process. Finally, the investigation has served to recover, once again, the figure of D. José Eugenio Ribera, whose role in the introduction of reinforced concrete in Spain was decisive. In the Third Reservoir he took too much risk and caused a disaster that accelerated the transition to a new era in structural concrete, backed by greater scientific knowledge and the first codes. In this new period he would also play a major role.
Abstract:
In a Finite Element (FE) analysis of elastic solids several items are usually considered, namely, the type and shape of the elements, the number of nodes per element, the node positions, the FE mesh and the total number of degrees of freedom (dof), among others. In this paper a method to improve a given FE mesh used for a particular analysis is described. For the improvement criterion, different objective functions have been chosen (total potential energy and average quadratic error), while the number of nodes and dof of the new mesh remain constant and equal to those of the initial FE mesh. In order to find the mesh producing the minimum of the selected objective function, the steepest descent gradient technique has been applied as the optimization algorithm. However, this efficient technique has the drawback of demanding large computational power. Extensive application of this methodology to different 2-D elasticity problems leads to the conclusion that isometric isostatic meshes (ii-meshes) produce better results than the reasonable initial regular meshes normally used in practice. This conclusion seems to be independent of the objective function used for comparison. These ii-meshes are obtained by placing FE nodes along the isostatic lines, i.e. curves tangent at each point to the principal direction lines of the elastic problem to be solved, regularly spaced so as to build regular elements. This means that ii-meshes are usually obtained by iteration: the elastic analysis is first carried out with the initial FE mesh; using the results of this analysis, the net of isostatic lines can be drawn and a first ii-mesh can be built. This first ii-mesh can be improved, if necessary, by analysing the problem again and generating a new, improved ii-mesh after the FE analysis. Typically, two tentative ii-meshes are sufficient to produce good FE results from the elastic analysis. Several examples of this procedure are presented.
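The steepest-descent idea can be illustrated with a toy 1-D analogue (not the paper's 2-D elasticity setting): the number of nodes stays fixed while interior node positions are moved downhill on an average-quadratic-error objective, with the gradient estimated by central differences. The target field `f` below stands in for the elastic solution and is purely illustrative.

```python
import numpy as np

f = lambda x: np.sin(3 * x)            # illustrative field to approximate
xs = np.linspace(0.0, 1.0, 2001)       # dense quadrature points on [0, 1]

def objective(interior):
    """Average quadratic error of the piecewise-linear interpolant of f
    on the mesh {0, sorted interior nodes, 1}."""
    nodes = np.concatenate(([0.0], np.sort(interior), [1.0]))
    approx = np.interp(xs, nodes, f(nodes))
    return np.mean((f(xs) - approx) ** 2)

regular = np.linspace(0.0, 1.0, 8)[1:-1]   # interior nodes of a regular mesh
nodes = regular.copy()
step, eps = 0.05, 1e-4
for _ in range(200):                        # steepest-descent iterations
    grad = np.array([
        (objective(nodes + eps * np.eye(len(nodes))[i])
         - objective(nodes - eps * np.eye(len(nodes))[i])) / (2 * eps)
        for i in range(len(nodes))])
    nodes = np.clip(nodes - step * grad, 1e-3, 1.0 - 1e-3)

print(objective(regular))   # error of the initial regular mesh
print(objective(nodes))     # error of the improved mesh (smaller)
```

In the paper's setting the objective is evaluated through a full FE solve per perturbation, which is why the authors note the large computational cost of this otherwise simple technique.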
Abstract:
A protein semisynthesis method—expressed protein ligation—is described that involves the chemoselective addition of a peptide to a recombinant protein. This method was used to ligate a phosphotyrosine peptide to the C terminus of the protein tyrosine kinase C-terminal Src kinase (Csk). By intercepting a thioester generated in the recombinant protein with an N-terminal cysteine-containing synthetic peptide, near-quantitative chemical ligation of the peptide to the protein was achieved. The semisynthetic tail-phosphorylated Csk showed evidence of an intramolecular phosphotyrosine-Src homology 2 interaction and an unexpected increase in catalytic phosphoryl transfer efficiency toward a physiologically relevant substrate compared with the non-tail-phosphorylated control. This work illustrates that expressed protein ligation is a simple and powerful new method in protein engineering to introduce sequences of unnatural amino acids, posttranslational modifications, and biophysical probes into proteins of any size.
Abstract:
Oral presentation at SPIE Photonics Europe, Brussels, 16-19 April 2012.
Abstract:
An empirical model based on constant flux is presented for chloride transport through concrete under atmospheric exposure conditions. A continuous supply of chlorides is assumed as a constant mass flux at the exposed concrete surface. The model is applied to experimental chloride profiles obtained from a real marine structure, and the results are compared with those of the classical error-function model. The proposed model shows some advantages: it yields a better predictive capacity than the classical error-function model, and the previously observed increases in chloride surface concentration are compatible with it. Nevertheless, the predictive capacity of the model can fail if the concrete microstructure changes with time. The model seems appropriate for well-matured concretes exposed to a marine environment under atmospheric conditions.
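For reference, the classical error-function model mentioned above (the baseline the abstract compares against) takes the form C(x, t) = Cs · erfc(x / (2·sqrt(D·t))). A minimal sketch, with illustrative values of Cs and D that are not taken from the study:

```python
from math import erfc, sqrt

def chloride_profile(x_mm, years, Cs=0.6, D_mm2_per_year=20.0):
    """Chloride content (e.g. % by binder weight) at depth x_mm after
    `years` of exposure, using the classical error-function solution of
    Fick's second law with constant surface concentration Cs.
    Cs and D below are illustrative assumptions, not measured values."""
    return Cs * erfc(x_mm / (2.0 * sqrt(D_mm2_per_year * years)))

print(round(chloride_profile(0.0, 10), 3))                       # equals Cs at the surface
print(chloride_profile(10.0, 10) > chloride_profile(30.0, 10))   # content decreases with depth
```

The constant-flux model proposed in the paper replaces the constant-surface-concentration boundary condition with a constant mass flux at the surface, which is why it can accommodate the observed growth of surface concentration over time.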
Abstract:
In this study, the behavior of bituminous mixes made with sewage sludge ash (SSA) as mineral filler was investigated. The behavior of these mixes was evaluated with the Cantabro, indirect tensile strength, water sensitivity, permanent deformation, and resilient modulus tests. The results show that SSA waste may be used in bituminous mixes at approximately 2-3% by weight while maintaining adequate levels of cohesion and adhesion in the mixtures, comparable to mixtures made with active fillers such as hydrated lime and cement. Moreover, its use does not increase permanent deformations. However, the resilient modulus test gave slightly lower results for mixes made with SSA than for mixtures made with other fillers. It may be concluded that SSA waste may be used as a filler for bituminous mixes, with better results than mixes made with limestone filler and with similar results to mixes made with other fillers such as hydrated lime and cement.