421 results for initialisation flaws


Relevance:

10.00%

Publisher:

Abstract:

Much of the knowledge about software systems is implicit, and therefore difficult to recover by purely automated techniques. Architectural layers and the externally visible features of software systems are two examples of information that can be difficult to detect from source code alone, and that would benefit from additional human knowledge. Typical approaches to reasoning about data involve encoding an explicit meta-model and expressing analyses at that level. Due to its informal nature, however, human knowledge can be difficult to characterize up-front and integrate into such a meta-model. We propose a generic, annotation-based approach to capture such knowledge during the reverse engineering process. Annotation types can be iteratively defined, refined and transformed, without requiring a fixed meta-model to be defined in advance. We show how our approach supports reverse engineering by implementing it in a tool called Metanool and by applying it to (i) analyzing architectural layering, (ii) tracking reengineering tasks, (iii) detecting design flaws, and (iv) analyzing features.
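
Metanool's concrete API is not given in the abstract; the minimal Python sketch below only illustrates the general idea of annotation types that can be defined and refined iteratively, without a fixed meta-model. All names are hypothetical, not the tool's actual interface.

# Hypothetical sketch of annotation-based knowledge capture (names are
# illustrative, not Metanool's API): annotation types are plain runtime
# objects, so they can be defined, refined and transformed during the
# reverse engineering process itself.

class AnnotationType:
    def __init__(self, name, fields):
        self.name = name
        self.fields = fields          # field name -> expected type (informal)

    def refine(self, **new_fields):
        """Extend the type in place; existing annotations survive."""
        self.fields.update(new_fields)

class Annotation:
    def __init__(self, atype, **values):
        self.atype = atype
        self.values = values

annotations = {}                      # entity (e.g. class name) -> [Annotation]

def annotate(entity, atype, **values):
    annotations.setdefault(entity, []).append(Annotation(atype, **values))

# A reverse engineer tags classes with an architectural layer...
Layer = AnnotationType("Layer", {"name": str})
annotate("PaymentService", Layer, name="domain")

# ...and later refines the type with a confidence field, without
# invalidating annotations created before the refinement.
Layer.refine(confidence=float)
annotate("HttpGateway", Layer, name="infrastructure", confidence=0.8)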

Relevance:

10.00%

Publisher:

Abstract:

Temporal data are a core element of a reservation. In this paper we formulate 10 requirements and 14 sub-requirements for handling temporal data in online hotel reservation systems (OHRS) from a usability viewpoint. We test the fulfillment of these requirements for city and resort hotels in Austria and Switzerland. Some of the requirements are widely met; however, many requirements are fulfilled only by a surprisingly small number of hotels. In particular, numerous systems offer options for selecting data which lead to error messages in the next step. A few screenshots illustrate flaws of the systems. We also draw conclusions on the state of applying software engineering principles in the development of Web pages.
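
The paper itself contains no code; the sketch below only illustrates, under assumed rules, the usability requirement implied above: offer the user only temporal data that cannot produce an error message in the next step. The function names and the 30-night limit are invented for illustration.

# Constructive date selection: generate only valid check-out options for
# a given check-in, rather than validating after submission.

from datetime import date, timedelta

def selectable_checkout_dates(checkin, max_stay_nights=30):
    """Offer only check-out dates that form a valid stay."""
    return [checkin + timedelta(days=n) for n in range(1, max_stay_nights + 1)]

def validate_stay(checkin, checkout, today=None):
    """Server-side guard; with the constructive UI above, the second
    check can never be triggered by a selectable option."""
    today = today or date.today()
    if checkin < today:
        raise ValueError("check-in date lies in the past")
    if checkout <= checkin:
        raise ValueError("check-out must be after check-in")

options = selectable_checkout_dates(date(2025, 7, 1))
validate_stay(date(2025, 7, 1), options[0], today=date(2025, 6, 1))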

Relevance:

10.00%

Publisher:

Abstract:

The UNESCO Convention on cultural diversity marks a wilful separation between the issues of trade and culture at the international level. The present article explores this intensified institutional, policy- and decision-making disconnect and exposes its flaws and the considerable drawbacks it brings with it. These drawbacks, the article argues, become particularly pronounced in the digital media environment, which has had an impact both on the conditions of trade in cultural products and services and on the diversity of cultural expressions in local and global contexts. Criticising the strong and now increasingly meaningless path dependencies of the analogue age, the article sketches some possible ways of reconciling trade and culture, most of which lead back to the WTO, rather than to UNESCO.

Relevance:

10.00%

Publisher:

Abstract:

This article provides an overview of the most essential issues in the trade and culture discourse from a global law perspective. It looks into the intensified disconnect between trade and culture and exposes its flaws and the considerable drawbacks that it brings with it. It is argued that these drawbacks become especially pronounced in the digital media environment, which has strongly affected both the conditions of trade in cultural products and services and cultural diversity in local and global contexts. In this modified setting, there could have been a number of feasible 'trade and culture' solutions, i.e. regulatory designs that, whilst enhancing trade liberalisation, are also conducive to cultural policy. Yet the realisation of any of these options becomes chimerical as the line between trade and culture matters is drawn in a clear and resolute manner. The article is meant for an interdisciplinary audience and is forthcoming in the Journal of Arts Management, Law and Society.

Relevance:

10.00%

Publisher:

Abstract:

The clinical diagnosis 'erosion' is made from characteristic deviations from the original anatomical tooth morphology, thus distinguishing acid-induced tissue loss from other forms of wear. Primary pathognomonic features are shallow concavities on smooth surfaces occurring coronal to the enamel-cementum junction. Problems in diagnosing occlusal surfaces and exposed dentine are discussed. Indices for recording erosive wear include morphological as well as quantitative criteria. Currently, various indices are in use, each with its virtues and flaws, which makes the comparison of prevalence studies difficult. The Basic Erosive Wear Examination (BEWE) is described, which is intended to provide an easy tool for research as well as for use in general dental practice. The cumulative score of this index is the sum of the most severe scores obtained from all sextants and is linked to suggestions for clinical management. In addition to recording erosive lesions, the assessment of progression is important, as the indication for treatment measures depends on erosion activity. A number of evaluated and sensitive methods for in vitro and in situ approaches are available, but the fundamental problem for their clinical use is the lack of re-identifiable reference areas. Tools for clinical monitoring are described.
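
As a worked example of the scoring rule just described, here is a small Python sketch. The per-surface grade range (0-3) and the risk bands in risk_level follow commonly cited BEWE guidance and should be checked against the original BEWE publication.

# BEWE cumulative score: each of the six sextants contributes its most
# severe (maximum) per-surface grade; the cumulative score is their sum.

def bewe_cumulative(sextant_scores):
    """sextant_scores: list of six lists of per-surface grades (0-3)."""
    if len(sextant_scores) != 6:
        raise ValueError("BEWE requires exactly six sextants")
    return sum(max(grades) for grades in sextant_scores)

def risk_level(cumulative):
    # guidance bands as commonly cited; verify against the source paper
    if cumulative <= 2:
        return "none"
    if cumulative <= 8:
        return "low"
    if cumulative <= 13:
        return "medium"
    return "high"

score = bewe_cumulative([[0, 1], [2], [1, 1], [3, 2], [0], [1]])
print(score, risk_level(score))   # -> 8 low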

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND Cardiac and thoracic surgery are associated with an increased risk of venous thromboembolism (VTE). The safety and efficacy of primary thromboprophylaxis in patients undergoing these types of surgery is uncertain. OBJECTIVES To assess the effects of primary thromboprophylaxis on the incidence of symptomatic VTE and major bleeding in patients undergoing cardiac or thoracic surgery. SEARCH METHODS The Cochrane Peripheral Vascular Diseases Group Trials Search Co-ordinator searched the Specialised Register (last searched May 2014) and CENTRAL (2014, Issue 4). The authors searched the reference lists of relevant studies, conference proceedings, and clinical trial registries. SELECTION CRITERIA Randomised controlled trials (RCTs) and quasi-RCTs comparing any oral or parenteral anticoagulant or mechanical intervention to no intervention or placebo, or comparing two different anticoagulants. DATA COLLECTION AND ANALYSIS We extracted data on methodological quality, participant characteristics, interventions, and outcomes, with symptomatic VTE and major bleeding as the primary effectiveness and safety outcomes, respectively. MAIN RESULTS We identified 12 RCTs and one quasi-RCT (6923 participants), six in cardiac surgery (3359 participants) and seven in thoracic surgery (3564 participants). No study evaluated fondaparinux, the new oral direct thrombin or factor Xa inhibitors, or caval filters. All studies had major design flaws and most lacked a placebo or no-treatment control group. We generally graded the quality of the overall body of evidence for the various outcomes and comparisons as low, owing to imprecise estimates of effect and risk of bias. We could not pool data because of the differing comparisons and the lack of data. In cardiac surgery, 71 symptomatic VTEs occurred in 3040 participants from four studies. In a study of 2551 participants, representing 85% of the review population in cardiac surgery, the combination of unfractionated heparin with pneumatic compression stockings was associated with a 61% reduction of symptomatic VTE compared to unfractionated heparin alone (1.5% versus 4.0%; risk ratio (RR) 0.39; 95% confidence interval (CI) 0.23 to 0.64). Major bleeding was reported in only one study, which found a higher incidence with vitamin K antagonists compared to platelet inhibitors (11.3% versus 1.6%; RR 7.06; 95% CI 1.64 to 30.40). In thoracic surgery, 15 symptomatic VTEs occurred in 2890 participants from six studies. In the largest study, evaluating unfractionated heparin versus an inactive control, the rates of symptomatic VTE were 0.7% versus 0%, respectively, giving an RR of 6.71 (95% CI 0.40 to 112.65). There was insufficient evidence to determine whether there was a difference in the risk of major bleeding in two studies evaluating fixed-dose versus weight-adjusted low molecular weight heparin (2.7% versus 8.1%; RR 0.33; 95% CI 0.07 to 1.60) and unfractionated heparin versus low molecular weight heparin (6% versus 4%; RR 1.50; 95% CI 0.26 to 8.60). AUTHORS' CONCLUSIONS The evidence regarding the efficacy and safety of thromboprophylaxis in cardiac and thoracic surgery is limited. Data for important outcomes such as pulmonary embolism or major bleeding were often lacking. Given the uncertainties around the benefit-to-risk balance, no conclusions can be drawn and a case-by-case risk evaluation of VTE and bleeding remains preferable.
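
To make the reported effect measures concrete, the sketch below shows how a risk ratio and its 95% CI follow from arm-level counts via the usual log-RR normal approximation. The counts are illustrative values chosen to reproduce approximately the 1.5% versus 4.0% arms quoted above, not the trial's actual per-arm numbers.

# Risk ratio with a 95% CI from 2x2 trial counts (log-RR approximation).

import math

def risk_ratio(a, n1, c, n2):
    """a/n1 events in the treatment arm, c/n2 in the control arm."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)   # SE of log(RR)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# illustrative counts only: ~1.5% vs ~4.0% in two arms of ~1275 each
rr, lo, hi = risk_ratio(19, 1275, 51, 1276)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")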

Relevance:

10.00%

Publisher:

Abstract:

Biotechnology refers to the broad set of techniques that allow genetic manipulation of organisms. The techniques of biotechnology have broad implications for many industries; however, biotechnology promises the greatest innovations in the production of products regulated by the Food and Drug Administration (FDA). Like many other powerful new technologies, biotechnology may carry risks as well as benefits. Several of its applications have engendered fervent emotional reactions and raised serious ethical concerns, especially internationally. First, in my paper I discuss the historical and technical background of biotechnology. Second, I examine the development of biotechnology in Europe, citizens' responses to genetically modified ("GM") foods and the governments' response. Third, I examine the regulation of bioengineered products and foods in the United States. In conclusion, there are various problems with the current status of the regulation of GM foods in the United States, stemming from four basic flaws: (1) the Coordinated Framework allows too much jurisdictional overlap over biotechnological foods; (2) GM foods are considered GRAS and are consequently placed on the market without pre-market approval; (3) federal mandatory labeling of GM foods cannot occur until it is settled whether nondisclosure of a genetic engineering production process is misleading or material information; and (4) an independent state labeling scheme for GM foods would most likely impede interstate commerce.

Relevance:

10.00%

Publisher:

Abstract:

In December 1980, following increasing congressional and constituent interest in problems associated with hazardous waste, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) was passed. During its development, the legislative initiative was seriously compromised, which resulted in a less exhaustive approach than had formerly been sought. Still, CERCLA (Superfund), which established, among other things, the authority to clean up abandoned waste dumps and to respond to emergencies caused by releases of hazardous substances, was welcomed by many as an important initial law critical to the cleanup of the nation's hazardous waste. Expectations raised by the passage of this bill went tragically unmet. By the end of four years, only six sites had been declared clean by the EPA. Seemingly, even those determinations were liberal: of the six sites, two were subsequently identified as requiring further cleanup. This analysis focuses on the implementation failure of the Superfund. In light of that focus, the discussion develops linkages between flaws in the legislative language and the foreclosure of chances for implementation success. Such linkages are specified through examination of the legislative initiative, identification of its flaws, and characterization of the attendant deficits in implementation ability. The subsequent analysis addresses how such legislative frailties might have been avoided, and the attendant regulatory weaknesses that have contributed to implementation failure. Each of these analyses is accomplished through the application of an expanded approach to the backward-mapping analytic technique as presented by Elmore. Results and recommendations follow. Consideration is devoted to a variety of regulatory issues as well as to those pertinent to legislative and implementation analysis. Problems in assessing legal liability associated with hazardous waste management are presented, as is a detailed review of the legislative development of Superfund and its initial implementation by Gorsuch's EPA.

Relevance:

10.00%

Publisher:

Abstract:

The process of State reform experienced in Argentina in the 1990s meant, for institutions and citizens alike, a profound change in the relations that had existed until then between society and the State. The main instruments of this process, economic deregulation, the privatization of services and the decentralization of functions, together amounted to a substantial change in the regulatory role of the State, which ceased to be the provider and the corrector of market failures and became (actually or potentially) the guarantor of their proper functioning. However, in the case of the transfer of public services to the private sector in Argentina, since the State had previously been their monopoly provider, a vacuum was left that demanded of it not deregulation but a new regulatory role with respect to the providers and the users of those services. This paper describes the economic and legal approaches from which the regulatory function of the State is framed, on the basis of the set of laws, decrees, resolutions and regulations of the national government that brought about important changes in the role assigned to the State by the government in office during the last decade.

Relevance:

10.00%

Publisher:

Abstract:

Melt pond covered sea ice is a ubiquitous feature of the summertime Arctic Ocean when meltwater collects in lower-lying areas of ice surfaces. Horizontal transects were conducted during June 2008 above and below landfast sea ice with melt ponds to characterize surface and bottom topography together with variations in transmitted spectral irradiance. We captured a rapid progression from a highly flooded sea ice surface with lateral drainage toward flaws and seal breathing holes to the formation of distinct melt ponds with steep edges. As the mass of the ice cover decreased due to meltwater drainage and rose upward with respect to the seawater level, the high-scattering properties of ice above the water level (i.e., white ice) were continuously regenerated, while pond waters remained transparent compared to underlying ice. The relatively stable albedos observed throughout the study, even as ice thickness decreased, were directly related to these surface processes. Transmission through the ice cover of incident irradiance in the 400-700 nm wave band ranged from 38% to 67% and from 5% to 16% beneath ponded and white ice, respectively. Our results show that this transmission varied not only as a function of surface type (melt ponds or white ice) areal coverage but also in relation to ice thickness and proximity to other surface types through the influence of horizontal spreading of light. Thus, in contrast to albedo, this implies that regional transmittance estimates need to consider melt pond size and shape distributions and variations in optical properties and thickness of the ice cover.
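
As a first-order illustration of the areal-coverage argument, the sketch below computes a regional transmittance as an area-weighted mean of the two surface types, using mid-range values from the figures quoted above. The paper's point is precisely that such a weighting ignores pond size and shape and the lateral spreading of light, so this is a baseline estimate, not the full picture.

# Area-weighted regional transmittance of 400-700 nm irradiance,
# ignoring horizontal photon transport between surface types.

T_POND = 0.52    # mid-range of the 38-67% reported beneath ponded ice
T_WHITE = 0.105  # mid-range of the 5-16% reported beneath white ice

def regional_transmittance(pond_fraction):
    """Fraction of incident irradiance transmitted through the ice cover."""
    return pond_fraction * T_POND + (1 - pond_fraction) * T_WHITE

for f in (0.1, 0.3, 0.5):
    print(f"pond fraction {f:.0%}: T = {regional_transmittance(f):.1%}")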

Relevance:

10.00%

Publisher:

Abstract:

The extraordinary growth of new information technologies, the development of the Internet, electronic commerce, e-government, mobile telephony, and future cloud computing and storage has provided great benefits in all areas of society. Alongside them come new challenges for the protection of information, such as identity theft and the loss of confidentiality and integrity of electronic documents. Cryptography plays a key role here, providing the tools needed to ensure the security of these new media, but it is imperative to intensify research in this area to meet the growing demand for new, secure cryptographic techniques. The theory of nonlinear chaotic dynamical systems together with the theory of cryptography gives rise to chaotic cryptography, the field of study of this thesis.

The link between cryptography and chaotic systems is still the subject of intense study. The combination of apparently stochastic behaviour, sensitivity to initial conditions and parameters, ergodicity, mixing, and the density of periodic points makes chaotic orbits resemble random sequences, suggesting their potential use for masking messages. This fact, together with the possibility of synchronizing several chaotic systems, first described by Pecora and Carroll, has generated an avalanche of research papers relating cryptography and chaos.

Chaotic cryptography follows two fundamental design paradigms. In the first, chaotic cryptosystems are designed in continuous time, mainly on the basis of chaotic synchronization techniques, and are implemented with analog circuits or by computer simulation. In the second, chaotic cryptosystems are constructed in discrete time and generally do not depend on chaos synchronization. The contributions of this thesis concern three aspects of chaotic cryptography: a theoretical analysis of the geometric properties of some of the chaotic attractors most frequently employed in the design of chaotic cryptosystems; the cryptanalysis of continuous chaotic cryptosystems; and three new designs of cryptographically secure chaotic pseudorandom generators. The main accomplishments are the following.

A method was developed for determining the parameters of some double-scroll chaotic systems, including the Lorenz system and Chua's circuit. First, geometric characteristics of the chaotic system are used to reduce the search space of the parameters; next, a scheme based on the synchronization of chaotic systems is built, with the geometric properties serving as the matching criterion to determine the parameter values to the desired accuracy. The method is not affected by a moderate amount of noise in the waveform, and it has been applied to find security flaws in continuous chaotic encryption systems. On this basis, the chaotic ciphers proposed by Wang and Bu and those proposed by Xu and Li are cryptanalyzed. Some improvements to these cryptosystems are proposed, although of very limited scope, because such systems are not suitable for use in cryptography.

A related method was developed for determining the parameters of the Lorenz system when it is used in the design of a two-channel cryptosystem. It again exploits the geometric properties of the Lorenz system to reduce the parameter search space, after which the parameters are accurately determined from the ciphertext. The method has been applied to the cryptanalysis of an encryption scheme proposed by Jiang. In addition, the chaotic encryption system proposed by Gunay et al. in 2005, based on a cellular neural network implementation of Chua's circuit, has been cryptanalyzed and several gaps in its security design identified. Overall, this critical analysis concludes that the great majority of continuous chaotic encryption algorithms, whether implemented physically or simulated numerically, are insecure and inefficient at protecting the confidentiality of information, that a large proportion of the proposed discrete chaotic cryptosystems are also insecure, and that others have not yet been attacked, so further cryptanalytic work is needed; the main weaknesses found and some recommendations for improvement are summarized.

Based on the theoretical results on digital chaotic systems and on the cryptanalysis of several recently proposed chaotic ciphers, a family of pseudorandom generators working in finite precision has been designed, built on the coupling of several piecewise linear chaotic maps. From these results a new family of chaotic pseudorandom generators, named Trident, has been designed to meet the needs of real-time encryption on mobile devices. The thesis then proposes a further family of pseudorandom generators, called Trifork, based on a combination of perturbed lagged Fibonacci generators. Statistical tests corroborate the randomness properties of both families, which are cryptographically secure and suitable for real-time encryption and for the development of stream ciphers; detailed analysis shows that the proposed generators provide fast encryption speed and a high level of security at the same time.

An important issue in the design of discrete chaotic ciphers is the dynamical degradation caused by finite precision, an aspect that most designers have not considered seriously. This thesis stresses the importance of the question and contributes some initial clarifications. Since the theoretical issues surrounding the dynamical degradation of digital chaotic systems have not been fully resolved, practical remedies are used instead: several techniques, such as bit-rotation and bit-shift operations combined with dynamic parameter variation and cross-perturbation, are proposed and evaluated, and they provide an excellent remedy to the degradation problem. Finally, many chaotic cryptosystems are broken because of careless design rather than essential defects of digital chaotic systems, a fact that has been taken into account throughout the design of the cryptographically secure chaotic pseudorandom generators presented here.
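
The actual Trident and Trifork constructions are not specified in this abstract; the minimal sketch below only illustrates the core mechanism described above: coupled piecewise linear chaotic maps in finite precision, with bit-rotation cross-perturbation as a remedy for dynamical degradation. The map choice, rotation amounts, constants and output function are all hypothetical, not the published designs.

# Illustrative keystream generator: two skew tent maps (piecewise linear
# chaotic maps) iterated in 32-bit fixed precision and coupled by
# cross-perturbation with bit rotations.

PREC = 32
MASK = (1 << PREC) - 1

def skew_tent(x, p):
    """Skew tent map on [0, 2^PREC) with integer breakpoint 0 < p < MASK."""
    if x < p:
        return ((x << PREC) // p) & MASK
    return (((MASK - x) << PREC) // (MASK - p)) & MASK

def rotl(x, r):
    """Rotate a PREC-bit integer left by r bits."""
    r %= PREC
    return ((x << r) | (x >> (PREC - r))) & MASK

def keystream(seed1, seed2, p1, p2, nbytes):
    """Emit nbytes of keystream from two cross-perturbed chaotic maps."""
    x, y = seed1 & MASK, seed2 & MASK
    out = bytearray()
    for _ in range(nbytes):
        x = skew_tent(x, p1)
        y = skew_tent(y, p2)
        x ^= rotl(y, 5)    # cross-perturbation breaks the short cycles
        y ^= rotl(x, 11)   # introduced by finite-precision arithmetic
        out.append((x ^ y) & 0xFF)
    return bytes(out)

print(keystream(0x12345678, 0x9ABCDEF0, 0x76543210, 0x0FEDCBA9, 16).hex())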

Relevance:

10.00%

Publisher:

Abstract:

Very recently (Banerjee et al. in Astrophys. Space Sci., doi:10.1007/s10509-011-0836-1, 2011) the statistics of the geomagnetic disturbance storm time (Dst) index have been addressed, and the conclusion from that analysis suggests that the underlying dynamical process can be modeled as a fractional Brownian motion with persistent long-range correlations. In this comment we expose several misconceptions and flaws in the statistical analysis of that work. On the basis of these arguments, the former conclusion should be revisited.
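
The dispute concerns whether the increments of the Dst series really show persistent long-range correlations (a Hurst exponent H above 0.5). For background, below is a minimal rescaled-range (R/S) sketch of how H is commonly estimated; it is a generic estimator, not the method of either paper, and a real analysis requires exactly the methodological care this comment calls for (stationarity checks, multiple estimators, significance testing).

# Rescaled-range (R/S) estimate of the Hurst exponent of an increment series.

import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate H of a (presumed stationary) series x from the R/S slope."""
    x = np.asarray(x, dtype=float)
    sizes, rs = [], []
    n = min_chunk
    while n <= len(x) // 2:
        vals = []
        for i in range(0, len(x) - n + 1, n):
            c = x[i:i + n]
            z = np.cumsum(c - c.mean())      # cumulative deviation profile
            r, s = z.max() - z.min(), c.std()
            if s > 0:
                vals.append(r / s)
        sizes.append(n)
        rs.append(np.mean(vals))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
# near 0.5 for uncorrelated noise (R/S is biased slightly upward)
print(hurst_rs(rng.normal(size=4096)))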

Relevance:

10.00%

Publisher:

Abstract:

Autonomous mobile robots and remotely operated mobile robots are currently used with success in a large number of fields, some as diverse as home cleaning, the movement of goods in warehouses, and space exploration. However, it is difficult to guarantee the absence of defects in the programs that control these devices, as in other sectors of computing. There are different ways of measuring the quality of a system in the performance of the functions for which it was designed, one of them being reliability. In most physical systems a degradation of reliability is observed as the system ages, generally due to wear. In software systems this does not usually happen: the defects they contain have generally not been acquired with the passage of time, but were inserted during development. If, within the process of producing a software system, attention is focused on the coding stage, one can conceive a study that tries to determine the reliability of different algorithms, all valid for the same task, according to the defects that programmers may introduce. Such a study could have several applications, for example choosing the algorithm least sensitive to defects for the development of a critical system, or establishing more demanding verification and validation procedures when an algorithm with high sensitivity to defects must be used.

This research studies the influence that certain types of software defects have on the reliability of three multivariable speed controllers (PID, Fuzzy and LQR) acting on a specific mobile robot. The hypothesis is that the controllers studied offer different reliability when affected by similar defect patterns, and it has been confirmed by the results obtained. As regards experimental planning, the tests needed to determine whether controllers of the same family (PID, Fuzzy or LQR) offered similar reliability under the same experimental conditions were conducted first. Once this was confirmed, a class representative of each controller family was chosen at random for a more exhaustive battery of tests, in order to obtain data allowing a more complete comparison of the reliability of the controllers under study. Since it was impossible to perform a large number of tests with a real robot, and to avoid damaging a device that generally has a significant cost, it was necessary to build a multicomputer simulator of the robot. This simulator was used both to obtain well-tuned controllers and to carry out the various tests required for the reliability experiment.
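
The thesis's experimental infrastructure (the multicomputer simulator and the actual controllers) is not reproduced here; the sketch below only illustrates the core idea under simplifying assumptions: inject simple code-level defects into a PID speed controller driving a crude first-order plant, and estimate reliability as the fraction of runs whose final tracking error stays within tolerance. All gains, defect types and thresholds are invented for illustration.

# Defect injection into a PID controller and a Monte Carlo reliability
# estimate: reliability = fraction of runs meeting the tracking spec.

import random

def make_pid(kp, ki, kd, defect=None):
    state = {"i": 0.0, "prev": 0.0}
    def pid(error, dt):
        state["i"] += error * dt
        d = (error - state["prev"]) / dt
        state["prev"] = error
        out = kp * error + ki * state["i"] + kd * d
        if defect == "sign_flip":      # e.g. '-' typed instead of '+'
            out = kp * error - ki * state["i"] + kd * d
        elif defect == "wrong_const":  # e.g. a mistyped gain constant
            out *= 1.1
        return out
    return pid

def run(controller, target=1.0, steps=200, dt=0.05):
    v = 0.0
    for _ in range(steps):
        v += controller(target - v, dt) * dt   # crude first-order plant
    return abs(target - v)                     # final tracking error

def reliability(defect, trials=100, tol=0.05):
    ok = 0
    for _ in range(trials):
        kp = random.uniform(0.8, 1.2)          # tuning tolerance per run
        ok += run(make_pid(kp, 0.5, 0.1, defect)) < tol
    return ok / trials

for d in (None, "wrong_const", "sign_flip"):
    print(d, reliability(d))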

Relevance:

10.00%

Publisher:

Abstract:

The paper presents some preliminary results of ongoing research intended to qualify a high-strength duplex stainless steel wire as prestressing steel, and gives insight into the wires' fracture micromechanism and residual stress field. SEM fractographic analysis of the stainless steel wires indicates anisotropic fracture behaviour in tension, in the presence of surface flaws, attributed to the residual stresses generated by the fabrication process. The magnitude of the residual stresses influences the damage tolerance, so its knowledge is a key issue in designating/qualifying the wires as prestressing steels.
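
As a first-order illustration of why the residual stress magnitude governs damage tolerance, the sketch below superposes the applied and surface residual stresses in the mode-I stress intensity factor K_I = Y * (sigma + sigma_res) * sqrt(pi * a) for a shallow surface flaw. The geometry factor, flaw depth and stress values are illustrative assumptions, not measurements from this research.

# Superposition of applied and residual stress in K_I for a surface flaw.

import math

def k_mode_one(sigma_mpa, sigma_res_mpa, a_m, Y=0.67):
    """K_I in MPa*sqrt(m) for a surface flaw of depth a_m (metres);
    Y is an assumed geometry factor for a shallow semicircular flaw."""
    return Y * (sigma_mpa + sigma_res_mpa) * math.sqrt(math.pi * a_m)

a = 100e-6                              # 100 micrometre surface flaw
for s_res in (-200.0, 0.0, 200.0):      # compressive, none, tensile residual
    k = k_mode_one(1300.0, s_res, a)    # 1300 MPa applied stress (assumed)
    print(f"sigma_res = {s_res:+.0f} MPa -> K_I = {k:.1f} MPa*sqrt(m)")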
