908 results for Verification and validation technology


Relevance:

100.00%

Publisher:

Abstract:

To assist cattle producers in the transition from microsatellite (MS) to single nucleotide polymorphism (SNP) genotyping for parental verification, we previously devised an effective and inexpensive method to impute MS alleles from SNP haplotypes. While the reported method was verified with only a limited data set (N = 479) from Brown Swiss, Guernsey, Holstein, and Jersey cattle, some of the MS-SNP haplotype associations were concordant across these phylogenetically diverse breeds. This implied that some haplotypes predate modern breed formation and remain in strong linkage disequilibrium. To expand the utility of MS allele imputation across breeds, MS and SNP data from more than 8000 animals representing 39 breeds (Bos taurus and B. indicus) were used to predict 9410 SNP haplotypes, incorporating an average of 73 SNPs per haplotype, for which alleles from 12 MS markers could be accurately imputed. Approximately 25% of the MS-SNP haplotypes were present in multiple breeds (N = 2 to 36 breeds). These shared haplotypes allowed for MS imputation in breeds that were not represented in the reference population, with only a small increase in Mendelian inheritance inconsistencies. Our reported reference haplotypes can be used for any cattle breed, and the reported methods can be applied to any species to aid the transition from MS to SNP genetic markers. While ~91% of the animals with imputed alleles for 12 MS markers had ≤1 Mendelian inheritance conflict with their parents' reported MS genotypes, this figure was 96% for our reference animals, indicating potential errors in the reported MS genotypes. The suggested workflow corrects for genotyping errors and rare haplotypes by MS genotyping animals whose imputed MS alleles fail parentage verification and then incorporating those animals into the reference dataset.
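
A minimal sketch of the two ideas in this abstract: imputing MS alleles from SNP haplotypes via a reference lookup, and flagging parentage conflicts with Mendelian inheritance checks. The haplotype keys and allele sizes below are toy values (real reference haplotypes average ~73 SNPs), not the study's data.

```python
# Reference table: SNP haplotype -> imputed microsatellite allele (toy data)
reference = {
    "ACGTA": 180,   # hypothetical 5-SNP haplotype
    "ACGTG": 184,
    "TCGTA": 188,
}

def impute_ms(haplotype_pair):
    """Impute an animal's two MS alleles from its two SNP haplotypes."""
    return tuple(sorted(reference[h] for h in haplotype_pair))

def mendelian_conflict(offspring, sire, dam):
    """True if the offspring shares no allele with one of its parents."""
    return not (set(offspring) & set(sire)) or not (set(offspring) & set(dam))

calf = impute_ms(("ACGTA", "TCGTA"))          # -> (180, 188)
assert not mendelian_conflict(calf, sire=(180, 184), dam=(188, 188))
```

In the workflow described above, animals whose imputed alleles trip this conflict check would be MS genotyped directly and folded back into the reference set.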

Relevance:

100.00%

Publisher:

Abstract:

Darunavir (DRV) is a protease inhibitor used in the treatment of HIV infection and constitutes a keystone of therapy for patients infected with this virus. No monograph for the drug is described in official compendia. The literature provides few methods of analysis for the determination of DRV in pharmaceuticals, including TLC, IR, UPLC, HPLC, HPLC-MS and HPLC-MS/MS, but there are no reports of the use of capillary electrophoresis (CE) for the determination of this drug. Thus, this research proposed the development and validation of a CE method for the determination of DRV in tablets. The method was fully validated according to the International Conference on Harmonization guidelines, showing linearity, selectivity, precision, accuracy and robustness. Migration was achieved in less than 1 minute using an uncoated fused-silica capillary (50 μm i.d., 21 cm total length) and a voltage of +20 kV. Sample injection was performed in the hydrodynamic mode. The method was linear over the concentration range of 50-200 μg mL-1, with a correlation coefficient of 0.9998 and limits of detection and quantification of 7.29 and 22.09 μg mL-1, respectively. The drug was subjected to acid, base, oxidation and photolysis degradation. No degradation products were found to interfere with the assay of DRV; therefore, the method can be regarded as stability indicating. The validated method is useful and appropriate for the routine quality control of DRV in tablets.
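
As a brief sketch of how the reported figures of merit relate, under the common ICH calibration-line approach LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the regression and S its slope. The concentrations and responses below are illustrative, not the paper's raw data.

```python
import numpy as np

conc = np.array([50, 80, 110, 140, 170, 200], dtype=float)   # ug/mL (assumed levels)
area = np.array([12.1, 19.4, 26.5, 33.9, 41.0, 48.3])        # detector response (toy)

slope, intercept = np.polyfit(conc, area, 1)   # least-squares calibration line
resid = area - (slope * conc + intercept)
sigma = resid.std(ddof=2)                      # residual SD of the regression

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
r = np.corrcoef(conc, area)[0, 1]
print(f"r = {r:.4f}, LOD = {lod:.1f} ug/mL, LOQ = {loq:.1f} ug/mL")
```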

Relevance:

100.00%

Publisher:

Abstract:

This work applied genetic algorithms (GA) and particle swarm optimization (PSO) to cash balance management using the Miller-Orr model, a stochastic model that defines not a single ideal point for the cash balance but an oscillation range between a lower bound, an ideal balance and an upper bound. This paper therefore proposes the application of GA and PSO to minimize the total cost of cash maintenance by determining the lower-bound parameter of the Miller-Orr model, using the assumptions presented in the literature. Computational experiments were applied in the development and validation of the models. The results indicated that both GA and PSO are applicable to determining the cash level from the lower limit, with the PSO model, which had not previously been applied to this type of problem, giving the best results.
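
A minimal PSO sketch for choosing the Miller-Orr lower bound L. The spread, return point and average-balance relations follow the textbook Miller-Orr model; total_cost is a simple illustrative stand-in for the paper's cost function, and all parameter values are assumed.

```python
import random

F, sigma2, k = 50.0, 800.0, 0.00025   # transfer cost, daily cash-flow variance, daily rate

def total_cost(L):
    if L < 0:
        return float("inf")
    spread = 3 * (0.75 * F * sigma2 / k) ** (1 / 3)   # Miller-Orr optimal spread
    Z = L + spread / 3                                # return point
    avg_balance = (4 * Z - L) / 3                     # textbook average cash balance
    shortage_penalty = 5000.0 / (L + 1)               # illustrative penalty for a low floor
    return k * avg_balance + shortage_penalty

def pso(cost, n=20, iters=100, lo=0.0, hi=50000.0, w=0.7, c1=1.5, c2=1.5):
    x = [random.uniform(lo, hi) for _ in range(n)]    # particle positions
    v = [0.0] * n                                     # particle velocities
    pbest = x[:]                                      # per-particle best positions
    gbest = min(x, key=cost)                          # global best position
    for _ in range(iters):
        for i in range(n):
            v[i] = (w * v[i]
                    + c1 * random.random() * (pbest[i] - x[i])
                    + c2 * random.random() * (gbest - x[i]))
            x[i] += v[i]
            if cost(x[i]) < cost(pbest[i]):
                pbest[i] = x[i]
        gbest = min(pbest, key=cost)
    return gbest

L = pso(total_cost)
print(f"lower bound L = {L:.2f}, total cost = {total_cost(L):.2f}")
```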

Relevance:

100.00%

Publisher:

Abstract:

[EN] This work presents the calibration and validation of an air quality finite element model applied to emissions from a thermal power plant located in Gran Canaria. The calibration is performed using genetic algorithms. To calibrate and validate the model, the authors use empirical measurements of pollutant concentrations from 4 stations located near the power plant; an hourly record per station over 3 days is available. Measurements from 3 stations are used for calibration, while validation uses measurements from the remaining station…
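
A compact sketch of the calibrate/validate split described above: a genetic algorithm fits model parameters to measurements from three stations, and the fourth station is held out for validation. Here model() and all data are toy stand-ins for the finite element air quality model and the hourly records.

```python
import random, math

random.seed(1)
hours = list(range(72))                             # 3 days of hourly records

def model(params, station, t):
    q, decay = params                               # toy emission rate / decay parameters
    return q * math.exp(-decay * station) * (1 + 0.3 * math.sin(2 * math.pi * t / 24))

truth = (80.0, 0.12)                                # "real" parameters for synthetic data
obs = {s: [model(truth, s, t) * random.gauss(1, 0.05) for t in hours] for s in range(4)}

def rmse(params, stations):
    err = [(model(params, s, t) - obs[s][t]) ** 2 for s in stations for t in hours]
    return math.sqrt(sum(err) / len(err))

def ga(fit, pop=30, gens=60):
    P = [(random.uniform(1, 200), random.uniform(0, 1)) for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fit)
        elite = P[: pop // 2]                       # selection
        children = []
        for _ in range(pop - len(elite)):           # crossover + mutation
            a, b = random.sample(elite, 2)
            children.append(tuple((x + y) / 2 + random.gauss(0, 0.5) for x, y in zip(a, b)))
        P = elite + children
    return min(P, key=fit)

best = ga(lambda p: rmse(p, stations=[0, 1, 2]))    # calibrate on 3 stations
print("validation RMSE on held-out station:", rmse(best, stations=[3]))
```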

Relevance:

100.00%

Publisher:

Abstract:

Ancient pavements are composed of a variety of preparatory or foundation layers constituting the substrate, and of a layer of tesserae, pebbles or marble slabs forming the surface of the floor. In other cases, the surface consists of a beaten and polished mortar layer. The term mosaic is associated with the presence of tesserae or pebbles, while the more general term pavement is used in all cases. As past and modern excavations of ancient pavements have demonstrated, not all pavements display the stratigraphy of the substrate described in the ancient literary sources. In fact, the number and thickness of the preparatory layers, as well as the nature and properties of their constituent materials, often vary among pavements located in different sites, in different buildings within the same site, or even within the same building. For this reason, an investigation that takes the whole structure of the pavement into account is important when studying the archaeological context of the site where it is located, when designing materials for its maintenance and restoration, when documenting it and when presenting it to the public. Five case studies, represented by archaeological sites containing floor mosaics and other kinds of pavements dated to the Hellenistic and Roman periods, have been investigated by means of in situ and laboratory analyses. The results indicated that the characteristics of the studied pavements, namely the number and thickness of the preparatory layers and the properties of the mortars constituting them, vary according to the ancient use of the room where the pavements are located and to the type of surface upon which they were built. The study contributed to the understanding of the function and technology of the pavements' substrate and to the characterization of its constituent materials. Furthermore, the research underlined the importance of investigating the whole structure of the pavement, including the foundation surface, in the interpretation of the archaeological context where it is located. A series of practical applications of the results of the research have been suggested, in the design of repair mortars for pavements, in the documentation of ancient pavements in conservation practice, and in the presentation of ancient pavements to the public in situ and in museums.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low cost of design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottled water or soda, buy boxed products such as food or cigarettes, and so on. A further indication of their complexity is that the consortium of machine producers has estimated that there are around 350 types of manufacturing machines. A large number of manufacturing machine industries, notably packaging machine makers, are located in Italy, with a particularly high concentration in the Bologna area; for this reason the Bologna area is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often, this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies according to different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time, in support of machine maintenance operations. The facilities that designers can find directly on the market, in terms of software component libraries, do in fact provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method for dealing organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different and usually very "unstructured" way. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control: in logic control, discrete-event dynamics replace time-driven dynamics, so most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, many contributions have already been proposed in the scientific and technical literature to establish a suitable modelling framework for industrial automation. In recent years it has been possible to observe considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As concerns logic control design, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only handle the nominal behaviour but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, complex systems such as AMS contain, alongside reliable mechanical elements, an increasing number of electronic devices, which are more vulnerable by their very nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is to prevent faults and eventually reconfigure the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. Chapter 2 surveys the state of the software engineering paradigm applied to industrial automation. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
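
A minimal sketch of the Discrete Event Systems view invoked above: a component is modelled as a finite automaton and formal verification reduces to reachability questions over its event-driven state space. The states and events below are hypothetical toy examples, not the thesis's actual models.

```python
from collections import deque

# transitions: state -> {event: next_state} for a toy actuator with a fault mode
transitions = {
    "idle":   {"start": "moving"},
    "moving": {"done": "idle", "jam": "faulty"},
    "faulty": {"reset": "idle"},
}

def reachable(start):
    """All states reachable from `start` (breadth-first exploration)."""
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        for nxt in transitions.get(s, {}).values():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Two typical checks: the fault state is reachable (so it must be diagnosed),
# and a recovery path back to "idle" exists from the faulty state.
assert "faulty" in reachable("idle")
assert "idle" in reachable("faulty")
```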

Relevance:

100.00%

Publisher:

Abstract:

Despite the several issues faced in the past, the evolutionary trend of silicon has kept its constant pace. Today an ever-increasing number of cores is integrated onto the same die. Unfortunately, the extraordinary performance achievable by the many-core paradigm is limited by several factors: memory bandwidth limitations, combined with inefficient synchronization mechanisms, can severely curtail the potential computation capabilities. Moreover, the huge HW/SW design space requires accurate and flexible tools to perform architectural explorations and validate design choices. In this thesis we focus on the aforementioned aspects: a flexible and accurate virtual platform has been developed, targeting a reference many-core architecture. This tool has been used to perform architectural explorations, focusing on the instruction caching architecture and on hybrid HW/SW synchronization mechanisms. Besides architectural implications, another issue of embedded systems is considered: energy efficiency. Near-Threshold Computing (NTC) is a key research area in the ultra-low-power domain, as it promises a tenfold improvement in energy efficiency compared to super-threshold operation and mitigates thermal bottlenecks. The physical implications of modern deep sub-micron technology, however, severely limit the performance and reliability of such designs. Reliability becomes a major obstacle when operating in NTC: memory operation in particular becomes unreliable and can compromise system correctness. In the present work a novel hybrid memory architecture is devised to overcome these reliability issues and, at the same time, improve energy efficiency by means of aggressive voltage scaling when allowed by workload requirements. Variability is another great drawback of near-threshold operation: the greatly increased sensitivity to threshold voltage variations is today a major concern for electronic devices. We introduce a variation-tolerant extension of the baseline many-core architecture: by means of micro-architectural knobs and a lightweight runtime control unit, the baseline architecture becomes dynamically tolerant to variations.
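
A back-of-the-envelope sketch of why near-threshold operation helps: dynamic energy per operation scales roughly with C·V², so lowering the supply voltage toward the threshold cuts energy quadratically, at the price of slower and less reliable operation (the tenfold figure quoted above also accounts for effects beyond this first-order model). All values below are assumed.

```python
C = 1e-9                      # switched capacitance per operation, farads (assumed)

def energy_per_op(vdd):
    return C * vdd ** 2       # first-order dynamic energy ~ C * V^2

super_threshold = energy_per_op(1.1)   # nominal supply voltage (assumed)
near_threshold = energy_per_op(0.5)    # near-threshold supply voltage (assumed)
print(f"dynamic energy ratio: {super_threshold / near_threshold:.1f}x")   # ~4.8x
```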

Relevance:

100.00%

Publisher:

Abstract:

The Koyukuk Mining District was one of several northern, turn-of-the-century gold rush regions. Miners focused their efforts in this region on the Middle Fork of the Koyukuk River and on several of its tributaries. Mining in the Koyukuk began in the 1880s and the first rush occurred in 1898. Continued mining throughout the early decades of the 1900s has resulted in a historic mining landscape consisting of structures, equipment, mining shafts, waste rock, trash scatters, and prospect pits. Modern work continues in the region alongside these historic resources. An archaeological survey was completed in 2012 as part of an Abandoned Mine Lands survey undertaken with the Bureau of Land Management, Michigan Technological University, and the University of Alaska Anchorage. This thesis examines the discrepancy between the size of mining operations and their respective successes in the region, while also providing a historical background on the region and reporting on the historical resources present.

Relevance:

100.00%

Publisher:

Abstract:

This study describes the development and validation of a gas chromatography-mass spectrometry (GC-MS) method to identify and quantify phenytoin in brain microdialysate, saliva and blood from human samples. A solid-phase extraction (SPE) was performed with a nonpolar C8-SCX column. The eluate was evaporated with nitrogen (50°C) and derivatized with trimethylsulfonium hydroxide before GC-MS analysis. 5-(p-Methylphenyl)-5-phenylhydantoin was used as the internal standard. The MS was run in scan mode and identification was based on three ion fragment masses. All peaks were identified with MassLib. Spiked phenytoin samples showed a recovery after SPE of ≥94%. The calibration curve (phenytoin 50 to 1,200 ng/mL, n = 6, at six concentration levels) showed good linearity and correlation (r² > 0.998). The limit of detection was 15 ng/mL; the limit of quantification was 50 ng/mL. Dried extracted samples were stable, within a 15% deviation range, for ≥4 weeks at room temperature. The method met International Organization for Standardization standards and was able to detect and quantify phenytoin in different biological matrices and patient samples. The GC-MS method with SPE is specific, sensitive, robust and highly reproducible, and is therefore an appropriate candidate for the pharmacokinetic assessment of phenytoin concentrations in different human biological samples.
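
A small sketch of the quantitation step behind the reported calibration figures: fit a line to spiked standards and back-calculate an unknown from its analyte/internal-standard response ratio. The concentrations and ratios here are illustrative only, not the study's data.

```python
import numpy as np

conc = np.array([50, 150, 300, 600, 900, 1200], dtype=float)  # ng/mL (assumed levels)
ratio = np.array([0.11, 0.33, 0.65, 1.31, 1.97, 2.62])        # analyte/IS area ratio (toy)

slope, intercept = np.polyfit(conc, ratio, 1)
r2 = np.corrcoef(conc, ratio)[0, 1] ** 2
assert r2 > 0.998                              # linearity criterion from the abstract

def quantify(sample_ratio):
    """Back-calculate a sample concentration from its analyte/IS ratio."""
    return (sample_ratio - intercept) / slope

print(f"unknown = {quantify(0.95):.0f} ng/mL (r^2 = {r2:.4f})")
```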

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Measuring trait mindfulness and change in mindfulness may be a crucial prerequisite for the evaluation and further development of mindfulness-based interventions for the treatment of mental disorders. This endeavour is nontrivial, as current measures cover varying aspects of mindfulness and may have problems regarding validity. This presentation describes the development and validation of a questionnaire for the comprehensive assessment of mindfulness: the Comprehensive Inventory of Mindfulness Experiences (CHIME). Method: The factor structure, reliability, and validity of the CHIME were established in a community sample (N = 298) and a sample of MBSR group participants (N = 161). Results: Factor-analytical procedures supported an eight-factor structure, which was confirmed in a further sample (N = 202). The questionnaire and its subscales exhibited good reliability (internal consistency and retest reliability). Analysis of the measurement invariance of the single items across groups differing in age, gender, meditation experience, and symptom load pointed to the absence of systematic differences in the items' semantic understanding. Parameters reflecting construct validity, criterion validity, and incremental validity, as well as change sensitivity, were all at least satisfactory. Conclusions: The CHIME is a self-report measure with favorable psychometric properties, based on all aspects of mindfulness included in current mindfulness scales. This scale may be helpful in the evaluation and further development of mindfulness-based interventions.
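
A brief sketch of one reliability figure reported above, internal consistency, as Cronbach's alpha: alpha = k/(k-1) · (1 - Σvar_i / var_total). The 5-point responses below are fabricated for illustration, not CHIME data.

```python
import numpy as np

# rows = respondents, columns = items of one (hypothetical) subscale
X = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])

k = X.shape[1]
item_var = X.var(axis=0, ddof=1).sum()     # sum of per-item variances
total_var = X.sum(axis=1).var(ddof=1)      # variance of the scale total
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```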

Relevance:

100.00%

Publisher:

Abstract:

An integrated instrument package for measuring and understanding the surface radiation budget of sea ice is presented, along with results from its first deployment. The setup simultaneously measures broadband fluxes of upwelling and downwelling terrestrial and solar radiation (the four components separately), spectral fluxes of incident and reflected solar radiation, and supporting data such as air temperature and humidity, surface temperature, and location (GPS), in addition to photographing the sky and observed surface during each measurement. The instruments are mounted on a small sled, allowing measurements of the radiation budget to be made at many locations in the study area to see the effect of small-scale surface processes on the large-scale radiation budget. Such observations have many applications, from calibration and validation of remote sensing products to improving our understanding of surface processes that affect atmosphere-snow-ice interactions and drive feedbacks, ultimately leading to the potential to improve climate modelling of ice-covered regions of the ocean. The photographs, spectral data, and other observations allow for improved analysis of the broadband data. An example is shown using observations made during a partly cloudy day, which exhibit erratic variations due to passing clouds, to construct a careful estimate of what the radiation budget along the observed line would have been under uniform sky conditions, clear or overcast. Other data from the setup's first deployment, in June 2011 on fast ice near Point Barrow, Alaska, are also shown; these illustrate the rapid changes of the radiation budget during a cold period that led to refreezing and new snow well into the melt season.
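
A tiny sketch of the four-component broadband budget the sled measures: net radiation = (SW↓ − SW↑) + (LW↓ − LW↑), with surface albedo SW↑/SW↓. The fluxes below (W/m²) are illustrative values only, not the deployment's measurements.

```python
sw_down, sw_up = 420.0, 310.0    # downwelling / reflected solar (assumed)
lw_down, lw_up = 280.0, 305.0    # downwelling / emitted terrestrial (assumed)

albedo = sw_up / sw_down                       # ~0.74, typical of snow-covered ice
net = (sw_down - sw_up) + (lw_down - lw_up)    # positive = energy into the surface
print(f"albedo = {albedo:.2f}, net radiation = {net:.0f} W/m^2")
```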

Relevance:

100.00%

Publisher:

Abstract:

This study focuses on the technological intensity of China's exports. It first introduces a method of decomposing gross exports using the Asian international input–output tables. The empirical results indicate that the technological intensity of Chinese exports has been significantly overestimated due to their high dependency on import content, especially in high-technology exports, an area highly dominated by the electronic and electrical equipment sector. Furthermore, a significant portion of the value added embodied in China's high-technology exports comes from services and high-technology manufacturers in neighboring economies, such as Japan, South Korea, and Taiwan.
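
A toy sketch of the decomposition idea: with a domestic input coefficient matrix A, direct value-added coefficients v and an export vector e, the domestic value added embodied in exports is v(I − A)⁻¹e; the remainder of gross exports is import content. The 2-sector numbers are fabricated for illustration.

```python
import numpy as np

A = np.array([[0.15, 0.25],      # domestic input coefficients (assumed)
              [0.10, 0.30]])
v = np.array([0.45, 0.35])       # direct value added per unit of output (assumed)
e = np.array([100.0, 80.0])      # gross exports by sector (assumed)

leontief = np.linalg.inv(np.eye(2) - A)   # Leontief inverse (I - A)^-1
dva = v @ leontief @ e                    # domestic value added embodied in exports
print(f"DVA share of gross exports: {dva / e.sum():.1%}")   # rest = import content
```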

Relevance:

100.00%

Publisher:

Abstract:

Wireless communication is the transfer of information from one place to another without using wires. From the earliest times, humans have felt the need to develop techniques for remote communication; from this need arose smoke signals, communication by reflecting sunlight in mirrors, and so on. Today, telecommunications rely on electronic devices such as the telephone, television, radio and computer. Radio and television are used for one-way communication; telephone and computer for two-way communication. Wireless networks offer almost unlimited mobility: we can access the network almost anywhere, at any time, whereas wired networks restrict the use of services to fixed locations. Demand for wireless services is increasing very fast; everybody wants broadband services anywhere, anytime. WiMAX (Worldwide Interoperability for Microwave Access) is a broadband wireless technology, based on IEEE 802.16-2004 and IEEE 802.16e-2005, that appears to meet this demand. WiMAX allows wireless data transmission in areas of up to 48 km in radius. It is designed as a wireless alternative to ADSL and as a way to connect nodes in wireless metropolitan area networks. Unlike wireless systems that are in most cases limited to about 100 meters, it provides greater coverage and more bandwidth. WiMAX promises to achieve high data transmission rates over large areas with a great number of users. This alternative to common broadband access networks such as DSL or Wi-Fi can quickly bring broadband access to rural and developing areas around the world. This paper is a study of WiMAX technology and its market situation. First, the paper explains the technical aspects of WiMAX, giving an overview of the WiMAX standards, physical layer, MAC layer and network architecture. Second, the paper addresses the market, providing an overview of the development and deployment of WiMAX, and finally discusses the future development trend of WiMAX.

ABSTRACT: Wireless communications are understood as the transfer of information from one place to another without the need for a physical medium such as a cable. Going back to the beginnings of human existence, we find that human beings have always felt the need to develop techniques to communicate with each other at a distance. From this need arose techniques as ancient as communication by smoke signals or by reflecting sunlight in mirrors, among others. Human curiosity and the need to communicate at a distance led Alexander Graham Bell to invent the telephone in 1876. The appearance of a device that allowed communication at a distance, letting one hear the voice of the person one wanted to talk to, was a revolution not only in the technological landscape but also in the social one: besides enabling long-distance communication, it solved the problem of communicating in "real time". Following this invention, communication technology has advanced significantly, particularly in wireless communications. The first call from a mobile terminal was made in 1973, although it was not until 1983 that such terminals were commercialized, which changed society's habits and customs.

Since the appearance of the first mobile phone, market growth has been exponential, driving an unimaginable demand for new applications integrated into mobile devices to satisfy the needs that society generates day by day. After long-distance wireless calls were achieved, the next step was the creation of SMS (Short Message Service), which was a new revolution and also reduced the user's cost of communicating. But the great challenge for the mobile communications industry arose with the appearance of the Internet. Everybody felt the need to be able to connect to that great database that is the Internet anywhere and at any time. The first Internet connections from mobile devices were made through WAP (Wireless Application Protocol) technology, until the appearance of GPRS, which allowed connection via the TCP/IP protocol. From these connections other technologies have emerged, such as EDGE, HSDPA, etc., which allowed and still allow Internet connection from mobile devices. Nowadays the demand for wireless network services is growing rapidly and exponentially; everybody wants broadband services anywhere, at any time. This document analyzes WiMAX (Worldwide Interoperability for Microwave Access), a broadband technology based on the IEEE 802.16 standard, created to serve the emerging demand for broadband, from a technological point of view, giving a vision of the technical side of the technology, and from a market point of view, analyzing the deployment and development of the technology from a business perspective. WiMAX is a technology that allows wireless data transmission in areas of up to 48 km in radius, designed as a wireless alternative to ADSL and to connect wireless network nodes in metropolitan areas. Unlike existing wireless systems, which are mostly limited to a few hundred meters, WiMAX offers greater coverage and greater bandwidth, supporting new applications and achieving high data transmission rates over large areas with a great number of users. It is an alternative to broadband access networks such as DSL or Wi-Fi that can quickly bring broadband access to places such as rural areas or developing regions around the world. There are two WiMAX technologies: fixed WiMAX (based on the IEEE 802.16d-2004 standard) and mobile WiMAX (based on IEEE 802.16e-2005). The fixed technology is designed for point-to-multipoint communications, while the mobile one supports multipoint-to-multipoint communications. Mobile WiMAX is based on OFDM technology, which offers advantages in terms of latency, spectral efficiency and advanced antenna support. OFDM modulation is very robust against multipath, which is very common in broadcast channels, against fading due to weather conditions, and against RF interference.

Once the WiMAX technology, possessing the right characteristics to meet market demand, has been created, the next step must be taken: the telecommunications industry has to be convinced that this technology really is the solution, so that it supports its introduction into the broadband market for wireless networks. This is where the market study carried out in this document comes into play. WiMAX faces a demanding market in which, besides meeting the technical demand, it must offer economic profitability to the mobile communications industry, and more specifically to the mobile operators, who within the telecommunications sector must ultimately trust the technology to serve their users; in the end, all users want is for their mobile device to satisfy their needs, regardless of the technology used to access the wireless broadband network. Perhaps the biggest problem WiMAX has faced has been the state of the world economy: WiMAX began its journey at one of the worst possible moments, but it still presents itself as a technology capable of helping the world move forward in these hard times. Finally, one of the current debates in the mobile communications sector is analyzed: WiMAX vs. LTE. As can be seen in the document, neither technology will really emerge victorious over the other; rather, both technologies will be able to coexist and work together.
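
The abstract highlights OFDM's robustness to multipath. Below is a minimal, self-contained sketch of that mechanism (not of WiMAX itself): data on orthogonal subcarriers survives a toy multipath channel because the cyclic prefix reduces equalization to one division per subcarrier. All parameters (64 subcarriers, a cyclic prefix of 16, the channel taps) are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, CP = 64, 16                                         # subcarriers, cyclic-prefix length
bits = rng.integers(0, 2, (N, 2))
symbols = (1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])   # QPSK symbols

tx = np.fft.ifft(symbols)                              # one OFDM symbol in time domain
tx_cp = np.concatenate([tx[-CP:], tx])                 # prepend cyclic prefix

h = np.array([1.0, 0.0, 0.4 + 0.3j])                   # toy multipath channel taps
rx = np.convolve(tx_cp, h)[:len(tx_cp)]                # channel = linear convolution

rx_freq = np.fft.fft(rx[CP:CP + N])                    # strip CP, back to frequency domain
H = np.fft.fft(h, N)                                   # channel frequency response
eq = rx_freq / H                                       # one-tap per-subcarrier equalizer

assert np.allclose(eq, symbols, atol=1e-10)            # symbols recovered exactly
```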

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we describe the successful results of an international research project focused on the use of Web technology in the educational context. The article explains how this international project, funded by public organizations and developed over the last two academic years, focuses on the area of open educational resources (OER) and particularly on the educational content of the OpenCourseWare (OCW) model. This initiative has been developed by a research group composed of researchers from three countries. The project was enabled by the Universidad Politécnica de Madrid OCW Office's leadership of the Consortium of Latin American Universities and the distance education know-how of the Universidad Técnica Particular de Loja (UTPL, Ecuador). We give a full account of the project, its methodology, main outcomes and validation. The project results have further consolidated the group, increased the maturity of group members, and strengthened networking with other groups in the area. The group is now participating in other research projects that continue the lines developed here.

Relevance:

100.00%

Publisher:

Abstract:

We have designed and implemented a framework that unifies unit testing and run-time verification (as well as static verification and static debugging). A key contribution of our approach is that a unified assertion language is used for all of these tasks. We first propose methods for compiling run-time checks for (parts of) assertions which cannot be verified at compile time, via program transformation. This transformation allows checking preconditions and postconditions, including conditional postconditions, properties at arbitrary program points, and certain computational properties. The implemented transformation includes several optimizations to reduce run-time overhead. We also propose a minimal addition to the assertion language which allows defining unit tests to be run in order to detect possible violations of the (partial) specifications expressed by the assertions. This language can express, for example, the input data for performing the unit tests or the number of times that the unit tests should be repeated. We have implemented the framework within the Ciao/CiaoPP system and effectively applied it to the verification of ISO-Prolog compliance and to the detection of different types of bugs in the Ciao system source code. Several experimental results are presented that illustrate different trade-offs among program size, running time, and level of verbosity of the messages shown to the user.
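
As a rough illustration of the run-time checking idea (Ciao's actual assertion language is part of the Ciao/CiaoPP Prolog system; the Python below is only an analogue, not its syntax), preconditions and postconditions that cannot be discharged statically can be compiled into wrappers around each call, and the same machinery doubles as a unit-test harness:

```python
from functools import wraps

def assertion(pre=None, post=None):
    """Attach a precondition and a (result-aware) postcondition to a function."""
    def deco(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition of {fn.__name__} violated"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result, *args, **kwargs), f"postcondition of {fn.__name__} violated"
            return result
        return wrapper
    return deco

@assertion(pre=lambda xs: all(isinstance(x, int) for x in xs),
           post=lambda ys, xs: sorted(xs) == ys)
def qsort(xs):
    if not xs:
        return []
    pivot, rest = xs[0], xs[1:]
    return qsort([x for x in rest if x < pivot]) + [pivot] + \
           qsort([x for x in rest if x >= pivot])

# A unit test in the spirit of the framework's test assertions:
assert qsort([3, 1, 2]) == [1, 2, 3]
```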