954 results for Computer software - Quality control
Abstract:
Shrimp aquaculture has provided a tremendous opportunity for the economic and social upliftment of rural communities in the coastal areas of our country. Over a hundred thousand farmers, of whom about 90% belong to the small and marginal category, are engaged in shrimp farming. Penaeus monodon is the predominant cultured species in India and is mainly exported to highly sophisticated, quality- and safety-conscious world markets. Food safety has been of concern to humankind since the dawn of history, and this concern resulted in the evolution of a cost-effective food safety assurance method, the Hazard Analysis Critical Control Point (HACCP) system. Considering the major contribution of cultured Penaeus monodon to total shrimp production, the economic losses encountered due to disease outbreaks, and the fact that traditional methods of quality control and end-point inspection cannot guarantee the safety of our cultured seafood products, it is essential that science-based preventive approaches such as HACCP and Prerequisite Programmes (PRP) be implemented in our shrimp farming operations. PRP is considered a support system which provides a solid foundation for HACCP. The safety of postlarvae (PL) supplied for brackish-water shrimp farming has also become an issue of concern over the past few years. The quality and safety of hatchery-produced seed have been deteriorating, and disease outbreaks have become very common in hatcheries. It is in this context that the necessity of following strict quarantine measures, with standards and codes of practice, becomes significant. Though there has been much hue and cry about the need to extend the focus of seafood safety assurance from processing and export to the pre-harvest and hatchery rearing phases, experimental moves in this direction have been rare or non-existent. Only an integrated management system can assure effective control of the quality, hygiene and safety related issues. This study therefore aims at designing a safety and quality management system model for implementation in shrimp farming and hatchery operations by linking the concepts of HACCP and PRP.
Abstract:
Quality-related problems have become dominant in the seafood processing industry in Kerala. This has resulted in the rejection of seafood sent from India to many destinations, the latest being the total block-listing of Indian seafood companies from exporting to Europe and a partial block-listing by the US. The quality systems that prevailed in the seafood industry in India were outdated and no longer in use in the developed world. According to the EC Directive, all seafood factories exporting to European countries have to adopt HACCP. Based on this, the EIA has now made the HACCP system mandatory in all seafood processing factories in India. This transformation from a traditional product-based inspection system to a process control system requires thorough changes in the various stages of production and quality management. This study was conducted by the author to examine the status of the existing infrastructure and quality control systems in the seafood industry in Kerala with reference to recent developments in quality concepts in international markets; to study the drawbacks, if any, of the quality management systems in force in the seafood factories in Kerala for introducing the mandatory HACCP concept; and to assess the possibilities of introducing a Total Quality Management system in the seafood industry in Kerala in order to adopt the HACCP concept effectively. It also aims at improving the quality of the products and the productivity of the industry so as to sustain its world markets in the long run.
Abstract:
Exhaustive statistical information-gathering operations pose major logistical challenges. By using GISs, managing the associated information becomes simpler, and monitoring the quality control of the gathered information can be stricter.
Abstract:
For more than five years, DIELMO 3D S.L. has been working on the development of software for processing LiDAR data, using it internally for the production of Digital Terrain Models (DTM) in numerous projects. Finally, with the support of the CIT, we have decided to take the initiative of creating new free software based on gvSIG for handling LiDAR data. First we will give an introduction to LiDAR technology, covering the basic principles, the raw data obtained after a flight, and the applications or end products that can be generated from LiDAR technology. Once familiar with this type of data, we will demonstrate the use of the driver developed by DIELMO for accessing, visualising and analysing raw LiDAR data in LAS and BIN formats for large data volumes. Finally, we will discuss the tools being developed to perform quality control on the raw data and to generate the different end products derived from them: Digital Terrain Model (DTM), Digital Surface Model (DSM) and intensity images.
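For readers unfamiliar with the LAS format, the fragment below is a minimal Python sketch of reading a LAS point cloud with the open-source laspy library. It is not DIELMO's gvSIG driver; the file name and the crude elevation-range check are illustrative assumptions.

    # Minimal sketch: load a LAS point cloud and run a crude sanity check
    # before any DTM/DSM generation. "flight.las" is a placeholder name.
    import laspy
    import numpy as np

    las = laspy.read("flight.las")                 # header + point records
    points = np.vstack((las.x, las.y, las.z)).T    # N x 3 array of coordinates

    # Report the elevation range, which helps spot gross outliers
    # (e.g. birds or multipath returns) in the raw data.
    print(f"{len(points)} points, z range: "
          f"{points[:, 2].min():.2f}..{points[:, 2].max():.2f} m")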
Abstract:
In this article, the results of a modified SERVQUAL questionnaire (Parasuraman et al., 1991) are reported. The modifications consisted in substituting questionnaire items particularly suited to a specific service (banking) and context (the county of Girona, Spain) for the original, rather general and abstract, items. These modifications led to more interpretable factors which accounted for a higher percentage of item variance. The data were submitted to various structural equation models, which made it possible to conclude that the questionnaire contains items of high measurement quality with respect to five identified dimensions of service quality which differ from those specified by Parasuraman et al. and are specific to the banking service. The two dimensions relating to the behaviour of employees have the greatest predictive power on overall quality and satisfaction ratings, which enables managers to use a low-cost reduced version of the questionnaire to monitor quality on a regular basis. It was also found that satisfaction and overall quality were perfectly correlated, showing that customers do not perceive these concepts as distinct.
Abstract:
We present a system for dynamic network resource configuration in environments with bandwidth reservation. The proposed system is completely distributed and automates the mechanisms for adapting the logical network to the offered load. The system is able to dynamically manage a logical network, such as a virtual path network in ATM or a label switched path network in MPLS or GMPLS. The system design and implementation are based on a multi-agent system (MAS) which makes the decisions of when and how to change a logical path. Despite the lack of a centralised global network view, results show that the MAS manages the network resources effectively, reducing the connection blocking probability and therefore achieving better utilisation of network resources. We also include details of its architecture and implementation.
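As a rough illustration of the kind of per-path decision such an agent might take, the Python sketch below grows a logical path when its recent connection-blocking ratio exceeds a threshold and shrinks it when utilisation stays low. The rule, thresholds and step size are invented for the sketch; they are not the system described in the abstract.

    # Hypothetical per-path adaptation rule (not the paper's MAS).
    class PathAgent:
        def __init__(self, capacity, step=10, block_thresh=0.05, util_thresh=0.4):
            self.capacity = capacity          # currently reserved bandwidth
            self.step = step                  # reallocation granularity
            self.block_thresh = block_thresh  # tolerated blocking ratio
            self.util_thresh = util_thresh    # lower utilisation bound

        def decide(self, offered, blocked):
            """Return the new capacity after one monitoring interval."""
            blocking = blocked / offered if offered else 0.0
            utilisation = min(offered, self.capacity) / self.capacity
            if blocking > self.block_thresh:
                self.capacity += self.step    # adapt the logical path upward
            elif utilisation < self.util_thresh and self.capacity > self.step:
                self.capacity -= self.step    # release unused reservation
            return self.capacity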
Abstract:
The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but with a wider service-by-service composition achieved by subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services like open-id, trust management, monitors and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and that enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build open negotiation environments that are stable, a basic condition for predictable and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation style recommender. The ecosystem monitor provides hints to the negotiation style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation style recommender is implemented with a simulated annealing algorithm at a constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
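Simulated annealing at a constant temperature is a standard Metropolis-style search: improvements are always accepted, and a worsening of size delta is accepted with probability exp(-delta / T), so the search keeps exploring instead of freezing. The Python sketch below shows the generic algorithm; it is not ONE's actual recommender code, and the cost and neighbour functions are assumed to be supplied by the caller.

    import math
    import random

    def anneal_constant_T(cost, neighbour, x0, T=1.0, steps=1000):
        """Metropolis search at fixed temperature T; returns the best state seen."""
        x, cx = x0, cost(x0)
        best, cbest = x, cx
        for _ in range(steps):
            y = neighbour(x)                  # propose a candidate state
            cy = cost(y)
            # Accept improvements always; accept worsenings with prob. exp(-d/T).
            if cy <= cx or random.random() < math.exp(-(cy - cx) / T):
                x, cx = y, cy
                if cx < cbest:
                    best, cbest = x, cx
        return best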
Abstract:
The project will be developed on the basis of the ecological model of human development (Bronfenbrenner, 1999), starting from an explanation and conceptualisation of the model in general terms and steering the research towards an organisational setting in which the theory described by Bronfenbrenner can be applied, in order to determine the structure and functioning of the systems in the model and to establish its usefulness in business environments through the analysis of the multiple systems, relationships, interactions and effects that companies and organisations develop over the course of their lives. Throughout the research, reference will be made to different concepts related both to the model and to the world in which organisations operate, such as clusters, systems, sectors, strategies, relationship marketing, community, interactions and influences, among others; these will bring Bronfenbrenner's model as close as possible to the business world and allow the intention of applying the model to the organisational world to be fulfilled more effectively.
Abstract:
Within laboratory quality-control activities, the final results for a particular analyte are considered intermediate products, given the importance assigned to quality assurance as the ultimate aim of quality-management programmes. This conception requires the establishment of comprehensive instruments for detecting events such as cross-contamination, and the adoption of measures to prevent the analytical process from being affected. Objective: the main objective was to establish a system for the monitoring and control of cross-contamination in a food microbiology laboratory. Materials and methods: the methodology consisted of developing flow diagrams for the procedures controlling the populations of aerobic mesophiles and moulds arising from contamination of environments, surfaces, sterile material and culture media. These diagrams included a decision tree designed to trigger control actions based on tolerance intervals, established as an objective tool for decisions that bring the counts of the microbial populations in question back to normal. Results: the strictest alert limits were obtained for the populations of aerobic mesophiles and moulds in the different controls, except for the environment of the media-preparation area and those corresponding to sterile material. Conclusion: the process developed made it possible to complement the laboratory's internal quality-control system by providing an objective means for closing non-conformities due to cross-contamination.
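The decision-tree idea can be pictured with a small hypothetical Python sketch that classifies a colony count against alert and action limits. The limit values and the actions shown are placeholders for illustration, not the tolerance intervals established in the study.

    # Hypothetical control decision from a plate count (CFU) and two limits.
    def classify_count(cfu, alert_limit, action_limit):
        """Map a colony count to a quality-control decision."""
        if cfu <= alert_limit:
            return "in control: no action"
        if cfu <= action_limit:
            return "alert: increase monitoring frequency"
        return "out of control: stop, disinfect and investigate"

    # Example with placeholder limits.
    print(classify_count(12, alert_limit=10, action_limit=25))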
Abstract:
Using safety as a competitiveness tool will translate, in the long term, into the elimination or reduction of all the unnecessary costs generated by errors, waste, leaks, accidents, unscheduled interruptions, etc.
Abstract:
The Universitat Autònoma de Barcelona (UAB) has established an Internal Quality System (SIQ) that reflects its commitment to offering quality programmes which include, among other measures, a correct systematisation of the competence-assessment process. To carry out this measure, a single teaching-guide model has been designed and a software application has been developed that will make this task easier for the groups involved and, at the same time, provide the information needed to carry out the monitoring process. This article presents the protocol defined, the structure of the teaching guide and the supporting software application developed.
Abstract:
The Underwater Robotics Research Centre (CIRS) of the Universitat de Girona has several underwater robots that use a software architecture called Component Oriented Layered-based Architecture for Autonomy (COLA2), developed by students and lecturers of the centre itself. To make this architecture more accessible to lecturers and students from other centres, COLA2 is being adapted to the Robot Operating System (ROS), a generic framework for developing robot applications. This project aims to design a behaviour for the Girona500 robot, developed within the ROS version of the COLA2 architecture. The behaviour must keep the robot at a given position using visual information from the robot's camera together with navigation data. Station keeping is of vital importance for underwater interventions that require precision, and the environment in which the robot operates does not make the task any easier.
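To give a flavour of what such a behaviour might look like in ROS, here is a minimal rospy sketch of proportional station keeping: read a position estimate, compute the error to a hold point, and command a velocity correction. The topic names, message types and gain are assumptions for illustration; they are not the actual COLA2 interfaces.

    # Minimal station-keeping sketch (hypothetical topics and gain).
    import rospy
    from geometry_msgs.msg import Point, Twist

    HOLD = (10.0, 5.0, 2.0)   # desired (x, y, z) hold position, metres
    KP = 0.5                  # proportional gain

    def on_position(msg):
        # Command a velocity proportional to the position error.
        cmd = Twist()
        cmd.linear.x = KP * (HOLD[0] - msg.x)
        cmd.linear.y = KP * (HOLD[1] - msg.y)
        cmd.linear.z = KP * (HOLD[2] - msg.z)
        pub.publish(cmd)

    rospy.init_node("station_keeping_sketch")
    pub = rospy.Publisher("/cmd_velocity", Twist, queue_size=1)
    rospy.Subscriber("/navigation/position", Point, on_position)
    rospy.spin()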
Abstract:
The Computer Vision and Robotics group (VICOROB) has several underwater robots for underwater research and inspection. A side-scan sonar sensor has recently been acquired, which is used to produce acoustic images of the seabed while the vehicle moves, mainly at constant speed and on a steady heading. The VICOROB robots are equipped with different types of sensors and cameras for analysing the seabed. These sensors are of high quality and provide a fairly good picture of the environment in the robot's vicinity. Frequently, however, these sensors are subject to different restrictions depending on how they operate, so that combining them is necessary to solve certain problems in different situations. This project aims to integrate a new system for capturing sonar images of the seabed into one of the robots. With the integration of this new sensor, we expect to obtain an alternative to the current systems that can provide additional information about the seabed. This system can be used to perform tasks for which the other sensors are not prepared, or to complement information from other sensors.
Abstract:
The human visual ability to perceive depth looks like a puzzle. We perceive three-dimensional spatial information quickly and efficiently by using the binocular stereopsis of our eyes and, what is more important, the knowledge of the most common objects that we acquire through living. Nowadays, modelling the behaviour of our brain is a fiction; that is why the huge problem of 3D perception and, further, interpretation is split into a sequence of easier problems. A lot of research in robot vision is devoted to obtaining 3D information about the surrounding scene. Most of this research is based on modelling human stereopsis by using two cameras as if they were two eyes. This method is known as stereo vision; it has been widely studied in the past, is being studied at present, and a lot of work will surely be done on it in the future. This fact allows us to affirm that this topic is one of the most interesting in computer vision. The stereo vision principle is based on obtaining the three-dimensional position of an object point from the positions of its projections in both camera image planes. However, before inferring 3D information, the mathematical models of both cameras have to be known. This step is known as camera calibration and is broadly described in the thesis. Perhaps the most important problem in stereo vision is the determination of the pairs of homologous points in the two images, known as the correspondence problem; it is also one of the most difficult problems to solve and is currently investigated by many researchers. The epipolar geometry allows us to reduce the correspondence problem, and an approach to it is described in the thesis. Nevertheless, it does not solve the problem entirely, as many considerations have to be taken into account; as an example, we have to consider points without correspondence due to surface occlusion or simply due to projection outside the camera's field of view. The interest of the thesis is focused on structured light, which has been considered one of the techniques most frequently used to reduce the problems related to stereo vision. Structured light is based on the relationship between a projected light pattern and an image sensor: the deformation between the pattern projected onto the scene and the one captured by the camera permits us to obtain three-dimensional information about the illuminated scene. This technique has been widely used in such applications as 3D object reconstruction, robot navigation, quality control, and so on. Although the projection of regular patterns solves the problem of points without a match, it does not solve the problem of multiple matching, which forces the use of computationally expensive algorithms to search for the correct matches. In recent years, another structured light technique has increased in importance. This technique is based on codifying the light projected onto the scene so that it can be used as a tool to obtain a unique match: each token of light imaged by the camera carries a label that must be read (the pattern must be decoded) in order to solve the correspondence problem. The advantages and disadvantages of stereo vision versus structured light, together with a survey of coded structured light, are presented and discussed. The work carried out in the frame of this thesis has permitted the presentation of a new coded structured light pattern which solves the correspondence problem uniquely and robustly.
Uniquely, as each token of light is coded by a different word, which removes the problem of multiple matching. Robustly, since the pattern has been coded using the position of each token of light with respect to both coordinate axes. Algorithms and experimental results are included in the thesis. The reader can see examples of the 3D measurement of static objects and the more complicated measurement of moving objects; the technique can be used in both cases, as the pattern is coded in a single projection shot, so it can be used in several applications of robot vision. Our interest is focused on the mathematical study of the camera and pattern projector models. We are also interested in how these models can be obtained by calibration, and how they can be used to obtain three-dimensional information from two corresponding points. Furthermore, we have studied structured light and coded structured light, and we have presented a new coded structured light pattern. However, in this thesis we started from the assumption that the corresponding points could be well segmented from the captured image. Computer vision constitutes a huge problem, and a lot of work is being done at all levels of human vision modelling, starting from a) image acquisition; b) image enhancement, filtering and processing; and c) image segmentation, which involves thresholding, thinning, contour detection, texture and colour analysis, and so on. The interest of this thesis starts at the next step, usually known as depth perception or 3D measurement.
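As background to the step of obtaining 3D information from two corresponding points, the following self-contained Python sketch recovers a 3D point from one pair of corresponding image points, given two calibrated 3x4 projection matrices, using the standard linear (DLT) triangulation method. It is illustrative background, not code from the thesis.

    import numpy as np

    def triangulate(P1, P2, uv1, uv2):
        """Recover the 3D point X from projections uv1, uv2 under
        calibrated 3x4 camera matrices P1, P2, by solving A X = 0."""
        A = np.vstack([
            uv1[0] * P1[2] - P1[0],
            uv1[1] * P1[2] - P1[1],
            uv2[0] * P2[2] - P2[0],
            uv2[1] * P2[2] - P2[1],
        ])
        _, _, vt = np.linalg.svd(A)   # least-squares solution: last row of V^T
        X = vt[-1]
        return X[:3] / X[3]           # back to Euclidean coordinates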
Abstract:
Consider the statement "this project should cost X and has a risk of Y". Such statements are used daily in industry as the basis for making decisions. The work reported here is part of a study aimed at providing a rational and pragmatic basis for such statements. Of particular interest are predictions made in the requirements and early phases of projects. A preliminary model has been constructed using Bayesian Belief Networks and, in support of this, a programme to collect and study data during the execution of various software development projects commenced in May 2002. The data collection programme is undertaken under the constraints of a commercial industrial regime of multiple concurrent small to medium scale software development projects. Guided by pragmatism, the work is predicated on the use of data that can be collected readily by project managers, including expert judgements, effort, elapsed times and metrics collected within each project.
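As an illustration of the modelling approach (not the study's actual network), a toy Bayesian Belief Network for project cost risk can be written with the open-source pgmpy library. The variables, structure and probabilities below are invented for the sketch.

    # Toy BBN: project complexity influences effort, effort influences
    # cost overrun. All names and numbers are illustrative assumptions.
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    model = BayesianNetwork([("Complexity", "Effort"),
                             ("Effort", "CostOverrun")])
    model.add_cpds(
        TabularCPD("Complexity", 2, [[0.7], [0.3]]),          # P(low)=0.7
        TabularCPD("Effort", 2, [[0.8, 0.3], [0.2, 0.7]],
                   evidence=["Complexity"], evidence_card=[2]),
        TabularCPD("CostOverrun", 2, [[0.9, 0.4], [0.1, 0.6]],
                   evidence=["Effort"], evidence_card=[2]),
    )
    assert model.check_model()

    # Query: probability of a cost overrun given high complexity.
    print(VariableElimination(model).query(["CostOverrun"],
                                           evidence={"Complexity": 1}))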