935 results for Enterprise application integration (Computer systems)
Abstract:
This undergraduate thesis aims to identify the usefulness of community strategic relationships and marketing in the management of business with corporate clients. It also draws on concepts such as organizational and relationship marketing, which help the investigation determine strategic relationships between companies and the benefit these generate for corporations, in order to encourage the adoption of such strategies by companies nationally and internationally. It likewise identifies the concept of community held by corporate clients and how this concept can be adapted to the environment that surrounds them. To understand the functions, characteristics, and behaviour of a corporate client, the specific objectives of the research are to describe marketing strategies in the management of business with corporate clients, to determine whether the concept of community exists in that setting, and to determine whether community strategic relationships are used there. The methodology proposed was theoretical-conceptual, considering marketing and the community strategic relationships of corporate clients. Bringing the research into the field of management and leadership, the results obtained will help strengthen the direction of companies by assessing the real usefulness of strategies based on community relationships and marketing in business with corporate clients. Community strategies and marketing directly influence companies' relationships with their corporate clients, because marketing allows both parties to extend the relationship and generate future value.
The research concludes that companies that manage to build community strategies and close relationships with one another tend to earn better profits in the long term and to be more sustainable.
Abstract:
This thesis seeks to explain the implementation of the Catholic Reformation in a series of rural parishes of the dioceses of Girona (the valleys of Ridaura, Bas, Hostoles and Amer) and Vic (El Collsacabra and the valleys of Susqueda and Sau) between 1587 and 1800, from the post-Tridentine bishops Jaume Caçador and Pedro Jaime to the Enlightenment bishops Tomàs de Lorenzana and Francisco de Veyan. The principal documentation is the series of pastoral visitation records preserved in the Arxiu Diocesà de Girona and the Arxiu Episcopal de Vic; in parallel, this has been reinforced with parish documentation (sacramental registers, consuetudinaries, fabric and confraternity books), notarial protocols (the notaries of Rupit, Sant Feliu de Pallerols, El Mallol and Amer) and episcopal printed texts. The ordinances issued in the pastoral visitations have been contrasted, noting similarities and differences, with the decrees of the Council of Trent, the Tarraconense provincial constitutions, the Girona and Vic synodal constitutions, and the artistic, architectural and archaeological evidence. Together these sources demonstrate the slowness of the implantation of the Tridentine programme, which in fact was achieved only with considerable delay, well into the eighteenth century.
Abstract:
One of the main challenges for developers of new human-computer interfaces is to provide a more natural way of interacting with computer systems, avoiding excessive use of hand and finger movements. In this way, a valuable alternative communication pathway is also provided to people suffering from motor disabilities. This paper describes the construction of a low-cost eye tracker using a fixed-head setup: a webcam, a laptop, and an infrared light source were used together with a simple frame to fix the head of the user. Furthermore, detailed information is given on the various image processing techniques used to isolate the centre of the pupil, and different methods to calculate the point of gaze are discussed. An overall accuracy of 1.5 degrees was obtained while keeping the hardware cost of the device below 100 euros.
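The pupil-filtering step this abstract refers to can be illustrated with a minimal threshold-and-centroid sketch (a toy stand-in, not the authors' actual pipeline; real trackers add smoothing, blob selection, and corneal-reflection handling):

```python
def pupil_centre(image, threshold=40):
    """Estimate the pupil centre in a grayscale image (a list of rows of
    0-255 intensities) by averaging the coordinates of all dark pixels.
    Returns (x, y) or None if no pixel falls below the threshold."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value < threshold:  # the pupil is the darkest region
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Synthetic 5x5 frame: bright background with a dark 2x2 "pupil"
frame = [[200] * 5 for _ in range(5)]
for y in (1, 2):
    for x in (2, 3):
        frame[y][x] = 10
print(pupil_centre(frame))  # → (2.5, 1.5), the centroid of the dark block
```

Under infrared illumination the pupil appears as a uniformly dark disc, which is why a fixed intensity threshold followed by a centroid is a workable first approximation.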
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
Abstract:
As the ideal method of assessing the nutritive value of a feedstuff, namely offering it to the appropriate class of animal and recording the production response obtained, is neither practical nor cost-effective, a range of feed evaluation techniques has been developed. Each of these balances some degree of compromise with the practical situation against data generation. However, owing to the impact of animal-feed interactions over and above that of feed composition, the target animal remains the ultimate arbiter of nutritional value. In this review, current in vitro feed evaluation techniques are examined according to their degree of animal-feed interaction. Chemical analysis provides absolute values and therefore differs from the majority of in vitro methods, which simply rank feeds. However, with no host animal involvement, estimates of nutritional value are inferred by statistical association; in addition, given the costs involved, the practical value of many of the analyses conducted should be reviewed. The in sacco technique has made a substantial contribution both to understanding rumen microbial degradative processes and to the rapid evaluation of feeds, especially in developing countries. However, the numerous shortfalls of the technique, common to many in vitro methods, the desire to eliminate the use of surgically modified animals for routine feed evaluation, and parallel improvements in in vitro techniques mean that it will increasingly be replaced. The majority of in vitro systems use substrate disappearance to assess degradation; however, this provides no information on the quantity of derived end-products available to the host animal. As measurement of volatile fatty acids or microbial biomass production greatly increases analytical costs, fermentation gas release, a simple and non-destructive measurement, has been used as an alternative.
However, as gas release alone is of little use, gas-based systems, in which degradation and fermentation gas release are measured simultaneously, are attracting considerable interest. Alternative microbial inocula are being considered, as is the potential of multi-enzyme systems for examining degradation dynamics. It is concluded that while chemical analysis will continue to form an indispensable part of feed evaluation, increasing use will be made of more complex in vitro systems. It is vital, however, that the function and limitations of each methodology are fully understood, and that the temptation to over-interpret the data is avoided, so that the appropriate conclusions are drawn. With careful selection and correct application, in vitro systems offer powerful research tools with which to evaluate feedstuffs. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
Resource monitoring in distributed systems is required to understand the 'health' of the overall system and to help identify particular problems, such as dysfunctional hardware or faulty system or application software. Monitoring systems such as GridRM provide the ability to connect to any number of different types of monitoring agents and to provide different views of the system, based on a client's particular preferences. Web 2.0 technologies, and in particular 'mashups', are emerging as a promising technique for rapidly constructing rich user interfaces that combine and present data in intuitive ways. This paper describes a Web 2.0 user interface that was created to expose resource data harvested by the GridRM resource monitoring system.
Abstract:
Many scientific and engineering applications involve inverting large matrices or solving systems of linear algebraic equations. Solving these problems with proven algorithms for direct methods can take a very long time, as their cost depends on the size of the matrix. The computational complexity of stochastic Monte Carlo methods, by contrast, depends only on the number of chains and the length of those chains. The computing power needed by these inherently parallel Monte Carlo methods can be supplied very efficiently by distributed computing technologies such as Grid computing. In this paper we show how a load-balanced Monte Carlo method for computing the inverse of a dense matrix can be constructed, how the method can be implemented on the Grid, and how efficiently the method scales on multiple processors. (C) 2007 Elsevier B.V. All rights reserved.
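The underlying idea, estimating entries of an inverse from many independent random chains, can be sketched for the Neumann-series setting A = I - C with ||C|| < 1 (a hypothetical minimal version; the paper's load-balanced Grid implementation is far more elaborate):

```python
import random

def mc_inverse_entry(C, i, j, chains=20000, stop_prob=0.5, seed=0):
    """Monte Carlo estimate of (A^-1)[i][j] for A = I - C, based on the
    Neumann series A^-1 = I + C + C^2 + ...  Each chain is a random walk
    over row indices whose running weight is an unbiased estimator of the
    series terms; chains are independent, hence trivially parallel."""
    rng = random.Random(seed)
    n = len(C)
    move_prob = (1.0 - stop_prob) / n   # uniform transition probability
    total = 0.0
    for _ in range(chains):
        state, weight = i, 1.0
        while True:
            if state == j:
                total += weight          # visit contributes to (C^k)[i][j]
            if rng.random() < stop_prob:
                break                    # chain terminates
            nxt = rng.randrange(n)
            weight *= C[state][nxt] / move_prob  # importance weight
            state = nxt
    return total / chains

C = [[0.2, 0.1],
     [0.0, 0.3]]   # A = I - C = [[0.8, -0.1], [0.0, 0.7]]
est = mc_inverse_entry(C, 0, 0)
# exact (A^-1)[0][0] = 0.7 / 0.56 = 1.25; the estimate converges to it
```

Because each entry estimate depends only on its own chains, the work divides naturally across Grid nodes, which is the property the paper exploits.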
Abstract:
Monitoring Earth's terrestrial water conditions is critically important to many hydrological applications such as global food production; assessing water resources sustainability; and flood, drought, and climate change prediction. These needs have motivated the development of pilot monitoring and prediction systems for terrestrial hydrologic and vegetative states, but to date only at rather coarse spatial resolutions (∼10–100 km) over continental to global domains. Adequately addressing critical water cycle science questions and applications requires systems that are implemented globally at much higher resolutions, on the order of 1 km, resolutions referred to as hyperresolution in the context of global land surface models. This opinion paper sets forth the needs and benefits for a system that would monitor and predict the Earth's terrestrial water, energy, and biogeochemical cycles. We discuss six major challenges in developing such a system: improved representation of surface-subsurface interactions due to fine-scale topography and vegetation; improved representation of land-atmospheric interactions and resulting spatial information on soil moisture and evapotranspiration; inclusion of water quality as part of the biogeochemical cycle; representation of human impacts from water management; utilizing massively parallel computer systems and recent computational advances to solve hyperresolution models that will have up to 10⁹ unknowns; and developing the required in situ and remote sensing global data sets. We deem the development of a global hyperresolution model for monitoring the terrestrial water, energy, and biogeochemical cycles a “grand challenge” to the community, and we call upon the international hydrologic community and the hydrological science support infrastructure to endorse the effort.
Abstract:
Aim: To determine the prevalence and nature of prescribing errors in general practice, to explore their causes, and to identify defences against error. Methods: (1) systematic reviews; (2) a retrospective review of unique medication items prescribed over a 12-month period to a 2% sample of patients from 15 general practices in England; (3) interviews with 34 prescribers regarding 70 potential errors, 15 root cause analyses, and six focus groups involving 46 primary health care team members. Results: The study involved examination of 6,048 unique prescription items for 1,777 patients. Prescribing or monitoring errors were detected for one in eight patients, involving around one in 20 of all prescription items. The vast majority of the errors were of mild to moderate severity, with one in 550 items being associated with a severe error. The following factors were associated with an increased risk of prescribing or monitoring errors: male gender, age less than 15 years or greater than 64 years, number of unique medication items prescribed, and being prescribed preparations in the following therapeutic areas: cardiovascular, infections, malignant disease and immunosuppression, musculoskeletal, eye, ENT and skin. Prescribing or monitoring errors were not associated with the grade of GP or with whether prescriptions were issued as acute or repeat items. A wide range of underlying causes of error were identified, relating to the prescriber, the patient, the team, the working environment, the task, the computer system and the primary/secondary care interface. Many defences against error were also identified, including strategies employed by individual prescribers and primary care teams, and making best use of health information technology. Conclusion: Prescribing errors in general practices are common, although severe errors are unusual. Many factors increase the risk of error.
Strategies for reducing the prevalence of error should focus on GP training, continuing professional development for GPs, clinical governance, effective use of clinical computer systems, and improving safety systems within general practices and at the interface with secondary care.
Abstract:
The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform data mining and other analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data that is used to populate the second component, and a data warehouse that contains important molecular properties. These properties may be used for data mining studies. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: firstly, how grid data management technologies can be used to access the distributed data warehouses; and secondly, how the grid can be used to transfer analysis programs to the primary repositories — this is an important and challenging aspect of P-found, due to the large data volumes involved and the desire of scientists to maintain control of their own data. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling scientific discovery.
Abstract:
Objective. This study was designed to determine the precision and accuracy of angular measurements obtained from three-dimensional computed tomography (3D-CT) volume rendering by computer systems. Study design. The study population consisted of 28 dried skulls that were scanned with a 64-row multislice CT, from which 3D-CT images were generated. Angular measurements (n = 6), based upon conventional craniometric anatomical landmarks (n = 9), were identified independently in the 3D-CT images by 2 radiologists, twice each, and were then performed by 3D-CT imaging. Subsequently, physical measurements were made by a third examiner using a Beyond Crysta-C9168 series 900 device. Results. The results demonstrated no statistically significant difference between interexaminer and intraexaminer analyses. The mean difference between the physical and 3D-based angular measurements was -1.18% and -0.89% for the two examiners, respectively, demonstrating high accuracy. Conclusion. Maxillofacial analysis of angular measurements using 3D-CT volume rendering from 64-row multislice CT is established and can be used for orthodontic and dentofacial orthopedic applications.
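Each angular measurement of this kind reduces to the angle at a vertex landmark between two 3-D vectors; a minimal sketch follows (the landmark names and coordinates are invented for illustration, not taken from the study):

```python
import math

def angle_at(b, a, c):
    """Angle in degrees at landmark b, formed by 3-D points a-b-c,
    computed from the dot product of the vectors b->a and b->c."""
    u = [a[k] - b[k] for k in range(3)]
    v = [c[k] - b[k] for k in range(3)]
    dot = sum(u[k] * v[k] for k in range(3))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Hypothetical landmark coordinates (mm) picked from a rendered volume
nasion = (0.0, 0.0, 0.0)
sella = (10.0, 0.0, 0.0)
basion = (10.0, 10.0, 0.0)
print(angle_at(sella, nasion, basion))  # → 90.0 for this right angle
```

Precision in such a study then amounts to how reproducibly the two radiologists pick the same (x, y, z) for each landmark, since the angle computation itself is exact.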
Abstract:
This report presents an exercise in control engineering: DC motor speed is controlled by three controllers, PID, pole placement, and fuzzy, and the advantages and disadvantages of each are discussed under loaded and unloaded conditions using Matlab. The brushless series-wound DC motor is very popular in industrial applications and control systems because of its high torque density, high efficiency, and small size. First, suitable equations are developed for the DC motor. A PID controller is then developed and tuned to obtain a faster step response; the simulation results are very good, and the controller is further tuned to reduce the overshoot that is common in PID controllers. It is argued that in industrial environments these controllers are preferable to others, as PID controllers are easy to tune and cheap. The pole placement controller is a classic example of control engineering; the addition of an integrator reduced noise disturbances, making it a good choice for industrial applications. The fuzzy controller is introduced with a DC chopper to make the speed control smooth, and almost no steady-state error is observed. The simulations of the three controllers are compared, and it is concluded that the fuzzy controller outperforms the PID controller in terms of steady-state error and smoothness of the step response, while the pole placement controller is unmatched in terms of control, since the designer can shape the step response to the nature of the control system; pole locations change the step response, in the sense that poles near the origin give a fast motor step response.
Finally, a GUI for the three controllers is developed, allowing the user to select any controller and change its parameters according to the situation.
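The discrete PID loop that the report tunes in Matlab can be sketched in a few lines against a toy first-order motor model (the gains and the model below are invented for illustration, not taken from the report):

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    """Step response of a toy first-order 'motor' speed model driven by
    a discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    speed, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - speed
        integral += err * dt                  # accumulated error
        deriv = (err - prev_err) / dt         # finite-difference derivative
        prev_err = err
        u = kp * err + ki * integral + kd * deriv
        # toy motor: speed decays (friction) and follows the drive signal
        speed += dt * (-speed + u)
    return speed

final = simulate_pid(kp=2.0, ki=1.0, kd=0.05)
# the integral term removes steady-state error, so final speed -> 1.0
```

The integral term is what drives the steady-state error to zero here, which is the property the abstract compares across the three controllers.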
Abstract:
This research aimed to study the cost systems of private educational institutions with respect to their structure, operation, and use in decision-making, in light of the literature on cost accounting. To that end, a bibliographic survey was carried out and three private educational institutions were studied; the case-study method was chosen because it allows greater richness of detail about the systems found. The results made it possible to analyze the types of systems in place with regard to the nature of the costs, the form of cost accumulation, the costing methods, and the use of the systems for managerial purposes. Finally, by comparing the results obtained with the literature, some important conclusions were reached, new research was suggested, and some recommendations were made.
Abstract:
The objective of this study was to explore the relationship between the literature on EDP (electronic data processing) auditing procedures and the procedures actually used by the six accounting audit firms in Brazil. It sought to identify the differences between the EDP auditing procedures used by Brazilian firms and those used by firms of foreign origin (Chapter I). The literature review presents current knowledge about external auditing in companies that use complex computer systems, along with the foreseeable prospects for the future (Chapter II). Next, the methodology is presented, with justification of the reasons for its use in this type of exploratory study (Chapter III). Interviews based on a questionnaire containing mostly open questions made it possible to describe the external auditing procedures employed by the audit firms at clients that use computers (Chapter IV). The results obtained allowed an analysis of the EDP auditing procedures used by the six audit firms surveyed (Chapter V). Finally, relating the results to the existing literature, conclusions are presented, recommendations are formulated, and new studies are suggested (Chapter VI).