19 results for Formal Methods. Component-Based Development. Competition. Model Checking
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Chronic graft-versus-host disease (cGvHD) is the leading cause of late nonrelapse mortality (transplant-related mortality) after hematopoietic stem cell transplant. Given that there is a wide range of treatment options for cGvHD, assessment of the associated costs and efficacy can help clinicians and health care providers allocate health care resources more efficiently. OBJECTIVE: The purpose of this study was to assess the cost-effectiveness of extracorporeal photopheresis (ECP) compared with rituximab (Rmb) and with imatinib (Imt) in patients with cGvHD at 5 years from the perspective of the Spanish National Health System. METHODS: The model assessed the incremental cost-effectiveness/utility ratio of ECP versus Rmb or Imt for 1000 hypothetical patients by using microsimulation cost-effectiveness techniques. Model probabilities were obtained from the literature. Treatment pathways and adverse events were evaluated taking clinical opinion and published reports into consideration. Local data on costs (2010 Euros) and health care resource utilization were validated by the clinical authors. Probabilistic sensitivity analyses were used to assess the robustness of the model. RESULTS: The greater efficacy of ECP resulted in a gain of 0.011 to 0.024 quality-adjusted life-year in the first year and 0.062 to 0.094 at year 5 compared with Rmb or Imt. The results showed that the higher acquisition cost of ECP versus Imt was compensated for at 9 months by greater efficacy; this higher cost was partially compensated for (€517) by year 5 versus Rmb. After 9 months, ECP was dominant (cheaper and more effective) compared with Imt. The incremental cost-effectiveness ratio of ECP versus Rmb was €29,646 per life-year gained and €24,442 per quality-adjusted life-year gained at year 2.5. Probabilistic sensitivity analysis confirmed the results. The main study limitation was that, to assess relative treatment effects, only small studies were available for indirect comparison. CONCLUSION: ECP as a third-line therapy for cGvHD is a more cost-effective strategy than Rmb or Imt.
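The headline figures follow from the standard incremental cost-effectiveness arithmetic. A minimal sketch (Python; the cost and effect values are illustrative placeholders, not the study's inputs):

def icer(cost_a, effect_a, cost_b, effect_b):
    # Incremental cost-effectiveness ratio of strategy A versus B:
    # extra cost per extra unit of effect (life-year or QALY gained).
    d_cost = cost_a - cost_b
    d_effect = effect_a - effect_b
    if d_effect > 0 and d_cost <= 0:
        return "A dominates B (cheaper and more effective)"
    return d_cost / d_effect

# Illustrative placeholder values only (not taken from the study):
print(icer(60_000, 2.5, 55_000, 2.3))  # -> 25000.0 euros per QALY gained
print(icer(50_000, 2.5, 55_000, 2.3))  # -> 'A dominates B (cheaper and more effective)'

The second case corresponds to the "cheaper and more effective" situation the abstract reports for ECP versus Imt after 9 months.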
Abstract:
The design complexity of mobile agents grows as their functionality increases. This project proposes approaching the problem from a modular point of view. A study has been carried out both of the agents themselves and of the parts that make them up. Likewise, the mechanisms needed to enable secure communications between agents have been established and implemented. Finally, two components have been developed that provide the functionality of tracking the mobile agent and retrieving the results it generates. Component-based agent development applies the old "divide and conquer" strategy to the design phase, thereby reducing its considerable complexity.
Abstract:
Research report based on a stay at the Department of Computer and Information Science of the Norwegian University of Science and Technology (NTNU), Norway, between September and December 2006. The use of software components known as Commercial-Off-The-Shelf (COTS) in component-based systems development poses several challenges. One of them is the lack of adequate, available information to support the process of selecting the components to be integrated. To deal with these problems, a thesis is being developed that proposes a method called GOThIC (Goal-Oriented Taxonomy and reuse Infrastructure Construction). The method is aimed at building a reuse infrastructure to ease the search for and reuse of COTS components. The stay at NTNU reported in this document had as its main goal the improvement of the method and the gathering of empirical data to support it. Among the main results were empirical data grounding the use of the method in industrial COTS component selection settings, as well as a new strategy to achieve, in a feasible and incremental way, the federation and reuse of the various existing efforts to find, select and maintain COTS and Open Source (OSS) components (commonly called Off-The-Shelf (OTS) components) in a structured manner.
Abstract:
We analyze recent contributions to growth theory based on the expanding-variety model of Romer (1990). In the first part, we present different versions of the benchmark linear model with imperfect competition. These include the lab-equipment model, labor-for-intermediates, and directed technical change. We review applications of the expanding-variety framework to the analysis of international technology diffusion, trade, cross-country productivity differences, financial development and fluctuations. In many such applications, a key role is played by complementarities in the process of innovation.
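For orientation, the core of the expanding-variety framework is a final-good technology whose productivity grows with the measure N of available intermediate inputs. A standard textbook sketch (generic benchmark notation, not necessarily the survey's):

Y = L^{1-\alpha} \int_0^{N} x(i)^{\alpha} \, di, \qquad 0 < \alpha < 1

With symmetric input use x(i) = x and total intermediate spending X = Nx, this becomes Y = N^{1-\alpha} L^{1-\alpha} X^{\alpha}, so a larger variety N raises output for given resources; innovation expands N, and the complementarities mentioned above operate through this channel.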
Abstract:
Two concentration methods for fast and routine determination of caffeine (using HPLC-UV detection) in surface and wastewater are evaluated. Both methods are based on solid-phase extraction (SPE) concentration with octadecyl silica sorbents. A common "offline" SPE procedure shows that quantitative recovery of caffeine is obtained with 2 mL of a methanol-water elution mixture containing at least 60% methanol. The method detection limit is 0.1 μg L−1 when percolating 1 L samples through the cartridge. The development of an "online" SPE method based on a mini-SPE column, containing 100 mg of the same sorbent, directly connected to the HPLC system allows the method detection limit to be decreased to 10 ng L−1 with a sample volume of 100 mL. The "offline" SPE method is applied to the analysis of caffeine in wastewater samples, whereas the "online" method is used for analysis in natural waters from streams receiving significant water intakes from local wastewater treatment plants.
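The two detection limits are consistent with simple preconcentration arithmetic. A quick check (Python; assuming, as the recovery experiments indicate, essentially quantitative retention and elution):

offline_mdl_ng_per_l = 0.1 * 1000   # 0.1 ug/L with a 1 L sample = 100 ng/L
online_mdl_ng_per_l = 10            # 10 ng/L with only a 100 mL sample
print(offline_mdl_ng_per_l / online_mdl_ng_per_l)  # -> 10.0

The online coupling thus lowers the detection limit tenfold while using ten times less sample, because the whole preconcentrated fraction reaches the detector instead of an injected aliquot of the 2 mL offline extract.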
Abstract:
We use a dynamic monopolistic competition model to show that an economy that inherits a small range of specialized inputs can be trapped in a lower stage of development. The limited availability of specialized inputs forces the final goods producers to use a labor-intensive technology, which in turn implies a small inducement to introduce new intermediate inputs. The start-up costs, which make the intermediate inputs producers subject to dynamic increasing returns, and pecuniary externalities that result from the factor substitution in the final goods sector, play essential roles in the model.
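The trap mechanism can be pictured with a stylized technology choice (a sketch under generic monopolistic-competition assumptions, not the paper's exact specification):

Y = \max \left\{ A L, \; L^{1-\alpha} \int_0^{N} x(i)^{\alpha} \, di \right\}

When the inherited variety N is small, the labor-intensive branch AL dominates, demand for intermediates is weak, and start-up costs deter entry by new input producers, so N stays small; a large inherited N reverses every step of this loop, which is why otherwise identical economies can settle at different stages of development.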
Abstract:
Many workers believe that personal contacts are crucial for obtaining jobs in high-wage sectors. On the other hand, firms in high-wage sectors report using employee referrals because they help provide screening and monitoring of new employees. This paper develops a matching model that can explain the link between inter-industry wage differentials and use of employee referrals. Referrals lower monitoring costs because high-effort referees can exert peer pressure on co-workers, allowing firms to pay lower efficiency wages. On the other hand, informal search provides fewer job and applicant contacts than formal methods (e.g., newspaper ads). In equilibrium, the matching process generates segmentation in the labor market because of heterogeneity in the size of referral networks. Referrals match good high-paying jobs to well-connected workers, while formal methods match less attractive jobs to less-connected workers. Industry-level data show a positive correlation between industry wage premia and use of employee referrals. Moreover, evidence using the NLSY shows similar positive and significant OLS and fixed-effects estimates of the returns to employee referrals, but insignificant effects once sector of employment is controlled for. This evidence suggests referred workers earn higher wages not because of higher unobserved ability or better matches but rather because they are hired in high-wage sectors.
Abstract:
This paper investigates the role of employee referrals in the labor market. Using an original data set, I find that industries that pay wage premia and have characteristics associated with high-wage sectors rely mainly on employee referrals to fill jobs. Moreover, unemployment rates are higher in industries which use employee referrals more extensively. This paper develops an equilibrium matching model which can explain these empirical regularities. In this model, the matching process sorts heterogeneous firms and workers into two distinct groups: referrals match "good" jobs to "good" workers, while formal methods (e.g., newspaper ads and employment agencies) match less-attractive jobs to disadvantaged workers. Thus, well-connected workers who learn quickly about job opportunities use referrals to jump job queues, while those who are less well placed in the labor market search for jobs through formal methods. The split of firms and workers between referrals and formal search is, however, not necessarily efficient. Congestion externalities in referral search imply that unemployment would be closer to the optimal rate if firms and workers 'at the margin' searched formally.
Abstract:
We construct a new family of semi-discrete numerical schemes for the approximation of the one-dimensional periodic Vlasov-Poisson system. The methods are based on coupling a discontinuous Galerkin approximation of the Vlasov equation with several finite element (conforming, non-conforming and mixed) approximations of the Poisson problem. We show optimal error estimates for all the proposed methods in the case of smooth, compactly supported initial data. The issue of energy conservation is also analyzed for some of the methods.
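For reference, the system being discretized is, up to sign conventions tied to the particle charge, the normalized 1D periodic Vlasov-Poisson system:

\partial_t f + v \, \partial_x f + E(x,t) \, \partial_v f = 0, \qquad
\partial_x E = \rho(x,t) - 1, \qquad
\rho(x,t) = \int_{\mathbb{R}} f(x,v,t) \, dv

with f the phase-space distribution function, periodic boundary conditions in x, and a neutralizing background density normalized to one. The discontinuous Galerkin scheme approximates the transport equation for f, while the conforming, non-conforming or mixed finite elements handle the field equation.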
Abstract:
The project consists of the study and evaluation of different alternatives available on the market, and of the analysis and development of a set of components constituting a framework to simplify and speed up the development of the presentation layer for the thin-client applications of a given Framework, developed on the J2EE platform and based on the Model-View-Controller design pattern.
Abstract:
One of the unresolved questions of modern physics is the nature of Dark Matter. Strong experimental evidence suggests that the presence of this elusive component in the energy budget of the Universe is quite significant, without, however, providing conclusive information about its nature. The most plausible scenario is that of weakly interacting massive particles (WIMPs), which includes a large class of non-baryonic Dark Matter candidates with a mass typically between a few tens of GeV and a few TeV, and a cross-section of the order of that of the weak interactions. The search for Dark Matter particles using very high energy gamma-ray Cherenkov telescopes is based on the premise that WIMPs can self-annihilate, leading to the production of detectable species, such as photons. These photons are very energetic and, since they are not deflected by the Universe's magnetic fields, they can be traced straight back to the source of their creation. The downside of the approach is the great amount of background radiation, coming from conventional astrophysical objects, that usually hides clear signals of Dark Matter particle interactions. That is why a good choice of observational candidates is the crucial factor in the search for Dark Matter. With MAGIC (Major Atmospheric Gamma-ray Imaging Cherenkov Telescopes), a two-telescope ground-based system located on La Palma, Canary Islands, we choose objects such as dwarf spheroidal satellite galaxies of the Milky Way and galaxy clusters for our search. Our idea is to increase the chances of WIMP detection by pointing at objects that are relatively close, contain a great amount of Dark Matter, and suffer as little pollution from stars as possible. At the moment, several observation projects are ongoing and analyses are being performed.
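The observable behind such searches is usually written in the standard factorized form for the annihilation gamma-ray flux (a textbook expression, not specific to the MAGIC analyses mentioned here):

\frac{d\Phi_\gamma}{dE} = \frac{1}{4\pi} \frac{\langle \sigma v \rangle}{2 m_\chi^2} \frac{dN_\gamma}{dE} \times J, \qquad
J = \int_{\Delta\Omega} \int_{\mathrm{l.o.s.}} \rho_\chi^2(l, \Omega) \, dl \, d\Omega

where \langle \sigma v \rangle is the velocity-averaged annihilation cross-section, m_\chi the WIMP mass, dN_\gamma/dE the photon yield per annihilation, and \rho_\chi the Dark Matter density integrated along the line of sight. The selection criteria above (nearby, Dark Matter-dominated, little stellar contamination) are exactly those that maximize the astrophysical J-factor while keeping the background low.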
Abstract:
Uncertainties that are not considered in the analytical model of the plant dramatically decrease the performance of the fault detection task in practice. To cope better with this prevalent problem, in this paper we develop a methodology using Modal Interval Analysis which takes those uncertainties in the plant model into account. A fault detection method based on this model is developed which is quite robust to uncertainty and produces no false alarms. As soon as a fault is detected, an ANFIS model is trained online to capture the major behavior of the fault that has occurred, which can then be used for fault accommodation. The simulation results clearly demonstrate the capability of the proposed method to accomplish both tasks appropriately.
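The detection logic of such envelope-based methods can be pictured with a minimal sketch (Python; the interval bounds are illustrative placeholders, whereas in the actual methodology Modal Interval Analysis propagates the model's parameter uncertainty to produce them):

def detect_fault(measurement, y_lower, y_upper):
    # The interval [y_lower, y_upper] envelopes every output consistent
    # with the uncertain plant model; a measurement inside it never
    # raises an alarm, which is what keeps false alarms at zero.
    return not (y_lower <= measurement <= y_upper)

print(detect_fault(0.95, y_lower=0.8, y_upper=1.1))  # False: consistent, no fault
print(detect_fault(1.30, y_lower=0.8, y_upper=1.1))  # True: fault detected

Only once this test fires would the online ANFIS training described above be started to model the faulty behavior.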
Abstract:
Sickness absence (SA) is an important social, economic and public health issue. Identifying and understanding the determinants, whether biological, regulatory or health services-related, of variability in SA duration is essential for better management of SA. The conditional frailty model (CFM) is useful when repeated SA events occur within the same individual, as it allows simultaneous analysis of event dependence and of heterogeneity due to unknown, unmeasured or unmeasurable factors. However, its use may encounter computational limitations when applied to very large data sets, as may frequently occur in the analysis of SA duration. To overcome the computational issue, we propose a Poisson-based conditional frailty model (CFPM) for repeated SA events that accounts for both event dependence and heterogeneity. To demonstrate the usefulness of the proposed model in the SA duration context, we used data from all non-work-related SA episodes that occurred in Catalonia (Spain) in 2007, initiated by either a diagnosis of neoplasms or of mental and behavioral disorders. As expected, the CFPM results were very similar to those of the CFM for both diagnosis groups, while the CPU time for the CFPM was substantially shorter than for the CFM. The CFPM is a suitable alternative to the CFM in survival analysis with recurrent events, especially with large databases.
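The computational gain rests on the classical equivalence between piecewise-exponential survival likelihoods and Poisson likelihoods. In generic notation (a sketch assuming a multiplicative frailty; the paper's exact parameterization may differ), the hazard for individual i in episode k is

\lambda_{ik}(t) = u_i \, \lambda_{0k}(t) \, \exp(\mathbf{x}_{ik}' \boldsymbol{\beta})

where stratifying the baseline hazard \lambda_{0k} by episode number captures event dependence, and the frailty u_i captures unobserved heterogeneity. With a piecewise-constant baseline, each (individual, episode, interval) record with event indicator d and time at risk t contributes a term proportional to a Poisson likelihood with mean \mu = t \lambda, so the model can be fitted as a Poisson regression with offset \log t, which scales far better on databases of this size.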
Abstract:
Quality management has become a strategic issue for organisations and is very valuable for producing quality software. However, quality management systems (QMS) are not easy to implement and maintain. The authors' experience shows the benefits of developing a QMS by first formalising it using semantic web ontologies and then putting them into practice through a semantic wiki. The QMS ontology that has been developed captures the core concepts of a traditional QMS and combines them with concepts coming from the MPIu+a development process model, which is geared towards obtaining usable and accessible software products. The ontology semantics is then directly put into play by a semantics-aware tool, the Semantic MediaWiki. The developed QMS tool has been in use for 2 years by the GRIHO research group, where it has managed almost 50 software development projects while taking quality management issues into account. It has also been externally audited by a quality certification organisation. Its users are very satisfied with their daily work with the tool, which manages all the documents created during project development and also allows them to collaborate, thanks to the wiki features.
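To make the formalisation step concrete, here is a minimal sketch of encoding a couple of QMS concepts as an ontology (Python with rdflib; the namespace and class names are hypothetical illustrations, not the actual GRIHO ontology):

from rdflib import Graph, Namespace, RDF, RDFS, Literal

QMS = Namespace("http://example.org/qms#")  # hypothetical namespace
g = Graph()
g.bind("qms", QMS)

# Core QMS concepts as ontology classes (illustrative names only)
g.add((QMS.QualityProcedure, RDF.type, RDFS.Class))
g.add((QMS.ProjectDocument, RDF.type, RDFS.Class))
g.add((QMS.QualityProcedure, RDFS.label, Literal("Quality procedure")))

# A property linking project documents to the procedure they evidence
g.add((QMS.evidences, RDF.type, RDF.Property))
g.add((QMS.evidences, RDFS.domain, QMS.ProjectDocument))
g.add((QMS.evidences, RDFS.range, QMS.QualityProcedure))

print(g.serialize(format="turtle"))  # Turtle output, ready for a wiki import pipeline

Semantic MediaWiki then lets wiki pages be annotated against such a vocabulary, which is how the ontology semantics is put into play during day-to-day project work.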
Abstract:
Awareness is required to support all forms of cooperation. In Computer Supported Collaborative Learning (CSCL), awareness can be used to enhance collaborative opportunities across physical distances and in computer-mediated environments. Shared Knowledge Awareness (SKA) aims to increase students' perception of the knowledge shared in a collaborative learning scenario, and also concerns the understanding the group has of it. However, it is very difficult to produce accurate awareness indicators based on informal message exchange among the participants. Therefore, we propose a semantic system for cooperation that makes use of formal methods for knowledge representation based on semantic web technologies. From this semantics-enhanced repository and its messages, it should be easier to compute more accurate awareness indicators.
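One way a formal repository enables such indicators is by reducing shared-knowledge awareness to the overlap between the concept sets each participant has engaged with. A minimal sketch (Python; the Jaccard-style metric and the concept sets are illustrative assumptions, not the system proposed above):

def shared_knowledge_indicator(concepts_by_student):
    # Ratio of concepts common to the whole group to all concepts touched:
    # 1.0 means fully shared knowledge, 0.0 means none shared.
    sets = list(concepts_by_student.values())
    common = set.intersection(*sets)
    total = set.union(*sets)
    return len(common) / len(total) if total else 0.0

group = {
    "student_a": {"ontology", "rdf", "wiki"},
    "student_b": {"ontology", "rdf"},
    "student_c": {"rdf", "sparql"},
}
print(shared_knowledge_indicator(group))  # -> 0.25 (1 shared concept out of 4)

Because the repository stores semantically annotated contributions rather than free-text messages, such concept sets can be extracted reliably, which is the accuracy gain the abstract argues for.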