983 results for Reasonable profits
Abstract:
Medicine counterfeiting is a serious worldwide issue, involving networks of manufacture and distribution that are an integral part of industrialized organized crime. Despite the potentially devastating health repercussions involved, legal sanctions are often inappropriate or simply not applied. The difficulty of agreeing on a definition of counterfeiting, the huge profits made by counterfeiters and the complexity of the market are the other main reasons for the extent of the phenomenon. Above all, international cooperation is needed to thwart the spread of counterfeiting. Moreover, effort is urgently required at the legal, enforcement and scientific levels. Pharmaceutical companies and agencies have developed measures to protect medicines and to allow fast and reliable analysis of suspect products. Several means, essentially based on chromatography and spectroscopy, are now at analysts' disposal to distinguish between genuine and counterfeit products. However, the determination of the components and the use of analytical data for forensic purposes still constitute a challenge. The aim of this review article is therefore to set out the intricacies of medicine counterfeiting so that a better understanding can yield solutions for fighting it more efficiently.
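To make the spectroscopic screening step concrete: distinguishing a genuine from a counterfeit product often reduces, at the screening stage, to comparing a suspect spectrum against a genuine reference. The sketch below does this with cosine similarity; it is our illustration only, not a method from the review, and the toy Gaussian "spectra" and the ~0.99 acceptance threshold are assumptions.

    import numpy as np

    def spectral_similarity(reference, suspect):
        # Cosine similarity between two intensity spectra sampled on the same axis.
        ref = reference / np.linalg.norm(reference)
        sus = suspect / np.linalg.norm(suspect)
        return float(np.dot(ref, sus))

    # Toy spectra: the genuine product shows two absorption bands;
    # the counterfeit lacks the second one.
    wavenumbers = np.linspace(400, 1800, 500)
    genuine = (np.exp(-((wavenumbers - 1000) / 30) ** 2)
               + np.exp(-((wavenumbers - 1450) / 25) ** 2))
    counterfeit = np.exp(-((wavenumbers - 1000) / 30) ** 2)

    print(spectral_similarity(genuine, counterfeit))  # far below an assumed ~0.99 cutoff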
Abstract:
We investigate the hypothesis that the atmosphere is constrained to maximize its entropy production by using a one-dimensional (1-D) vertical model. We prescribe the lapse rate in the convective layer as that of the standard troposphere. The assumption that convection sustains a critical lapse rate was absent in previous studies, which focused on the vertical distribution of climatic variables, since such a convective adjustment reduces the degrees of freedom of the system and may prevent the application of the maximum entropy production (MEP) principle. This is not the case in the radiative–convective model (RCM) developed here, since we accept a discontinuity of temperatures at the surface similar to that adopted in many RCMs. For current conditions, the MEP state gives a difference between the ground temperature and the air temperature at the surface of ≈10 K. In comparison, conventional RCMs obtain a discontinuity of only ≈2 K. However, the surface boundary layer velocity in the MEP state appears reasonable (≈3 m s⁻¹). Moreover, although the convective flux at the surface in MEP states is almost uniform in optically thick atmospheres, it reaches a maximum value for an optical thickness similar to current conditions. This additional result may support the maximum convection hypothesis suggested by Paltridge (1978).
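To fix ideas, the LaTeX fragment below sketches one common way of writing the maximized quantity at the surface discontinuity; the paper's exact functional may differ, so treat the notation (convective flux F_c, net radiative input R_net) as our assumption.

    % Material entropy production of the convective flux crossing the
    % surface discontinuity between ground (T_g) and surface air (T_a):
    \sigma = F_c \left( \frac{1}{T_a} - \frac{1}{T_g} \right),
    \qquad \text{subject to} \qquad R_{\mathrm{net}}(T_g) = F_c .
    % The MEP state is the pair (F_c, T_g) that maximizes \sigma; the ~10 K
    % discontinuity quoted above is the T_g - T_a gap at that maximum.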
Abstract:
In recent years, many companies have opted to use standardized management systems to guarantee the profitability and reliability of the results of implementing the management system in question. It was in the 1990s that the implementation of management systems became significant in most economic sectors. Broadly speaking, management systems evolved first in the field of quality, then in environmental management and, finally, in occupational risk prevention. In recent years these three types of management systems have been progressively integrated, reducing the resources and effort devoted to management and significantly improving the effectiveness and efficiency of these systems. The main objective of this project is to define a management system that allows the company to conduct its activities in a simplified and orderly way, while providing the information needed to correct and improve those activities. A further objective of this project is to design an integrated management system (IMS) that exploits the synergies generated in the different areas of the company itself and fosters interactions between the different levels of the organization. As a consequence, information flows within the company will improve substantially, minimizing effort and loss of information. The method chosen for implementing the IMS is Process Management, which is based on defining and monitoring the company's processes, starting from the customer's needs and ending when those needs are satisfied. In conclusion, on completion of this project an IMS will be obtained, with all the company's processes defined and implemented, complying with the UNE-EN-ISO 9001:00, UNE-EN-ISO 14001:04 and OHSAS 18001:07 standards. This IMS, which has been developed from a documentary and theoretical standpoint, will improve the operational effectiveness of the processes and significantly improve the company's competitiveness.
Abstract:
The objective of this work is the creation of a virtual keyboard intended to help people with reduced mobility, who cannot use the computer's physical keyboard, to write at a reasonable typing speed for texts of any size. To achieve this reasonable typing speed, a language prediction system with two components has been implemented. On the one hand, words are predicted according to their frequency of use in a given dictionary; on the other, words are predicted following the writing rules of Catalan grammar. Another important aspect was that the resulting program could be used on different operating systems, since previously only versions specific to each of them existed. The program runs on Windows XP, Mac OS and Ubuntu Linux. It is intended as a basis for later improvements and extensions of its various parts. Nevertheless, the result is a program that allows reasonably fast writing and lets the user manage dictionaries and the two types of prediction that have been implemented.
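A minimal sketch of the frequency-based half of such a predictor (the grammar-rule component is omitted; the class name and the toy Catalan dictionary are ours, not the project's):

    from collections import Counter

    class WordPredictor:
        def __init__(self, corpus_words):
            # Word frequencies learned from a corpus or user dictionary.
            self.freq = Counter(corpus_words)

        def predict(self, prefix, n=3):
            # Rank known words sharing the typed prefix by frequency of use.
            candidates = [w for w in self.freq if w.startswith(prefix)]
            return sorted(candidates, key=lambda w: -self.freq[w])[:n]

    predictor = WordPredictor(["casa", "casa", "cases", "taula", "casar"])
    print(predictor.predict("cas"))  # ['casa', 'cases', 'casar']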
Abstract:
The control and prediction of wastewater treatment plants poses an important goal: to avoid breaking the environmental balance by always keeping the system in stable operating conditions. It is known that qualitative information, coming from microscopic examinations and subjective remarks, has a deep influence on the activated sludge process, in particular on the total amount of effluent suspended solids, one of the measures of overall plant performance. The search for an input-output model of this variable and the prediction of sudden increases (bulking episodes) is thus a central concern in ensuring the fulfillment of current discharge limitations. Unfortunately, the strong interrelation between variables, their heterogeneity and the very high amount of missing information make the use of traditional techniques difficult, or even impossible. Through the combined use of several methods, mainly rough set theory and artificial neural networks, reasonable prediction models are found, which also serve to show the differing importance of the variables and provide insight into the process dynamics.
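As an illustration of the kind of input-output model described, the sketch below imputes missing values and fits a small neural network with scikit-learn. Everything here is a toy under our own assumptions: the data are synthetic, and the rough-set step the authors combine with the network is omitted.

    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                  # stand-ins for plant variables
    y = (X @ np.array([0.5, -0.2, 0.1, 0.3, 0.0])
         + rng.normal(scale=0.1, size=200))        # toy effluent suspended solids
    X[rng.random(X.shape) < 0.2] = np.nan          # ~20% missing, as in real plant data

    model = make_pipeline(SimpleImputer(strategy="median"),
                          MLPRegressor(hidden_layer_sizes=(10,),
                                       max_iter=2000, random_state=0))
    model.fit(X, y)
    print(model.predict(X[:3]))                    # predictions for the first samples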
Abstract:
We present an approach for creating image mosaics using navigation data consisting of 3D position estimates provided by sensors, such as LBL, that are available in deep-water surveys. A central issue with acoustic 3D positioning is that its accuracy is far too low to composite the images with reasonable accuracy.
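The naive compositing step implied here is easy to sketch, and the sketch also shows where the accuracy problem bites: with an assumed mosaic resolution of 20 px per metre, a 0.5 m positioning error already displaces a frame by 10 px. The names and numbers below are ours, not the paper's.

    import numpy as np

    PIXELS_PER_METRE = 20                          # assumed mosaic resolution

    def paste(canvas, image, x_m, y_m):
        # Place a frame so its centre lands at world position (x_m, y_m).
        h, w = image.shape
        r = int(round(y_m * PIXELS_PER_METRE)) - h // 2
        c = int(round(x_m * PIXELS_PER_METRE)) - w // 2
        canvas[r:r + h, c:c + w] = image           # last frame wins on overlap

    canvas = np.zeros((400, 400))
    paste(canvas, np.ones((50, 50)), 5.0, 5.0)
    paste(canvas, np.full((50, 50), 2.0), 6.5, 5.0)  # overlapping neighbour frame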
Abstract:
We prospectively evaluated the results of our custom cementless femoral stems to ascertain whether this technology produced reasonable clinical function, complication rates, and loosening rates at midterm. Fifty-seven consecutive patients underwent surgery on 62 hips for primary osteoarthritis at a mean age of 57 years using a three-dimensional computed custom cementless stem. Patients were reviewed at a mean followup of 94.9 months. At review, the mean Harris hip score was 98.8 points (range, 84-100) compared with 61.1 points (range, 28-78) preoperatively. No patient complained of thigh pain. No migration or subsidence was observed. All stems were considered stable according to the radiographic criteria defined by Engh et al. There were no dislocations, no infections, and no reoperations. Our results are comparable with published results from clinical and radiologic points of view. Two problems remain unsolved: a custom stem costs twice as much as a standard stem, and we need longer term results before definitely recommending this technology as a reasonable alternative to current arthroplasties in younger patients. The data support the continued exploration of this technology with controlled clinical followup. LEVEL OF EVIDENCE: Therapeutic study, Level II-1 (prospective cohort study). See the Guidelines to Authors for a complete description of levels of evidence.
Abstract:
When preparing an article on image restoration in astronomy, it is obvious that some topics have to be dropped to keep the work at a reasonable length. We have decided to concentrate on image and noise models and on the algorithms used to find the restoration. Topics like parameter estimation and stopping rules are also commented on. We start by describing the Bayesian paradigm and then proceed to study the noise and blur models used by the astronomical community. The prior models used to restore astronomical images are examined next. We then describe the algorithms used to find the restoration for the most common combinations of degradation and image models, and comment on important issues such as acceleration of algorithms, stopping rules, and parameter estimation. We also comment on the huge amount of information available to, and made available by, the astronomical community.
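As one concrete instance of this Bayesian machinery (our choice of example; the review surveys many algorithms), the classic Richardson-Lucy iteration follows from maximizing the Poisson likelihood of the observed image y under a blur operator H:

    % Bayesian restoration: \hat{x} = \arg\max_x \, p(y \mid x)\, p(x).
    % With Poisson noise and a flat prior, the Richardson--Lucy update is
    x^{(k+1)} = x^{(k)} \odot H^{\mathsf{T}}\!\left( \frac{y}{H x^{(k)}} \right),
    % where \odot and the fraction act elementwise and the columns of H sum to one.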
Abstract:
Extensible Markup Language (XML) is a generic computing language that provides an outstanding case study of the commodification of service standards. The development of this language in the late 1990s marked a shift in computer science, as its extensibility lets any kind of data be stored and shared. Many office software suites rely on it. The chapter highlights how the largest multinational firms pay special attention to gaining a recognised international standard for such a major technological innovation. It argues that standardisation processes affect market structures and can lead to market capture. By examining how a strategic use of standardisation arenas can generate profits, it shows that Microsoft succeeded in making its own technical solution a recognised ISO standard in 2008, although the same arena had already adopted, two years earlier, the open-source standard set by IBM and Sun Microsystems. Yet XML standardisation also helped to establish a distinct model of information technology services at the expense of Microsoft's monopoly on proprietary software.
Abstract:
BACKGROUND: Transient balanced steady-state free-precession (bSSFP) has shown substantial promise for noninvasive assessment of coronary arteries, but its utilization at 3.0 T and above has been hampered by susceptibility to field inhomogeneities that degrade image quality. The purpose of this work was to refine, implement, and test a robust, practical single breath-hold bSSFP coronary MRA sequence at 3.0 T and to assess the reproducibility of the technique. METHODS: A 3D, volume-targeted, high-resolution bSSFP sequence was implemented. Localized image-based shimming was performed to minimize inhomogeneities of both the static magnetic field and the radio frequency excitation field. Fifteen healthy volunteers and three patients with coronary artery disease underwent examination with the bSSFP sequence (scan time = 20.5 ± 2.0 seconds), and acquisitions were repeated in nine subjects. The images were quantitatively analyzed using a semi-automated software tool, and the repeatability and reproducibility of measurements were determined using regression analysis and the intra-class correlation coefficient (ICC), in a blinded manner. RESULTS: The 3D bSSFP sequence provided uniform, high-quality depiction of the coronary arteries (n = 20). The average visible vessel length of 100.5 ± 6.3 mm and sharpness of 55 ± 2% compared favorably with earlier reported navigator-gated bSSFP and gradient echo sequences at 3.0 T. Length measurements demonstrated a highly statistically significant degree of inter-observer (r = 0.994, ICC = 0.993), intra-observer (r = 0.894, ICC = 0.896), and inter-scan concordance (r = 0.980, ICC = 0.974). Furthermore, ICC values demonstrated excellent intra-observer, inter-observer, and inter-scan agreement for vessel diameter measurements (ICC = 0.987, 0.976, and 0.961, respectively) and vessel sharpness values (ICC = 0.989, 0.938, and 0.904, respectively). CONCLUSIONS: The 3D bSSFP acquisition, using a state-of-the-art MR scanner equipped with recently available technologies such as multi-transmit, a 32-channel cardiac coil, and localized B0 and B1+ shimming, allows accelerated and reproducible multi-segment assessment of the major coronary arteries at 3.0 T in a single breath-hold. This rapid sequence may be especially useful for functional imaging of the coronaries, where the acquisition time is limited by the stress duration, and in cases where low navigator-gating efficiency prohibits acquisition of a free-breathing scan in a reasonable time period.
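For readers unfamiliar with the agreement statistic quoted above, here is a minimal numpy sketch of a one-way random-effects ICC(1,1). The abstract does not state which ICC form was computed, so the formula choice is our assumption, and the vessel-length numbers are invented.

    import numpy as np

    def icc_oneway(x):
        # ICC(1,1), one-way random effects; x has shape (targets, raters).
        n, k = x.shape
        grand = x.mean()
        means = x.mean(axis=1)
        msb = k * ((means - grand) ** 2).sum() / (n - 1)         # between-target
        msw = ((x - means[:, None]) ** 2).sum() / (n * (k - 1))  # within-target
        return (msb - msw) / (msb + (k - 1) * msw)

    # Toy data: lengths (mm) of five vessels measured by two observers.
    lengths = np.array([[100.2, 101.0], [88.5, 87.9], [95.1, 95.6],
                        [110.3, 109.8], [102.7, 103.1]])
    print(f"ICC = {icc_oneway(lengths):.3f}")  # near 1 for near-identical readings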
Abstract:
This manual captures the experience of practitioners in the Iowa Department of Transportation's (Iowa DOT's) Office of Location and Environment (OLE). It also documents the need for coordinated project development efforts during the highway project planning, or location study, phase and engineering design. The location study phase establishes:
* The definition of, and need for, the highway improvement project
* The range of alternatives and many key attributes of the project's design
* The recommended alternative, its impacts, and the agreed-to conditions for project approval
The location study process involves developing engineering alternatives, collecting engineering and environmental data, and completing design refinements to accomplish functional designs. The items above also embody the basic content required for projects compliant with the National Environmental Policy Act (NEPA) of 1969, which directs federal agencies to use a systematic, interdisciplinary approach during the planning process whenever proposed actions (or "projects") have the potential for environmental impacts. In doing so, NEPA requires coordination with stakeholders, review, comment, and public disclosure. Are location studies and environmental studies more about the process or the documents? If properly conducted, they concern both: unbiased and reasonable processes with quality and timely documents. In essence, every project is a story that needs to be told. Engineering and environmental regulations and guidance, as documented in this manual, will help project staff and managers become better storytellers.
Abstract:
We analyse the strategic behaviours of agents in a market through the appropriateness of their skills to the market. If agents' skills are well adapted to the market and they can reach their target, they will not need to adopt strategic behaviours. The agents will behave as selfish individuals. However, if their skills are not well adapted and they cannot attain their target alone, they will adopt strategic behaviours to reach their objectives. These behaviours will have a different impact on the utilities of other agents, depending on the skills and the objectives of the agent. If these agents need other agents to reach their objectives, they will behave as altruistic individuals who internalise the utilities of other agents in reaching their objectives and will adopt cooperative behaviours. However, if these agents fear that other agents could prevent them from reaching their target because they can foresee that the skills of other agents are better adapted than their own skills, the agents will then behave as predator individuals and will adopt destructive behaviours to attain their objective. It is in the interests of these agents to manipulate information to increase disorder and dissimulate their lack of skills. They will reproduce the strategies of animals that modify their appearance to escape predators or simulate being bait to attract their prey. These agents will seek to induce chaos into the behaviours of other agents to amplify the impact of their strategies. The appropriateness of skills to the market allows an understanding of the emergence of networks and associated strategies. The members of a network are inputs who are excluded when their costs are higher than their benefits. A network simultaneously allows cooperation and selfish, predatory behaviours among its members. A network may adopt informational strategies when seeking to become the leader in a market or when it cannot survive. The creation of networks and the manipulation of information are two overlapping evolutionary strategies, with the first strategy favouring the second. In our model, an agent does not behave like a firm that aims only to maximise the profits of the firm but rather as a member of a network who adopts strategic behaviours as a function of the interests of this network. If his skills are well adapted to the market and he can innovate, he will not invest in erroneous input; in contrast, if his skills are not adapted, the agent will invest in the erroneous input of information into the market in order to survive. Therefore, when informational asymmetries between the agents and their principals characterise the market, the price cannot be the main element that allows equilibrium to be reached in the market; instead, the appropriateness of skills to the market enables equilibrium. We then apply these hypotheses to explain the strategic behaviours of physicians and pharmaceutical companies.
Abstract:
TERMINOLOGY AND PRINCIPLES OF COMBINING ANTIPSYCHOTICS WITH A SECOND MEDICATION: The term "combination" includes virtually all the ways in which one medication may be added to another. The other commonly used terms are "augmentation", which implies an additive effect from adding a second medicine to that obtained from prescribing a first, and "add-on", which implies adding on to an existing, possibly effective treatment which, for one reason or another, cannot or should not be stopped. The issues that arise in all potential indications are: a) how long it is reasonable to wait to prove insufficiency of response to monotherapy; b) by what criteria that response should be defined; c) how optimal the dose of the first monotherapy is and, therefore, how confident one can be that its lack of effect is due to a truly inadequate response. Before one considers combination treatment, one or more of the following criteria should be met: a) monotherapy has been only partially effective on core symptoms; b) monotherapy has been effective on some concurrent symptoms but not others, for which a further medicine is believed to be required; c) a particular combination might be indicated de novo in some indications; d) the combination could improve tolerability because two compounds may be employed below their individual dose thresholds for side effects. Regulators have been concerned primarily with a) and, in principle at least, c) above. In clinical practice, the use of combination treatment reflects the often unsatisfactory outcome of treatment with single agents. ANTIPSYCHOTICS IN MANIA: There is good evidence that most antipsychotics tested show efficacy in acute mania when added to lithium or valproate for patients showing no or only a partial response to lithium or valproate alone. Conventional 2-armed trial designs could benefit from a third antipsychotic monotherapy arm. In the long-term treatment of bipolar disorder, in patients responding acutely to the addition of quetiapine to lithium or valproate, this combination reduces the subsequent risk of relapse into depression, mania or mixed states compared with monotherapy with lithium or valproate. Comparable data are not available for combination with other antipsychotics. ANTIPSYCHOTICS IN MAJOR DEPRESSION: Some atypical antipsychotics have been shown to induce remission when added to an antidepressant (usually an SSRI or SNRI) in unipolar patients in a major depressive episode unresponsive to the antidepressant monotherapy. Refractoriness is defined as at least 6 weeks without meeting an adequate pre-defined treatment response. Long-term data are not yet available to support continuing efficacy. SCHIZOPHRENIA: There is only limited evidence to support the combination of two or more antipsychotics in schizophrenia. Any monotherapy should be given at the maximal tolerated dose, and at least two antipsychotics of different action/tolerability, as well as clozapine, should be given as monotherapy before a combination is considered. The addition of a high-potency D2/3 antagonist to a low-potency antagonist like clozapine or quetiapine is the logical combination to treat positive symptoms, although further evidence from well-conducted clinical trials is needed. Mechanisms of action other than D2/3 blockade, and hence other combinations, might be more relevant for negative, cognitive or affective symptoms. OBSESSIVE-COMPULSIVE DISORDER: SSRI monotherapy has a moderate overall average benefit in OCD, and it can take as long as 3 months before benefit can be judged.
Antipsychotic addition may be considered in OCD with tic disorder and in refractory OCD. For OCD with poor insight (OCD with "psychotic features"), the treatment of choice should be a medium to high dose of an SSRI, and only in refractory cases might augmentation with antipsychotics be considered. Augmentation with haloperidol and risperidone was found to be effective (symptom reduction of more than 35%) for patients with tics. For refractory OCD, there are data suggesting a specific role for haloperidol and risperidone as well, and some data with regard to a potential therapeutic benefit of olanzapine and quetiapine. ANTIPSYCHOTICS AND ADVERSE EFFECTS IN SEVERE MENTAL ILLNESS: Cardio-metabolic risk in patients with severe mental illness, especially when treated with antipsychotic agents, is now much better recognized, and efforts to ensure improved physical health screening and prevention are becoming established.
Abstract:
Requests all state departments and agencies to consider using Project Labor Agreements on large-scale construction projects to provide structure and stability, promote efficient, on-time completion, and ensure high standards and reasonable costs on the projects, in order to move Iowa's economy forward.
Abstract:
Despite numerous discussions, workshops, reviews and reports about the responsible development of nanotechnology, information describing the health and environmental risks of engineered nanoparticles or nanomaterials is severely lacking and thus insufficient for completing a rigorous risk assessment of their use. However, since preliminary scientific evaluations indicate that there are reasonable suspicions that activities involving nanomaterials might have damaging effects on human health, the precautionary principle must be applied. Public and private institutions as well as industries have the duty to adopt preventive and protective measures proportionate to the risk intensity and the desired level of protection. In this work, we present a practical, 'user-friendly' procedure for a university-wide safety and health management of nanomaterials, developed as a multi-stakeholder effort (government, accident insurance, researchers and experts in occupational safety and health). The process starts with a schematic decision tree that allows the nano laboratory to be classified into one of three hazard classes, similar to a control banding approach (from Nano 3, the highest hazard, to Nano 1, the lowest). Classifying laboratories into risk classes would require considering actual or potential exposure to the nanomaterial as well as statistical data on the health effects of exposure. Because these data (as well as exposure limits for each individual material) are not available, risk classes could not be determined. For each hazard level we then provide a list of required risk mitigation measures (technical, organizational and personal). The target 'users' of this safety and health methodology are researchers and safety officers. They can rapidly determine the precautionary hazard class of their activities and the corresponding adequate safety and health measures. We have succeeded in convincing scientists dealing with nano-activities that adequate safety measures and management promote innovation and discovery by ensuring a safe environment, even in the case of very novel products. The proposed measures are not considered constraints but a support to their research. This methodology is being implemented at the Ecole Polytechnique de Lausanne in over 100 research labs dealing with nanomaterials. It is our opinion that it would be useful to other research and academic institutions as well. [Authors]
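A minimal sketch of the decision-tree idea, purely illustrative: the questions and class boundaries below are invented placeholders, not the tree published by the authors.

    def nano_hazard_class(handles_free_nanoparticles,
                          airborne_release_possible,
                          nanomaterial_in_suspension):
        # Map a lab's activities to a precautionary hazard class (Nano 3 = highest).
        if handles_free_nanoparticles and airborne_release_possible:
            return "Nano 3"   # strictest risk mitigation measures required
        if handles_free_nanoparticles or nanomaterial_in_suspension:
            return "Nano 2"
        return "Nano 1"       # basic laboratory practice suffices

    print(nano_hazard_class(True, False, True))  # -> Nano 2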