995 results for Mandatory Access Controls
Abstract:
This paper aims to estimate empirically the efficiency of a Swiss telemedicine service introduced in 2003. We used claims data gathered by a major Swiss health insurer over a period of 6 years, covering 160 000 insured adults. In Switzerland, health insurance is mandatory, but everyone has the option of choosing between a managed care plan and a fee-for-service plan. This paper focuses on a conventional fee-for-service plan that includes mandatory access to a telemedicine service: the insured are obliged to phone this medical call centre before visiting a physician. This type of plan generates much lower average health expenditures than a conventional insurance plan. Reasons for this may include selection, incentive effects or efficiency. In our sample, about 90% of the difference in health expenditure can be explained by selection and incentive effects. The remaining 10% of savings, due to the efficiency of the telemedicine service, amount to about SFr 150 per year per insured, of which approximately 60% is saved by the insurer and 40% by the insured. Although the efficiency effect is greater than the cost of the plan, the big winners are the insured, who not only save monetary and non-monetary costs but also benefit from reduced premiums. Copyright © 2010 John Wiley & Sons, Ltd.
Abstract:
Following the Introduction, which surveys the existing literature on technology advances and regulation in telecommunications and on two-sided markets, we address specific issues in the industries of the New Economy, characterized by the existence of network effects. We seek to explore how each of these industries works, identify potential market failures, and find new solutions at the level of economic regulation that promote social welfare.

In Chapter 1 we analyze a regulatory issue on access prices and investments in the telecommunications market. The existing literature on access prices and investment has pointed out that networks underinvest under a regime of mandatory access provision with a fixed access price per end-user. We propose a new access pricing rule, the indexation approach: the access price, per end-user, that network i pays to network j is a function of the investment levels set by both networks. We show that indexation can enhance economic efficiency beyond what is achieved with a fixed access price. In particular, access price indexation can simultaneously induce lower retail prices and higher investment and social welfare as compared to fixed access pricing or a regulatory holidays regime. Furthermore, we provide sufficient conditions under which indexation can implement the socially optimal investment or the Ramsey solution, which would be impossible to obtain under fixed access pricing. Our results contradict the notion that investment efficiency must be sacrificed for gains in pricing efficiency.

In Chapter 2 we investigate the effect of regulations that limit advertising airtime on advertising quality and on social welfare. We show, first, that advertising time regulation may reduce the average quality of advertising broadcast on TV networks. Second, an advertising cap may reduce media platforms' and firms' profits, while the net effect on viewers' (subscribers') welfare is ambiguous, because the ad quality reduction resulting from a regulatory cap offsets the subscribers' direct gain from watching fewer ads. We find that if subscribers are sufficiently sensitive to ad quality, i.e., the ad quality reduction outweighs the direct effect of the cap, a cap may reduce social welfare. The welfare results suggest that a regulatory authority trying to increase welfare via regulation of the volume of advertising on TV might also need to regulate advertising quality or, if regulating quality proves impractical, take the effect of advertising quality into consideration.

In Chapter 3 we investigate the rules that govern Electronic Payment Networks (EPNs). In EPNs, the No-Surcharge Rule (NSR) requires that merchants charge at most the same amount for a payment card transaction as for cash. In this chapter, we analyze a three-party model (consumers, merchants, and a proprietary EPN) with endogenous transaction volumes and heterogeneous merchants' transactional benefits of accepting cards to assess the welfare impacts of the NSR. We show that, if merchants are local monopolists and the network externalities from merchants to cardholders are sufficiently strong, all agents except the EPN will be worse off with the NSR, and therefore the NSR is socially undesirable. The positive role of the NSR in terms of improvement of retail price efficiency for cardholders is also highlighted.
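As a purely illustrative sketch of what an indexed access price could look like (this functional form is an assumption for exposition, not the rule derived in the thesis), the per-end-user charge that network i pays to network j could be made to depend on both investment levels, for example

\[
a_{ij}(I_i, I_j) \;=\; \bar{a} \;+\; \gamma\, I_j \;-\; \delta\, I_i, \qquad \gamma,\ \delta > 0,
\]

where \(\bar{a}\) is a baseline charge and \(I_i, I_j\) are the two networks' investment levels. Under such a rule the access provider j earns more access revenue when it invests more, and the access seeker i pays less when it invests more, whereas a fixed regime simply sets \(a_{ij} = \bar{a}\) regardless of investment.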
Abstract:
This bachelor's thesis is a literature review on the use of RFID technology in access control. The work briefly introduces the technology itself and gives an overview of access control. The main topic, however, is the combination of access control and RFID: how RFID is exploited in access control implementations around the world. The thesis examines the strengths and weaknesses of RFID with respect to access control. In addition, it aims to give a picture of current and future implementations. The final important part of the work is security: RFID is used even in high-security access control solutions, in which case maximising security is of paramount importance.
Abstract:
The benefits of information technology (IT) have become more apparent over the last decades. Both IT and business managers place subjects such as governance, IT-business alignment and information security among their top priorities. With respect to governance specifically, managers tend to approach it from a technical angle, emphasizing protection against intrusions, antivirus systems, access controls and other technical issues. IT risk management is commonly treated under this same approach; that is, its importance is reduced and it is delegated to IT departments. Over the last two decades, a new perspective on IT risk management has emerged, bringing a holistic view of IT risk to the organization. According to this new perspective, the strategy formulation process should take IT risks into account. With most organizations growing ever more dependent on IT, the need for a better understanding of the subject becomes clearer. This work presents a study of three public organizations of the State of Pernambuco that investigates how those organizations manage their IT risks. Structured interviews were conducted with IT managers and later analyzed and compared with conceptual categories found in the literature. The results show that the IT risk culture and IT governance are weakly understood and implemented in those organizations, where no IT risk methodology is formally defined or executed. Even so, most of the practices suggested in the literature were found, albeit without alignment with an IT risk management process.
Abstract:
With today's prevalence of Internet-connected systems storing sensitive data and the omnipresent threat of technically skilled malicious users, computer security remains a critically important field. Because of today's multitude of vulnerable systems and security threats, it is vital that computer science students be taught techniques for programming secure systems, especially since many of them will work on systems with sensitive data after graduation. Teaching computer science students proper design, implementation, and maintenance of secure systems is a challenging task that calls for the use of novel pedagogical tools. This report describes the implementation of a compiler that translates mandatory access control specifications written in the Domain-Type Enforcement Language into policies for the Java Security Manager, primarily for pedagogical purposes. The implementation of the Java Security Manager was examined in depth, and various techniques to work around its inherent limitations were devised and partially implemented, although some of these workarounds do not appear in the current version of the compiler because they would have compromised cross-platform compatibility. The current version of the compiler and implementation details of the Java Security Manager are discussed in depth.
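To make concrete the kind of run-time enforcement such a compiler targets, the sketch below shows a hand-written Java SecurityManager that confines file reads to a single directory. It only illustrates the Security Manager mechanism itself and is not output of the compiler described in the report; the class name and allowed directory are invented for the example.

```java
import java.io.FilePermission;
import java.io.FileReader;
import java.security.Permission;

// Hypothetical illustration: a SecurityManager that confines file reads to one
// directory. A DTEL-to-Security-Manager compiler would emit checks of this
// general shape, but derived from the domains and types in the specification
// rather than from a hard-coded path.
public class ConfinedReadManager extends SecurityManager {
    private static final String ALLOWED_DIR = "/srv/public_data/"; // invented path

    @Override
    public void checkPermission(Permission perm) {
        // Only file reads are restricted here; everything else is permitted.
        if (perm instanceof FilePermission && perm.getActions().contains("read")) {
            if (!perm.getName().startsWith(ALLOWED_DIR)) {
                throw new SecurityException("read denied: " + perm.getName());
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Note: on JDK 18 and later this call also requires launching the JVM
        // with -Djava.security.manager=allow.
        System.setSecurityManager(new ConfinedReadManager());
        // Succeeds only for paths under ALLOWED_DIR; otherwise checkPermission throws.
        try (FileReader r = new FileReader(args[0])) {
            System.out.println("read allowed: " + args[0]);
        }
    }
}
```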
Abstract:
This thesis reports on an investigation of the feasibility and usefulness of incorporating dynamic management facilities for managing sensed context data in a distributed context-aware mobile application. The investigation focuses on reducing the work required to integrate new sensed context streams into an existing context-aware architecture. Current architectures require integration work for each new stream and each new context that is encountered. This mode of operation is acceptable for current fixed architectures. However, as systems become more mobile, the number of discoverable streams increases. Without the ability to discover and use these new streams, the functionality of any given device will be limited to the streams that it knows how to decode. The integration of new streams requires that the sensed context data be understood by the current application. If the new source provides data of a type that an application currently requires, then the new source should be connected to the application without any prior knowledge of the new source. If the type is similar and can be converted, then this stream too should be appropriated by the application. Such applications are based on portable devices (phones, PDAs) for semi-autonomous services that use data from sensors connected to the devices, plus data exchanged with other such devices and remote servers. Such applications must handle input from a variety of sensors, refining the data locally and managing its communication from the device in volatile and unpredictable network conditions. The choice to focus on locally connected sensory input allows for the introduction of privacy and access controls; this local control can determine how the information is communicated to others. This investigation focuses on the evaluation of three approaches to sensor data management. The first system is characterised by its static management based on the prepended metadata. This was the reference system: developed for a mobile system, it processed data based on the attached metadata, and the code that performed the processing was static. The second system was developed to move away from static processing and introduce greater freedom in handling the data stream, which resulted in a heavyweight approach. The approach focused on pushing the processing of the data into a number of networked nodes rather than the monolithic design of the previous system. By creating a separate communication channel for the metadata, it is possible to be more flexible with the amount and type of data transmitted. The final system pulled the benefits of the other systems together: by providing a small management class that loads a separate handler based on the incoming data, dynamism was maximised whilst keeping the code easy to understand. The three systems were then compared to highlight their ability to dynamically manage new sensed context. The evaluation took two approaches: the first is a quantitative analysis of the code to understand the relative complexity of the three systems, carried out by evaluating what changes to the system were involved for the new context; the second takes a qualitative view of the work required by the software engineer to reconfigure the systems to provide support for a new data stream. The evaluation highlights the scenarios in which each of the three systems is most suited. There is always a trade-off in the development of a system, and the three approaches highlight this fact.
The creation of a statically bound system can be quick to develop but may need to be completely rewritten if the requirements move too far. Alternatively, a highly dynamic system may be able to cope with new requirements, but the developer time to create such a system may be greater than the creation of several simpler systems.
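As a rough sketch of the third approach described above (a small management class that selects a handler from the metadata of an incoming stream), the Java fragment below registers handlers against a type tag and dispatches samples to them at run time. All class, method and tag names here are hypothetical; the thesis' own implementation may differ.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a lightweight manager that dispatches incoming sensed
// context samples to a handler chosen by the type tag carried in the metadata.
public class ContextStreamManager {
    interface ContextHandler { String handle(byte[] payload); }

    private final Map<String, ContextHandler> handlers = new HashMap<>();

    // New stream types are integrated by registering a handler at run time,
    // rather than by editing and redeploying static processing code.
    public void register(String typeTag, ContextHandler handler) {
        handlers.put(typeTag, handler);
    }

    public String dispatch(String typeTag, byte[] payload) {
        ContextHandler h = handlers.get(typeTag);
        if (h == null) {
            // The device lacks a decoder for this stream type.
            return "unknown stream type: " + typeTag;
        }
        return h.handle(payload);
    }

    public static void main(String[] args) {
        ContextStreamManager mgr = new ContextStreamManager();
        mgr.register("gps/nmea", p -> "position fix (" + p.length + " bytes)");
        System.out.println(mgr.dispatch("gps/nmea", new byte[]{0x24, 0x47}));
        System.out.println(mgr.dispatch("accel/raw", new byte[]{0x01}));
    }
}
```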
Abstract:
Report on a review of selected general and application controls for the Iowa Department of Education’s Electronic Access System for Iowa Education Records (EASIER) system for the period April 4 through May 10, 2011
Abstract:
Xylanases (EC 3.2.1.8, endo-1,4-glycosyl hydrolase) catalyze the hydrolysis of xylan, an abundant hemicellulose of plant cell walls. Access to the catalytic site of GH11 xylanases is regulated by movement of a short beta-hairpin, the so-called thumb region, which can adopt open or closed conformations. A crystallographic study has shown that the D11F/R122D mutant of the GH11 xylanase A from Bacillus subtilis (BsXA) displays a stable "open" conformation, and here we report a molecular dynamics simulation study comparing this mutant with the native enzyme over a range of temperatures. The mutant open conformation was stable at 300 and 328 K; however, it showed a transition to the closed state at 338 K. Analysis of dihedral angles identified thumb region residues Y113 and T123 as key hinge points which determine the open-closed transition at 338 K. Although the D11F/R122D mutations result in a reduction in local inter- and intramolecular hydrogen bonding, the global energies of the open and closed conformations in the native enzyme are equivalent, suggesting that the two conformations are equally accessible. These results indicate that the thumb region shows a broader range of energetically permissible conformations, which regulate access to the active site region. The R122D mutation contributes to the stability of the open conformation but is not essential for thumb dynamics, i.e., the wild-type enzyme can also adopt the open conformation.
Abstract:
"B-226652"--P. 1.
Abstract:
Arantes GM, Arantes VMN, Ashmawi HA, Posso IP.
Objective: To study the efficacy of tenoxicam for pain control, its potential for preemptive analgesia, and its influence on the orthodontic movement of upper canine teeth.
Design: This was a randomized controlled double-blind cross-over study. The patients were divided into three groups. Two groups received tenoxicam in daily doses of 20 mg orally for 3 days. Group A received the first dose of the drug before orthodontic activation and group B just afterwards. Group C (control) received a placebo for 3 days. All groups had access to 750 mg of paracetamol up to four times a day. Three orthodontic activations were performed at 30-day intervals. Each patient belonged to two different groups. Pain intensity was assessed using a descriptive Pain Scale and a Visual Analog Scale.
Setting and participants: Private clinic; 36 patients undergoing bilateral canine tooth retraction.
Results: The statistical analysis did not show any difference in movement between the active groups and the control at any time. There was no statistical difference between the groups that received tenoxicam. Pain intensity in these groups was lower than in the placebo group. The difference in pain intensity between the active groups and the control was greatest at the assessment made 12 h after activation and tended to zero 72 h after activation.
Conclusions: Tenoxicam did not influence orthodontic movement of the upper canines. It was effective for pain control and did not present any preemptive analgesic effect.
Abstract:
Context: Previous studies have reported that early initiation of cannabis (marijuana) use is a significant risk factor for other drug use and drug-related problems.
Objective: To examine whether the association between early cannabis use and subsequent progression to use of other drugs and drug abuse/dependence persists after controlling for genetic and shared environmental influences.
Design: Cross-sectional survey conducted in 1996-2000 among an Australian national volunteer sample of 311 young adult (median age, 30 years) monozygotic and dizygotic same-sex twin pairs discordant for early cannabis use (before age 17 years).
Main Outcome Measures: Self-reported subsequent nonmedical use of prescription sedatives, hallucinogens, cocaine/other stimulants, and opioids; abuse or dependence on these drugs (including cannabis abuse/dependence); and alcohol dependence.
Results: Individuals who used cannabis by age 17 years had odds of other drug use, alcohol dependence, and drug abuse/dependence that were 2.1 to 5.2 times higher than those of their co-twin, who did not use cannabis before age 17 years. Controlling for known risk factors (early-onset alcohol or tobacco use, parental conflict/separation, childhood sexual abuse, conduct disorder, major depression, and social anxiety) had only negligible effects on these results. These associations did not differ significantly between monozygotic and dizygotic twins.
Conclusions: Associations between early cannabis use and later drug use and abuse/dependence cannot solely be explained by common predisposing genetic or shared environmental factors. The association may arise from the effects of the peer and social context within which cannabis is used and obtained. In particular, early access to and use of cannabis may reduce perceived barriers against the use of other illegal drugs and provide access to these drugs.
Abstract:
In this article, Médicos Sin Fronteras (MSF) Spain faces the challenge of selecting, piecing together, and conveying in the clearest possible way the main lessons learnt over the course of the last seven years in the world of medical care for Chagas disease. More than two thousand children under the age of 14 have been treated, the majority of whom come from rural Latin American areas with difficult access. It is based on these lessons learnt, through mistakes and successes, that MSF advocates that medical care for patients with Chagas disease be a reality, in a manner which is inclusive (not exclusive), integrated (with medical, psychological, social, and educational components), and in which the patient is actively followed. This must be a multi-disease approach with permanent quality controls in place, based on primary health care (PHC). Rapid diagnostic tests and new medications should be available, as well as therapeutic plans and patient management (including side effects) with standardised flows for medical care for patients within PHC in relation to the secondary and tertiary levels, inclusive of epidemiological surveillance systems.