947 results for Access Management
Abstract:
A common property resource with open access, such as a fishery, will be used to excess when faced with sufficient demand. This leads to an excessive amount of fishing effort and, in turn, to depletion of the stock. This paper discusses the development of a property rights regime for the Atlantic calico scallop, Argopecten gibbus, fishery of Florida. The management solution of the Calico Scallop Conservation Association (CSCA) provides an example of the assignment of property rights to a common property resource without resorting to governmental intervention. In this particular fishery, self-regulation limited early harvesting, which would have been uneconomic; there may be other fisheries in which self-regulation could be economically efficient and biologically appropriate. While this solution may not be applicable to all common property resources, for similar cases the example of the CSCA provides valuable information that may be helpful in establishing a more efficient use of the resource. Some types of government facilitation may also be useful.
Abstract:
The charter boat industry in the U.S. Gulf of Mexico provides access to offshore fishing opportunities for about 570,000 passengers per year on 971 boats. A 25% random sample of charter boat operators was interviewed during 1987-88 to determine the species targeted, the percentage of time committed to targeting each species, and reactions to existing catch restrictions. Three-fourths of the charter boat fleet was in Florida, 13% in Texas, 5% in Louisiana, 4% in Alabama, and 2% in Mississippi. Responses were diverse regarding species focus within the region. Species of dominant importance included groupers, Epinephelus sp. and Mycteroperca sp. (Fla.); snapper, Lutjanus campechanus (Ala., Fla., Miss., and La.); king mackerel, Scomberomorus cavalla (Miss., Tex., Ala. and Fla.); spotted seatrout, Cynoscion nebulosus (Tex. and La.); and red drum, Sciaenops ocellatus (Tex. and La.). Catch restrictions were generally supported, with higher levels of opposition to restrictions on heavily targeted species and/or to one-fish or closed-fishery limits.
Abstract:
The passage of the Magnuson Fishery Conservation and Management Act of 1976 (MFCMA) and the establishment of a 200-mile exclusive economic zone (EEZ) in 1983 have resulted in a radical change in the pattern of foreign fishing operations off the U.S. coasts. Likewise, the extension of 200-mile EEZs by other nations has affected U.S. distant-water fisheries. As a result, a new international framework for fisheries is emerging and continuing to evolve.
Abstract:
The study was conducted on the present status of HACCP-based quality management in golda (Macrobrachium rosenbergii) farms in the Fulpur region of Mymensingh. Information was collected on the general condition of the farms, culture systems, and post-harvest quality management. Almost all farms had no or inadequate infrastructure, such as road access, electricity supply, telecommunications, ice, feed storage facilities, vehicles for golda transportation, and washing and toilet facilities. The problems associated with sanitation and hygiene were the widespread use of cow dung and poultry manure and the construction of open toilets within the vicinity of the prawn culture ponds. Different grades of commercially available and locally prepared feeds were used for golda culture in the ponds. Golda post-larvae (PL) of 40-50 days old were stocked with carp species. The price of golda PL ranged from Tk. 1.00 to Tk. 1.25/piece. Pond size varied from 50 decimal (0.2 ha) to 2.5 acre (1.0 ha), with an average depth of 2-2.5 m. The culture period of golda ran from April-May to November-December, and the survival rate ranged between 75 and 80%. Production of golda varied from 250-500 kg/acre (625-1,250 kg/ha). Harvested golda were transported to the city market within 4 h. Two size grades were generally used for pricing, e.g. Tk. 500-550/kg for >100 g and Tk. 300/kg for <100 g. The cost-benefit ratio was found to remain around 1:1.25, depending on the availability of PL. Water quality parameters (water temperature, pH, dissolved oxygen, total alkalinity and chlorophyll a) were monitored in five golda farms in the Fulpur region. Water temperature ranged from 29°C to 33°C, dissolved oxygen from 2.28 to 4.13 mg/l, pH between 6.65 and 7.94, alkalinity from 44 to 70 mg/l, and chlorophyll a concentration from 61.88 to 102.34 µg/l in the five investigated ponds. The Aerobic Plate Count (APC) of the water samples was within the range of 2.0x10^6 - 2.96x10^7 CFU/ml, and that of the soil samples within the range of 6.9x10^6 - 7.73x10^6 CFU/g. Streptococcus sp., Bacillus sp., Escherichia coli, Staphylococcus sp., Pseudomonas sp. and Salmonella sp. were isolated from pond water and sediment. Different feed samples used for golda were analyzed for proximate composition: moisture content ranged from 14.14 to 21.22%, crude protein from 20.55 to 44.1%, lipid from 4.67 to 12.54%, and ash from 9.7 to 27.69%. The TVB-N and peroxide values of the starter, grower and fish-meal feeds were within acceptable ranges, and the samples were free from pathogenic organisms. A training session on HACCP, water quality and post-harvest quality management of prawn was organized for the golda farmers.
Abstract:
To manage and process a large amount of oceanographic data, users must have powerful tools that simplify these tasks. VODC for PC is software designed to assist in managing oceanographic data. It is based on the 32-bit Windows operating system and uses the Microsoft Access database management system. With VODC for PC, users can update data simply, convert data to several international formats, combine several VODC databases into one, calculate average, minimum and maximum fields for some types of data, check for valid data…
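As an illustration of the kind of per-field aggregation described above, the following is a minimal sketch in Python; VODC for PC itself is a Windows/Microsoft Access application, and the record fields and validity bounds below are assumptions made only for the example.

```python
# Illustrative sketch only: mirrors the average/min/max and validity-check features
# described for VODC for PC. Field names and validity bounds are assumptions.

records = [
    {"station": "ST-01", "temperature_c": 28.4, "salinity_psu": 33.1},
    {"station": "ST-02", "temperature_c": 29.1, "salinity_psu": 34.0},
    {"station": "ST-03", "temperature_c": -999.0, "salinity_psu": 33.7},  # -999.0 marks a missing value
]

VALID_RANGES = {"temperature_c": (-2.0, 40.0), "salinity_psu": (0.0, 42.0)}

def summarize(field_name):
    """Return (min, max, mean) of a field, skipping values outside the valid range."""
    lo, hi = VALID_RANGES[field_name]
    values = [r[field_name] for r in records if lo <= r[field_name] <= hi]
    return min(values), max(values), sum(values) / len(values)

for field_name in VALID_RANGES:
    print(field_name, summarize(field_name))
```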
Abstract:
Although previous research has widely acknowledged the phenomenon of film-induced tourism, there is a paucity of research on the management of film-induced tourism at built heritage sites. This research, underpinned by a constructivist paradigm, draws on three distinct fields of study – heritage tourism management, film-induced tourism and heritage interpretation – in order to contribute to the heritage management field and address this particular gap in knowledge. Relying on the method of semi-structured interviews with managers, guides and visitors at Rosslyn Chapel (RC) and Alnwick Castle (AC), this thesis provides a rich understanding of how heritage interpretation can address a range of management challenges at heritage sites where film-induced tourism has occurred. These heritage visitor attractions (HVAs) were specifically selected as case studies because they have played different roles in media products. Rosslyn Chapel (RC) was an actual place named in The Da Vinci Code (TDVC) book and then film, whereas Alnwick Castle (AC) served as a backdrop for the first two Harry Potter (HP) films. Findings of this research include a range of management challenges at both RC and AC, such as an increase in visitor numbers; seasonality issues; changes in visitor profile; revenue generation concerns; conservation, access and visitor experience; and the complex relationship between heritage management and tourism activities. The findings also reveal film-induced tourism's implications for heritage interpretation, such as visitors' varying expectations of heritage interpretation, changes to heritage interpretation as a result of film-induced tourism, and issues with commodification. These findings also demonstrate that film-induced tourism to some extent influenced visitors' preferences for heritage interpretation, though preferences differed from one visitor to another. This thesis argues that, in the context of film-induced tourism at HVAs, as is evident from the two case studies considered, heritage interpretation can be a valuable management tool and can also play a significant role in the quality of the visitors' experience.
Abstract:
Durbin, J. & Urquhart, C. (2003). Qualitative evaluation of KA24 (Knowledge Access 24). Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: Knowledge Access 24 (NHS)
Abstract:
On January 11, 2008, the National Institutes of Health ('NIH') adopted a revised Public Access Policy for peer-reviewed journal articles reporting research supported in whole or in part by NIH funds. Under the revised policy, the grantee shall ensure that a copy of the author's final manuscript, including any revisions made during the peer review process, be electronically submitted to the National Library of Medicine's PubMed Central ('PMC') archive and that the person submitting the manuscript will designate a time not later than 12 months after publication at which NIH may make the full text of the manuscript publicly accessible in PMC. NIH adopted this policy to implement a new statutory requirement under which: The Director of the National Institutes of Health shall require that all investigators funded by the NIH submit or have submitted for them to the National Library of Medicine's PubMed Central an electronic version of their final, peer-reviewed manuscripts upon acceptance for publication to be made publicly available no later than 12 months after the official date of publication: Provided, That the NIH shall implement the public access policy in a manner consistent with copyright law. This White Paper is written primarily for policymaking staff in universities and other institutional recipients of NIH support responsible for ensuring compliance with the Public Access Policy. The January 11, 2008, Public Access Policy imposes two new compliance mandates. First, the grantee must ensure proper manuscript submission. The version of the article to be submitted is the final version over which the author has control, which must include all revisions made after peer review. The statutory command directs that the manuscript be submitted to PMC 'upon acceptance for publication.' That is, the author's final manuscript should be submitted to PMC at the same time that it is sent to the publisher for final formatting and copy editing. Proper submission is a two-stage process. The electronic manuscript must first be submitted through a process that requires input of additional information concerning the article, the author(s), and the nature of NIH support for the research reported. NIH then formats the manuscript into a uniform, XML-based format used for PMC versions of articles. In the second stage of the submission process, NIH sends a notice to the Principal Investigator requesting that the PMC-formatted version be reviewed and approved. Only after such approval has the grantee's manuscript submission obligation been satisfied. Second, the grantee also has a distinct obligation to grant NIH copyright permission to make the manuscript publicly accessible through PMC not later than 12 months after the date of publication. This obligation is connected to manuscript submission because the author, or the person submitting the manuscript on the author's behalf, must have the necessary rights under copyright at the time of submission to give NIH the copyright permission it requires. This White Paper explains and analyzes only the scope of the grantee's copyright-related obligations under the revised Public Access Policy and suggests six options for compliance with that aspect of the grantee's obligation. Time is of the essence for NIH grantees. As a practical matter, the grantee should have a compliance process in place no later than April 7, 2008.
More specifically, the new Public Access Policy applies to any article accepted for publication on or after April 7, 2008 if the article arose under (1) an NIH Grant or Cooperative Agreement active in Fiscal Year 2008, (2) direct funding from an NIH Contract signed after April 7, 2008, (3) direct funding from the NIH Intramural Program, or (4) from an NIH employee. In addition, effective May 25, 2008, anyone submitting an application, proposal or progress report to the NIH must include the PMC reference number when citing articles arising from their NIH funded research. (This includes applications submitted to the NIH for the May 25, 2008 and subsequent due dates.) Conceptually, the compliance challenge that the Public Access Policy poses for grantees is easily described. The grantee must depend to some extent upon the author(s) to take the necessary actions to ensure that the grantee is in compliance with the Public Access Policy because the electronic manuscripts and the copyrights in those manuscripts are initially under the control of the author(s). As a result, any compliance option will require an explicit understanding between the author(s) and the grantee about how the manuscript and the copyright in the manuscript are managed. It is useful to conceptually keep separate the grantee's manuscript submission obligation from its copyright permission obligation because the compliance personnel concerned with manuscript management may differ from those responsible for overseeing the author's copyright management. With respect to copyright management, the grantee has the following six options: (1) rely on authors to manage copyright but also to request or to require that these authors take responsibility for amending publication agreements that call for transfer of too many rights to enable the author to grant NIH permission to make the manuscript publicly accessible ('the Public Access License'); (2) take a more active role in assisting authors in negotiating the scope of any copyright transfer to a publisher by (a) providing advice to authors concerning their negotiations or (b) by acting as the author's agent in such negotiations; (3) enter into a side agreement with NIH-funded authors that grants a non-exclusive copyright license to the grantee sufficient to grant NIH the Public Access License; (4) enter into a side agreement with NIH-funded authors that grants a non-exclusive copyright license to the grantee sufficient to grant NIH the Public Access License and also grants a license to the grantee to make certain uses of the article, including posting a copy in the grantee's publicly accessible digital archive or repository and authorizing the article to be used in connection with teaching by university faculty; (5) negotiate a more systematic and comprehensive agreement with the biomedical publishers to ensure either that the publisher has a binding obligation to submit the manuscript and to grant NIH permission to make the manuscript publicly accessible or that the author retains sufficient rights to do so; or (6) instruct NIH-funded authors to submit manuscripts only to journals with binding deposit agreements with NIH or to journals whose copyright agreements permit authors to retain sufficient rights to authorize NIH to make manuscripts publicly accessible.
Abstract:
The exploding demand for services like the World Wide Web reflects the potential that is presented by globally distributed information systems. The number of WWW servers world-wide has doubled every 3 to 5 months since 1993, outstripping even the growth of the Internet. At each of these self-managed sites, the Common Gateway Interface (CGI) and Hypertext Transfer Protocol (HTTP) already constitute a rudimentary basis for contributing local resources to remote collaborations. However, the Web has serious deficiencies that make it unsuited for use as a true medium for metacomputing --- the process of bringing hardware, software, and expertise from many geographically dispersed sources to bear on large-scale problems. These deficiencies are, paradoxically, the direct result of the very simple design principles that enabled its exponential growth. There are many symptoms of the problems exhibited by the Web: disk and network resources are consumed extravagantly; information search and discovery are difficult; protocols are aimed at data movement rather than task migration, and ignore the potential for distributing computation. However, all of these can be seen as aspects of a single problem: as a distributed system for metacomputing, the Web offers unpredictable performance and unreliable results. The goal of our project is to use the Web as a medium (within either the global Internet or an enterprise intranet) for metacomputing in a reliable way with performance guarantees. We attack this problem at four levels: (1) Resource Management Services: Globally distributed computing allows novel approaches to the old problems of performance guarantees and reliability. Our first set of ideas involves setting up a family of real-time resource management models organized by the Web Computing Framework with a standard Resource Management Interface (RMI), a Resource Registry, a Task Registry, and resource management protocols that allow resource needs and availability information to be collected and disseminated, so that a family of algorithms with varying computational precision and accuracy of representation can be chosen to meet real-time and reliability constraints. (2) Middleware Services: Complementary to techniques for allocating and scheduling available resources to serve application needs under real-time and reliability constraints, the second set of ideas aims at reducing communication latency, traffic congestion, server workload, etc. We develop customizable middleware services that exploit application characteristics in traffic analysis to drive new server/browser design strategies (e.g., exploiting the self-similarity of Web traffic), derive document access patterns via multiserver cooperation, and use them in speculative prefetching, document caching, and aggressive replication to reduce server load and bandwidth requirements. (3) Communication Infrastructure: To achieve any guarantee of quality of service or performance, one must get at the network layer that can provide the basic guarantees of bandwidth, latency, and reliability. Therefore, the third area is a set of new techniques in network service and protocol designs. (4) Object-Oriented Web Computing Framework: A useful resource management system must deal with job priority, fault-tolerance, quality of service, complex resources such as ATM channels, probabilistic models, etc., and models must be tailored to represent the best tradeoff for a particular setting.
This requires a family of models, organized within an object-oriented framework, because no one-size-fits-all approach is appropriate. This presents a software engineering challenge requiring integration of solutions at all levels: algorithms, models, protocols, and profiling and monitoring tools. The framework captures the abstract class interfaces of the collection of cooperating components, but allows the concretization of each component to be driven by the requirements of a specific approach and environment.
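To make the Resource Registry and Resource Management Interface (RMI) idea above concrete, here is a minimal sketch; the class, field and method names (Resource, ResourceRegistry, advertise, match) are illustrative assumptions and not part of the project's actual framework.

```python
# Minimal sketch of a Resource Registry behind a Resource Management Interface.
# All names and fields are illustrative assumptions, not the project's actual API.
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    host: str
    capacity: float       # e.g. available CPU share, 0.0-1.0
    reliability: float    # estimated probability the resource completes a task

@dataclass
class ResourceRegistry:
    resources: dict = field(default_factory=dict)

    def advertise(self, res: Resource):
        """A site publishes (or refreshes) the availability of a local resource."""
        self.resources[res.name] = res

    def match(self, min_capacity: float, min_reliability: float):
        """Return resources satisfying the caller's real-time and reliability constraints."""
        return [r for r in self.resources.values()
                if r.capacity >= min_capacity and r.reliability >= min_reliability]

registry = ResourceRegistry()
registry.advertise(Resource("cluster-a", "a.example.org", capacity=0.6, reliability=0.99))
registry.advertise(Resource("ws-17", "b.example.org", capacity=0.1, reliability=0.80))
print([r.name for r in registry.match(min_capacity=0.5, min_reliability=0.95)])
```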
Abstract:
Existing Building/Energy Management Systems (BMS/EMS) fail to convey holistic performance to the building manager. A 20% reduction in energy consumption can be achieved by efficiently operated buildings compared with current practice. However, in the majority of buildings, occupant comfort and energy consumption analysis is primarily restricted by the available sensor and meter data. Installation of a continuous monitoring process can significantly improve the building systems' performance. We present WSN-BMDS, an IP-based wireless sensor network building monitoring and diagnostic system. The main focus of WSN-BMDS is to obtain a much higher degree of information about the building operation than current BMSs are able to provide. Our system integrates a heterogeneous set of wireless sensor nodes with IEEE 802.11 backbone routers and the Global Sensor Network (GSN) web server. Sensing data is stored in a database at the back office via the UDP protocol and can be accessed over the Internet using GSN. Through this demonstration, we show that WSN-BMDS provides accurate measurements of air temperature, air humidity, light, and energy consumption for particular rooms in our target building. Our interactive graphical user interface provides a user-friendly environment for showing the live network topology, monitoring network statistics, and performing run-time management actions on the network. We also demonstrate actuation by changing the artificial light level in one of the rooms.
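The sensing data path described above (sensor node, UDP transport, back-office database) can be sketched roughly as follows; the message format, port number and table layout are assumptions made for illustration and are not the WSN-BMDS implementation.

```python
# Illustrative sketch of the data path: a node sends one reading over UDP and the
# back office stores it in a database. Message format, port and schema are assumed.
import json
import socket
import sqlite3

PORT = 9000  # assumed port for the illustration

def send_reading(room, temperature_c, humidity_pct, lux, addr=("127.0.0.1", PORT)):
    """Sensor-node side: serialise one reading and push it to the back office over UDP."""
    msg = json.dumps({"room": room, "t": temperature_c, "rh": humidity_pct, "lux": lux})
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg.encode(), addr)

# Back-office side: bind the receiving socket first so the datagram is not lost,
# then receive one reading and append it to the readings table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (room TEXT, temperature REAL, humidity REAL, lux REAL)")

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as recv_sock:
    recv_sock.bind(("127.0.0.1", PORT))
    send_reading("room-214", 21.7, 43.0, 310)
    data, _ = recv_sock.recvfrom(1024)
    r = json.loads(data)
    db.execute("INSERT INTO readings VALUES (?, ?, ?, ?)", (r["room"], r["t"], r["rh"], r["lux"]))
    db.commit()

print(db.execute("SELECT * FROM readings").fetchall())
```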
Abstract:
Buildings consume 40% of Ireland's total annual energy, translating to €3.5 billion (2004). The EPBD directive (effective January 2003) places an onus on all member states to rate the energy performance of all buildings in excess of 50 m². Energy and environmental performance management systems do not exist for residential buildings and, for non-residential buildings, consist of an ad-hoc integration of wired building management systems and Monitoring & Targeting systems. These systems are unsophisticated and do not easily lend themselves to cost-effective retrofit or integration with other enterprise management systems. It is commonly agreed that a 15-40% reduction in building energy consumption is achievable by efficiently operating buildings when compared with typical practice. Existing research has identified that the level of information available to building managers from existing Building Management Systems and Environmental Monitoring Systems (BMS/EMS) is insufficient to perform the required performance-based building assessment. The cost of installing additional sensors and meters is extremely high, primarily due to the cost of wiring and the labour required. From this perspective, wireless sensor technology provides the capability to deliver reliable sensor data at the temporal and spatial granularity required for building energy management. In this paper, a wireless sensor network mote hardware design and implementation is presented for a building energy management application. Appropriate sensors were selected and interfaced with the developed system based on user requirements to meet both the building monitoring and metering requirements. Besides the sensing capability, actuation and interfacing to external meters/sensors are provided to perform the management control and data recording tasks associated with minimising energy consumption in the built environment and with developing appropriate Building Information Models (BIM) to enable the design and development of energy-efficient spaces.
Abstract:
Comfort is, in essence, satisfaction with the environment, and with respect to the indoor environment it is primarily satisfaction with the thermal conditions and air quality. Improving comfort has social, health and economic benefits, and is more financially significant than any other building cost. Despite this, comfort is not strictly managed throughout the building lifecycle. This is mainly due to the lack of an appropriate system to adequately manage comfort knowledge through the construction process into operation. Previous proposals to improve knowledge management have not been successfully adopted by the construction industry. To address this, the BabySteps approach was devised. BabySteps is an approach, proposed by this research, which states that for an innovation to be adopted by the industry it must be implementable through a number of small changes. This research proposes that improving the management of comfort knowledge will improve comfort. ComMet is a new methodology proposed by this research that manages comfort knowledge. It enables comfort knowledge to be captured, stored and accessed throughout the building life-cycle, thereby allowing it to be re-used in future stages of the building project and in future projects. It does this using the following: Comfort Performances – These are simplified numerical representations of the comfort of the indoor environment. Comfort Performances quantify the comfort at each stage of the building life-cycle using standard comfort metrics. Comfort Ratings – These are a means of classifying the comfort conditions of the indoor environment according to an appropriate standard. Comfort Ratings are generated by comparing different Comfort Performances, and they provide additional information relating to the comfort conditions of the indoor environment which is not readily determined from the individual Comfort Performances. Comfort History – This is a continuous descriptive record of the comfort throughout the project, with a focus on documenting the items and activities, proposed and implemented, which could potentially affect comfort. Each aspect of the Comfort History is linked to the relevant comfort entity it references. These three components create a comprehensive record of the comfort throughout the building lifecycle. They are then stored and made available in a common format in a central location, which allows them to be re-used ad infinitum. The LCMS System was developed to implement the ComMet methodology. It uses current and emerging technologies to capture, store and allow easy access to comfort knowledge as specified by ComMet. LCMS is an IT system that is a combination of the following six components: Building Standards; Modelling & Simulation; Physical Measurement through the specially developed Egg-Whisk (Wireless Sensor) Network; Data Manipulation; Information Recording; and Knowledge Storage and Access. Results from a test-case application of the LCMS system - an existing office room at a research facility - highlighted that while some aspects of comfort were being maintained, the building's environment was not in compliance with the acceptable levels stipulated by the relevant building standards. The implementation of ComMet, through LCMS, demonstrates how comfort, typically only considered during early design, can be measured and managed appropriately through systematic application of the methodology as a means of ensuring a healthy internal environment in the building.
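A rough sketch of how ComMet's three record types might be represented is given below; the field names, the metrics chosen and the rating rule are assumptions made for illustration, not the LCMS schema described in the thesis.

```python
# Illustrative sketch of ComMet's Comfort Performance, Comfort Rating and Comfort History.
# Field names, metrics and thresholds are assumptions, not the LCMS schema.
from dataclasses import dataclass
from typing import List

@dataclass
class ComfortPerformance:
    stage: str        # building life-cycle stage, e.g. "design" or "operation"
    pmv: float        # a standard comfort metric (Predicted Mean Vote)
    co2_ppm: float    # indoor air quality metric

@dataclass
class ComfortHistoryEntry:
    stage: str
    note: str         # item or activity that could affect comfort

def comfort_rating(design: ComfortPerformance, operation: ComfortPerformance) -> str:
    """Classify operation against the design-stage performance (assumed rule)."""
    if abs(operation.pmv) <= abs(design.pmv) and operation.co2_ppm <= design.co2_ppm:
        return "meets design intent"
    return "below design intent"

design = ComfortPerformance("design", pmv=0.3, co2_ppm=800)
operation = ComfortPerformance("operation", pmv=0.7, co2_ppm=950)
history: List[ComfortHistoryEntry] = [
    ComfortHistoryEntry("construction", "glazing specification changed from design"),
]
print(comfort_rating(design, operation))
```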
Abstract:
Open environments involve distributed entities interacting with each other in an open manner. Many distributed entities are unknown to each other but need to collaborate and share resources in a secure fashion. Usually resource owners alone decide who is trusted to access their resources. Since resource owners in open environments do not have a complete picture of all trusted entities, trust management frameworks are used to ensure that only authorized entities will access requested resources. Every trust management system has limitations, and the limitations can be exploited by malicious entities. One vulnerability is due to the lack of globally unique interpretation for permission specifications. This limitation means that a malicious entity which receives a permission in one domain may misuse the permission in another domain via some deceptive but apparently authorized route; this malicious behaviour is called subterfuge. This thesis develops a secure approach, Subterfuge Safe Trust Management (SSTM), that prevents subterfuge by malicious entities. SSTM employs the Subterfuge Safe Authorization Language (SSAL) which uses the idea of a local permission with a globally unique interpretation (localPermission) to resolve the misinterpretation of permissions. We model and implement SSAL with an ontology-based approach, SSALO, which provides a generic representation for knowledge related to the SSAL-based security policy. SSALO enables integration of heterogeneous security policies which is useful for secure cooperation among principals in open environments where each principal may have a different security policy with different implementation. The other advantage of an ontology-based approach is the Open World Assumption, whereby reasoning over an existing security policy is easily extended to include further security policies that might be discovered in an open distributed environment. We add two extra SSAL rules to support dynamic coalition formation and secure cooperation among coalitions. Secure federation of cloud computing platforms and secure federation of XMPP servers are presented as case studies of SSTM. The results show that SSTM provides robust accountability for the use of permissions in federation. It is also shown that SSAL is a suitable policy language to express the subterfuge-safe policy statements due to its well-defined semantics, ease of use, and integrability.
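The localPermission idea described above (a permission bound to the globally unique identifier of its issuing domain, so that it cannot be reinterpreted in another domain) can be sketched as follows; the representation is an illustrative assumption, not SSAL's actual syntax.

```python
# Sketch of the localPermission idea: a permission is only meaningful together with the
# globally unique identifier of the issuing domain, so a permission granted in one domain
# cannot be (mis)interpreted in another. Illustration only, not SSAL syntax.
from dataclasses import dataclass

@dataclass(frozen=True)
class LocalPermission:
    issuer_domain: str    # globally unique issuer identifier, e.g. a DNS name
    action: str           # e.g. "read"
    resource: str         # e.g. "projectX/reports"

def authorize(granted: set, requested: LocalPermission) -> bool:
    """Authorise only if the exact (issuer, action, resource) triple was granted."""
    return requested in granted

granted = {LocalPermission("alpha.example.org", "read", "projectX/reports")}

# The same action/resource pair presented under a different issuer domain is rejected,
# which is the subterfuge scenario the thesis aims to prevent.
print(authorize(granted, LocalPermission("alpha.example.org", "read", "projectX/reports")))  # True
print(authorize(granted, LocalPermission("beta.example.org", "read", "projectX/reports")))   # False
```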
Abstract:
The recognition that early breast cancer is a spectrum of diseases, each requiring a specific systemic therapy, guided the 13th St Gallen International Breast Cancer Consensus Conference [1]. The meeting assembled 3600 participants from nearly 90 countries worldwide. Educational content was centred on the primary and multidisciplinary treatment approach to early breast cancer. The meeting culminated on the final day with the St Gallen Breast Cancer Treatment Consensus, established by 40-50 of the world's most experienced opinion leaders in the field of breast cancer treatment. The major issue that arose during the consensus conference was the increasing gap between what is theoretically feasible in patient risk stratification, in treatment, and in daily practice management. We need to find new paths to bring innovations into clinical research and daily practice. To ensure that continued innovation meets the needs of patients, the therapeutic alliance between patients and academic-led research should be extended to include relevant pharmaceutical companies and drug regulators in a unique effort to bring innovation into clinical practice. We need to bring together major players from the world of breast cancer research to map out a coordinated strategy on an international scale, to address disease fragmentation, to share financial resources, and to integrate scientific data. The final goal will be to improve access to an affordable, best standard of care for all patients in each country.