Abstract:
The effects of high-pressure processing (HPP) in conjunction with an essential oil-based active packaging on the surface of ready-to-eat (RTE) chicken breast were investigated as a post-processing listericidal treatment. Three different treatments were used, and all samples were vacuum packed: (i) HPP at 500 MPa for 1 min (control), (ii) active packaging based on coriander essential oil, and (iii) active packaging and HPP. When applied individually, active packaging and pressurisation delayed the growth of Listeria monocytogenes. The combination of HPP and active packaging resulted in a synergistic effect, reducing the counts of the pathogen below the detection limit throughout 60 days of storage at 4 °C. However, when these samples were stored at 8 °C, growth did occur, but again a delay in growth was observed. The effects on colour and lipid oxidation were also studied during storage and were not significantly affected by the treatments. Active packaging followed by in-package pressure treatment could be a useful approach to reduce the risk of L. monocytogenes in cooked chicken without impairing its quality. Industrial relevance: Ready-to-eat products are of great economic importance to the industry. However, they have been implicated in several outbreaks of listeriosis. Therefore, effective ways to reduce the risk from this pathogenic microorganism can be very attractive for manufacturers. This study showed that the use of active packaging followed by HPP can enhance the listericidal efficiency of the treatment while using lower pressure levels, and thus having limited effects on colour and lipid oxidation of RTE chicken breast.
Abstract:
The principal feature in the evolution of the internet has been its ever-growing reach to include old and young, rich and poor. The internet’s ever-encroaching presence has transported it from our desktop to our pocket and into our glasses. This is illustrated in the Internet Society Questionnaire on Multistakeholder Governance, which found that the main factors affecting change in the Internet governance landscape were more users online from more countries and the influence of the internet over daily life. The omnipresence of the internet is self-perpetuating; its usefulness grows with every new user and every new piece of data uploaded. The advent of social media and the creation of a virtual presence for each of us, even when we are not physically present or ‘logged on’, means we are fast approaching the point where we are all connected, to everyone else, all the time. We have moved far beyond the point where governments can claim to represent our views, which evolve constantly rather than being measured in electoral cycles.
The shift that has seen citizens become creators of content rather than mere consumers of it has undermined the centralist view of democracy and created an environment of wiki democracy or crowd-sourced democracy. This is at the heart of what is generally known as Web 2.0, and is widely considered to be a positive, democratising force. However, we argue, there are worrying elements here too. Government does not always deliver on the promise of the networked society as it involves citizens and others in the process of government. In addition, a number of key internet companies have emerged as powerful intermediaries, harnessing the efforts of the many and re-using and re-selling the products and data of content providers in the Web 2.0 environment. A discourse about openness and transparency has been offered as a democratising rationale, but much of this masks an uneven relationship where the value of online activity flows not to the creators of content but to those who own the channels of communication and the metadata that they produce.
In this context the state is just one stakeholder in the mix of influencers and opinion formers impacting on our behaviours, and indeed on our ideas of what is public. The questions of what it means to create or own something, and of how all these new relationships are to be ordered and governed, are subject to fundamental change. While government can often appear slow, unwieldy and even irrelevant in much of this context, there remains a need for some sort of political control to deal with the challenges that technology creates but cannot by itself control. In order for the internet to continue to evolve successfully, both technically and socially, it is critical that the multistakeholder nature of internet governance be understood and acknowledged, and perhaps, to an extent, re-balanced. Stakeholders can no longer be classified under the broad headings of government, private sector and civil society, with their roles seen as some sort of benign and open co-production. Each user of the internet has a stake in its efficacy, and each, by their presence and participation, is contributing to the experience, positive or negative, of other users, as well as to the commercial success or otherwise of various online service providers. However, stakeholders have neither an equal role nor an equal share. The unequal relationship between the providers of content and those who simply package up and transmit that content, while harvesting the valuable data thus produced, needs to be addressed. Arguably this suggests a role for government that involves it moving beyond simply celebrating and facilitating the ongoing technological revolution. This paper reviews the shifting landscape of stakeholders and their contribution to the efficacy of the internet. It critically evaluates the primacy of the individual as the key stakeholder and their supposed developing empowerment within the ever-growing sea of data. It also looks at the role of individuals in wider governance roles.
Governments in a number of jurisdictions have sought to engage, consult or empower citizens through technology, but in general these attempts have had little appeal. Citizens have been too busy engaging, consulting and empowering each other to pay much attention to what their governments are up to. George Orwell’s view of the future has not come to pass; in fact, the internet has ensured the opposite. There is no Big Brother, but we are all looking over each other’s shoulders all the time, while a number of big corporations capture and sell all this collective endeavour back to us.
Abstract:
RATIONALE, AIMS AND OBJECTIVES: Health care services offered to the public should be based on the best available evidence. We aimed to explore pharmacy tutors' and trainees' views on the importance of evidence when making decisions about over-the-counter (OTC) medicines and also to investigate whether the tutor influenced the trainee in practice.
METHODS: Following ethical approval and piloting, semi-structured interviews were conducted with pharmacy graduates (trainees) and pharmacist tutors. Transcribed interview data were entered into the NVivo software package (version 10), coded and analysed via thematic analysis.
RESULTS: Twelve trainees (five males, seven females) and 11 tutors (five males, six females) participated. The main themes that emerged were (in)consistency and contradiction, confidence, acculturation, and continuation and perpetuation. Despite participants' awareness of its importance and potential benefits, an evidence-based approach did not seem to be routinely or consistently implemented in practice. Confidence in products was largely derived from personal use and patient feedback. A lack of discussion about evidence was justified on the basis of not wanting to lessen patient confidence in requested product(s) or possibly negating the placebo effect. Trainees became acculturated to 'real-life' practice; university teaching and evidence were deemed less relevant than meeting customer expectations. The tutor's actions were mirrored by their trainee, resulting in continuation and perpetuation of the same professional attitudes and behaviours.
CONCLUSIONS: Evidence appeared to have limited influence on OTC decision making. The tutor played a key role in the trainee's professional development. More work could be performed to investigate how evidence can be regarded as relevant and something that is consistently implemented in practice.
Abstract:
Bottom-hinged oscillating wave surge converters are known to be an efficient method of extracting power from ocean waves. The present work deals with experimental and numerical studies of wave interactions with an oscillating wave surge converter. It focuses on two aspects: (1) viscous effects on device performance under normal operating conditions; and (2) effects of slamming on device survivability under extreme conditions. Part I deals with the viscous effects, while the extreme sea conditions will be presented in Part II. The numerical simulations are performed using the commercial CFD package ANSYS FLUENT. The comparison between numerical results and experimental measurements shows excellent agreement in terms of capturing local features of the flow as well as the dynamics of the device. A series of simulations is conducted with various wave conditions, flap configurations and model scales to investigate the viscous and scaling effects on the device. It is found that the diffraction/radiation effects dominate the device motion and that the viscous effects are negligible for wide flaps.
Abstract:
OBJECTIVES: The aim of this study was to investigate whether a minimally invasive oral health package with the use of atraumatic restorative treatment (ART) or a conventional restorative technique (CT) would result in any perceived benefit from the patients' perspective, and whether there would be any difference between the two treatment groups.
MATERIALS AND METHODS: In this randomised clinical trial, 99 independently living older adults (65-90 years) with carious lesions were randomly allocated to receive either ART or conventional restorations using minimally invasive/intervention dentistry (MID) principles. Patients completed an Oral Health Impact Profile (OHIP)-14 questionnaire before and 2 months after treatment. They were also asked to complete a global transition question about their oral health after treatment.
RESULTS: At baseline, the mean OHIP-14 scores recorded were 7.34 (ART) and 7.44 (CT). Two months after treatment intervention, 90 patients answered the OHIP-14 and the mean scores were 7.23 (not significant (n.s.)) and 10.38 (n.s.) for the ART and CT groups, respectively. Overall, 75.5 % of patients stated that their oral health was better compared to the beginning of treatment.
CONCLUSIONS: Although not shown by the OHIP-14, patients perceived an improvement in their overall oral status after treatment, as demonstrated by the global transition ratings in both groups.
CLINICAL RELEVANCE: Dental treatment using minimally invasive techniques might be a good alternative to treat older individuals, and it can improve their oral health both objectively and subjectively.
Abstract:
The new Food Information Regulation (1169/2011) dictates that in a refined vegetable oil blend the type of oil must be clearly identified on the package, in contrast with current practice, where it is labelled under the generic and often misleading term “vegetable oil”. With increased consumer awareness of food authenticity, as shown in the recent food scandal involving horsemeat in beef products, the identification of the origin of species in food products becomes increasingly relevant. Palm oil is used extensively in food manufacturing, and as global demand increases, producing countries suffer from the aftermath of intensive agriculture. Although it represents only a small portion of global production, sustainable palm oil is in great demand from consumers and industry. It is therefore of interest to detect the presence of palm oil in food products, as consumers have the right to know whether it is present, mainly from an ethical point of view. Apart from palm oil and its derivatives, rapeseed oil and sunflower oil are also included. With DNA-based methods, the gold standard for the detection of food authenticity and species recognition, deemed unsuitable for this analytical problem, the focus is inevitably drawn to chromatographic and spectroscopic methods. Both chromatographic (such as GC-FID and LC-MS) and spectroscopic (FT-IR, Raman, NIR) methods are relevant. Previous attempts have not shown promising results, owing to oils’ natural variation in composition and complex chemical signals, but the suggested two-step analytical procedure is a promising approach with very good initial results.
Abstract:
Low-velocity impact damage can drastically reduce the residual mechanical properties of a composite structure even when the impact damage is barely visible. The ability to computationally predict the extent of damage and compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant development time and cost penalties. A three-dimensional damage model, to predict both low-velocity impact damage and CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The virtual tests were executed in two steps, one to capture the impact damage and the other to predict the CAI strength. The observed intra-laminar damage features and delamination damage area, as well as the residual strength, are discussed. It is shown that the predicted results for impact damage and CAI strength correlated well with experimental testing.
Abstract:
Midwifery educators are challenged to produce registrants who are fit for practice at the point of registration with competence at the heart of this expectation. In addition to achieving expertise in normal pregnancy, it is recognised that students need to have the skills of critical decision making where normal processes become adversely affected.
An evaluation was undertaken with final year direct entry midwifery students using questionnaires and focus group interviews to determine whether simulated learning, such as the Practical Obstetric Multi-Professional Training (PROMPT) package, for emergency obstetric training would enhance self-efficacy and confidence levels in preparation for post-registration practice. The main themes that emerged from the study indicate that this style of learning increased midwifery students’ feelings of self-efficacy; highlighted the importance of a safe learning environment; reduced their anxiety regarding their ability to make decisions in clinical practice and reinforced confidence in their level of knowledge.
Abstract:
A parametric regression model for right-censored data, with a log-linear median regression function and a transformation applied to both the response and regression parts, named the parametric Transform-Both-Sides (TBS) model, is presented. The TBS model has a parameter that handles data asymmetry while allowing various distributions for the error, as long as they are unimodal symmetric distributions centered at zero. The discussion focuses on the estimation procedure with five important error distributions (normal, double-exponential, Student's t, Cauchy and logistic) and presents properties, associated functions (that is, survival and hazard functions) and estimation methods based on maximum likelihood and on the Bayesian paradigm. These procedures are implemented in TBSSurvival, an open-source, fully documented R package. The use of the package is illustrated and the performance of the model is analyzed using both simulated and real data sets.
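A defining property behind the TBS formulation is that a monotone transformation applied to both sides, combined with a symmetric zero-centered error, preserves the median: the untransformed regression function remains the median of the response. The sketch below illustrates this by simulation with a signed power transform and made-up parameters; it is an illustrative toy, not the TBSSurvival implementation:

```python
import numpy as np

def tbs_transform(u, lam):
    # Signed power transform, monotone for lam > 0:
    # g(u) = sign(u) * |u|**lam  (an assumed, common choice)
    return np.sign(u) * np.abs(u) ** lam

rng = np.random.default_rng(0)
lam, beta0 = 0.5, 2.0

# Model: g(log T) = g(beta0) + eps, with eps symmetric about zero.
eps = rng.normal(0.0, 0.3, size=200_000)
# Invert the transform (the inverse of g(., lam) is g(., 1/lam)):
log_t = tbs_transform(tbs_transform(beta0, lam) + eps, 1.0 / lam)

# Because g is monotone and median(eps) = 0, median(log T) = beta0.
print(np.median(log_t))  # close to 2.0
```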
Abstract:
Background: Pedigree reconstruction using genetic analysis provides a useful means to estimate fundamental population biology parameters relating to population demography, trait heritability and individual fitness when combined with other sources of data. However, there remain limitations to pedigree reconstruction in wild populations, particularly in systems where parent-offspring relationships cannot be directly observed, there is incomplete sampling of individuals, or molecular parentage inference relies on low-quality DNA from archived material. While much can still be inferred from incomplete or sparse pedigrees, it is crucial to evaluate the quality and power of the available genetic information before testing specific biological hypotheses. Here, we used microsatellite markers to reconstruct a multi-generation pedigree of wild Atlantic salmon (Salmo salar L.) using archived scale samples collected with a total trapping system within a river over a 10-year period. Using a simulation-based approach, we determined the optimal microsatellite marker number for accurate parentage assignment, and evaluated the power of the resulting partial pedigree to investigate important evolutionary and quantitative genetic characteristics of salmon in the system.
Results: We show that at least 20 microsatellites (average 12 alleles/locus) are required to maximise parentage assignment and to improve the power to estimate reproductive success and heritability in this study system. We also show that 1.5-fold differences can be detected between groups simulated to have differing reproductive success, and that it is possible to detect moderate heritability values for continuous traits (h² ≈ 0.40) with more than 80% power when using 28 moderately to highly polymorphic markers.
Conclusion: The methodologies and work flow described provide a robust approach for evaluating archived samples for pedigree-based research, even where only a proportion of the total population is sampled. The results demonstrate the feasibility of pedigree-based studies to address challenging ecological and evolutionary questions in free-living populations, where genealogies can be traced only using molecular tools, and that significant increases in pedigree assignment power can be achieved by using higher numbers of markers.
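The central power argument, that more markers give a higher probability of excluding non-parents and hence more accurate assignment, can be sketched with a toy Monte-Carlo simulation. The equifrequent alleles and the simplified shares-an-allele compatibility rule below are assumptions for illustration, not the study's assignment pipeline:

```python
import random

def simulate_exclusion_power(n_loci=20, n_alleles=12, trials=2000, seed=1):
    """Fraction of unrelated candidate 'parents' excluded (genotype
    incompatible at >= 1 locus) against a simulated offspring, assuming
    equifrequent alleles at every locus."""
    rng = random.Random(seed)
    alleles = range(n_alleles)
    excluded = 0
    for _ in range(trials):
        compatible_at_all_loci = True
        for _ in range(n_loci):
            parent = (rng.choice(alleles), rng.choice(alleles))
            # Offspring inherits one allele from the true parent.
            offspring = (rng.choice(parent), rng.choice(alleles))
            candidate = (rng.choice(alleles), rng.choice(alleles))
            # Simplified rule: compatible only if candidate shares an allele.
            if not set(offspring) & set(candidate):
                compatible_at_all_loci = False
                break
        if not compatible_at_all_loci:
            excluded += 1
    return excluded / trials
```

With 20 loci of 12 alleles each, virtually every unrelated candidate is excluded, whereas with only 2 loci a noticeable fraction remains compatible by chance, which is the intuition behind the 20-marker recommendation.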
Abstract:
The injection stretch blow moulding process involves the inflation and stretching of a hot preform into a mould to form bottles. A critical process variable and an essential input for process simulations is the rate of pressure increase within the preform during forming, which is regulated by an air flow restrictor valve. The paper describes a set of experiments for measuring the air flow rate within an industrial ISBM machine and the subsequent modelling of it with the FEA package ABAQUS. Two rigid containers were inserted into a Sidel SBO1 blow moulding machine and subjected to different supply pressures and air flow restrictor settings. The pressure and air temperature were recorded for each experiment, enabling the mass flow rate of air to be determined along with an important machine characteristic known as the ‘dead volume’. The experimental setup was simulated within the commercial FEA package ABAQUS/Explicit using a combination of structural, fluid and fluid link elements that idealize the air flowing through an orifice as an ideal gas under isothermal conditions. Results from experiment and simulation are compared and show good correlation.
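As a rough back-of-the-envelope companion to the orifice idealisation described above, the textbook choked-flow relation for an ideal gas shows how supply pressure and orifice area set the limiting mass flow rate. The discharge coefficient and example values are assumptions, and this is not the paper's ABAQUS fluid-link model:

```python
import math

GAMMA = 1.4    # ratio of specific heats for air
R_AIR = 287.0  # specific gas constant for air, J/(kg K)

def choked_mass_flow(p0, t0, area, cd=0.8):
    """Choked (sonic) mass flow of air through an orifice, in kg/s.
    p0: upstream stagnation pressure (Pa); t0: stagnation temperature (K);
    area: orifice area (m^2); cd: assumed discharge coefficient."""
    crit = (2.0 / (GAMMA + 1.0)) ** ((GAMMA + 1.0) / (2.0 * (GAMMA - 1.0)))
    return cd * area * p0 * math.sqrt(GAMMA / (R_AIR * t0)) * crit
```

Under choked conditions the mass flow rate is proportional to the supply pressure and the orifice area, which is why the supply pressure and restrictor setting together determine the rate of pressure rise in the preform.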
Abstract:
Objectives: This study aimed to gather data on the nutritional status of older patients attending Cork University Dental School and Hospital for treatment in the Restorative Department. Information was also collected about the medical status of the patients including the prevalence of self-reported xerostomia.
Methods: Data was collected by a self-completion questionnaire followed by a brief clinical examination. Nutritional status was measured using the short version of the Mini Nutritional Assessment (MNA), which recorded patients’ Body Mass Index (BMI). The MNA consists of 6 parameters (including questions relating to patients’ history and anthropometric data) with a maximum total of 14 points. Scores of 12-14 indicate “normal nutritional status”, whilst those between 8 and 11 indicate a patient “at risk of malnutrition”. Scores lower than 8 indicate a patient who is “malnourished”. All patients attending Cork University Dental School and Hospital aged 65 years and older were invited to participate in the study.
Results: A total of 22 subjects participated in this study. Twelve patients were partially dentate and 10 were edentulous. The results from the MNA indicate that 11 patients were of “normal nutritional status”, with the other 11 identified as being “at risk of malnutrition”. None of the subjects were “malnourished”. Edentate patients generally recorded lower MNA scores than partially dentate patients. In total, 9 patients reported experiencing xerostomia, with 8 indicating that they needed to sip liquids to aid swallowing, but only 3 had difficulty swallowing food.
Conclusion: This small study indicates that a number of the older patients attending Cork University Dental School and Hospital for dental care may be “at risk of malnutrition”. These findings suggest that nutritional advice and dental care should both be included in an overall package of care for older patients.
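The MNA short-form cut-offs quoted in the Methods can be written as a small classification helper; this is a sketch based only on the score bands given above:

```python
def classify_mna_sf(score: int) -> str:
    """Map an MNA short-form score (0-14) to the nutritional-status
    band described in the Methods: 12-14 normal, 8-11 at risk of
    malnutrition, below 8 malnourished."""
    if not 0 <= score <= 14:
        raise ValueError("MNA-SF scores range from 0 to 14")
    if score >= 12:
        return "normal nutritional status"
    if score >= 8:
        return "at risk of malnutrition"
    return "malnourished"
```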
Abstract:
Low-velocity impact damage can drastically reduce the residual strength of a composite structure even when the damage is barely visible. The ability to computationally predict the extent of damage and compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant time and cost penalties. A high-fidelity three-dimensional composite damage model, to predict both low-velocity impact damage and CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The intralaminar damage model component accounts for physically-based tensile and compressive failure mechanisms of the fibres and matrix when subjected to a three-dimensional stress state. Cohesive behaviour was employed to model the interlaminar failure between plies, with a bi-linear traction–separation law capturing damage onset and subsequent damage evolution. The virtual tests, set up in ABAQUS/Explicit, were executed in three steps: one to capture the impact damage, the second to stabilize the specimen by imposing the new boundary conditions required for compression testing, and the third to predict the CAI strength. The observed intralaminar damage features and delamination damage area, as well as the residual strength, are discussed. It is shown that the predicted results for impact damage and CAI strength correlated well with experimental testing, without the need for model calibration, which is often required with other damage models.
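The bi-linear traction-separation law mentioned above has a standard closed form: a linear-elastic rise to an interface strength t0, then linear softening such that the area under the curve equals the fracture toughness Gc. The sketch below implements a mode-I version with assumed, uncalibrated property values, not the paper's material data:

```python
def bilinear_traction(delta, K=1.0e6, t0=60.0, Gc=0.5):
    """Mode-I bilinear traction-separation response.
    delta: separation (mm); K: penalty stiffness (N/mm^3, assumed);
    t0: interface strength (N/mm^2, assumed); Gc: fracture toughness
    (N/mm, assumed). Returns the traction in N/mm^2."""
    delta0 = t0 / K          # separation at damage onset
    deltaf = 2.0 * Gc / t0   # separation at complete failure (area = Gc)
    if delta <= delta0:
        return K * delta     # undamaged linear-elastic branch
    if delta >= deltaf:
        return 0.0           # fully debonded
    # Scalar damage variable giving linear softening between delta0 and deltaf.
    d = deltaf * (delta - delta0) / (delta * (deltaf - delta0))
    return (1.0 - d) * K * delta
```

The traction peaks at t0 when the separation reaches delta0 and decays linearly to zero at deltaf, so the energy dissipated per unit area is exactly Gc.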
Abstract:
Energies and lifetimes are reported for the lowest 375 levels of five Br-like ions, namely Sr IV, Y V, Zr VI, Nb VII, and Mo VIII, mostly belonging to the 4s<sup>2</sup>4p<sup>5</sup>, 4s<sup>2</sup>4p<sup>4</sup>4ℓ, 4s4p<sup>6</sup>, 4s<sup>2</sup>4p<sup>4</sup>5ℓ, 4s<sup>2</sup>4p<sup>3</sup>4d<sup>2</sup>, 4s4p<sup>5</sup>4ℓ, and 4s4p<sup>5</sup>5ℓ configurations. Extensive configuration interaction has been included, and the general-purpose relativistic atomic structure package (GRASP) has been adopted for the calculations. Additionally, radiative rates are listed among these levels for all E1, E2, M1, and M2 transitions. From a comparison with the measurements, the majority of our energy levels are assessed to be accurate to better than 2%, although discrepancies between theory and experiment for a few are up to 6%. An accuracy assessment of the calculated radiative rates (and lifetimes) is more difficult, because no prior results exist for these ions.
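The lifetimes such calculations report follow directly from the radiative rates: the lifetime of an excited level is the inverse of the sum of the A-values for all downward transitions from it. A minimal sketch with made-up rates:

```python
def radiative_lifetime(a_values):
    """Lifetime (s) of an excited level: tau = 1 / sum_j A_ij,
    where a_values are the transition rates A_ij in s^-1."""
    total = sum(a_values)
    if total <= 0.0:
        raise ValueError("need at least one positive transition rate")
    return 1.0 / total

# Illustrative (invented) A-values for one level, in s^-1:
tau = radiative_lifetime([2.0e8, 5.0e7, 1.0e7])  # 1 / 2.6e8 s
```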