167 results for: Odd log-logistic generalized half-normal distribution


Relevance: 20.00%

Abstract:

In this paper, we define and present a comprehensive classification of user intent for Web searching. The classification consists of three hierarchical levels of informational, navigational, and transactional intent. After deriving attributes of each, we then developed a software application that automatically classified queries using a Web search engine log of over a million and a half queries submitted by several hundred thousand users. Our findings show that more than 80% of Web queries are informational in nature, with about 10% each being navigational and transactional. In order to validate the accuracy of our algorithm, we manually coded 400 queries and compared the results from this manual classification to the results determined by the automated method. This comparison showed that the automatic classification has an accuracy of 74%. Of the remaining 25% of the queries, the user intent is vague or multi-faceted, pointing to the need for probabilistic classification. We discuss how search engines can use knowledge of user intent to provide more targeted and relevant results in Web searching.
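The automated classification described above is attribute-based; a minimal rule-based sketch is shown below. The hint lists are illustrative placeholders, not the attribute sets actually derived in the study, and real classifiers of this kind use many more signals (query length, URL fragments, verb lists).

```python
# Hypothetical attribute lists for illustration only; the study derived
# its own attributes for each intent class.
NAVIGATIONAL_HINTS = {"www", ".com", "homepage", "login"}
TRANSACTIONAL_HINTS = {"download", "buy", "price", "lyrics"}

def classify_intent(query: str) -> str:
    """Classify a Web query as navigational, transactional, or informational."""
    tokens = query.lower().split()
    if any(hint in tok for tok in tokens for hint in NAVIGATIONAL_HINTS):
        return "navigational"
    if any(hint in tok for tok in tokens for hint in TRANSACTIONAL_HINTS):
        return "transactional"
    # Default to the most common intent (>80% of queries in the study).
    return "informational"
```

Queries matching no attribute fall through to the informational default, which mirrors the study's finding that informational intent dominates.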

Relevance: 20.00%

Abstract:

This paper investigates the validity of a Gabor filter bank for feature extraction from solder joint images on Printed Circuit Boards (PCBs). A distance measure based on the Mahalanobis Cosine metric is also presented for classification of five different types of solder joints. In the experimental results, this methodology achieved high accuracy and generalised well, making it an effective method for reducing cost and improving quality in PCB manufacturing.
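A minimal sketch of a Gabor filter bank feature extractor, with illustrative kernel parameters (the paper's actual filter parameters are not given here). For brevity the distance uses a plain cosine metric; the Mahalanobis Cosine metric additionally whitens the features with a training-set covariance before taking the cosine.

```python
import numpy as np

def gabor_kernel(size=15, sigma=3.0, theta=0.0, lambd=6.0, gamma=0.5):
    """Real part of a 2-D Gabor kernel (illustrative parameter values)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr) ** 2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / lambd)

def gabor_features(image, thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    """Mean filter-response magnitude per orientation as a feature vector."""
    feats = []
    for theta in thetas:
        kern = gabor_kernel(theta=theta)
        # Filter in the frequency domain (circular convolution) for brevity.
        response = np.abs(np.fft.ifft2(np.fft.fft2(image) *
                                       np.fft.fft2(kern, image.shape)))
        feats.append(response.mean())
    return np.array(feats)

def cosine_distance(u, v):
    """1 - cosine similarity; a simplified stand-in for the
    Mahalanobis Cosine metric (covariance whitening omitted)."""
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
```

Classification then reduces to nearest-class-prototype matching under this distance, one prototype per solder joint type.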

Relevance: 20.00%

Abstract:

The word “queer” is a slippery one; its etymology is uncertain, and academic and popular usage attributes conflicting meanings to the word. By the mid-nineteenth century, “queer” was used as a pejorative term for a (male) homosexual. This negative connotation continues when it becomes a term for homophobic abuse. In recent years, “queer” has taken on additional uses: as an all-encompassing term for culturally marginalised sexualities – gay, lesbian, trans, bi, and intersex (“GLBTI”) – and as a theoretical strategy which deconstructs the binary oppositions that govern identity formation. Tracing its history, the Oxford English Dictionary notes that the earliest references to “queer” may have appeared in the sixteenth century. These early examples of queer carried negative connotations such as “vulgar,” “bad,” “worthless,” “strange,” or “odd,” and such associations continued until the mid-twentieth century. The early nineteenth century, and perhaps earlier, employed “queer” as a verb, meaning “to put out of order,” “to spoil,” or “to interfere with.” The adjectival form also began to emerge during this time to refer to a person’s condition as being “not normal” or “out of sorts,” or to cause a person “to feel queer,” meaning “to disconcert, perturb, unsettle.” According to Eve Sedgwick (1993), “the word ‘queer’ itself means across – it comes from the Indo-European root -twerkw, which also yields the German quer (traverse), Latin torquere (to twist), English athwart . . . it is relational and strange.” Despite the gaps in its lineage and changes in usage, meaning and grammatical form, “queer” as a political and theoretical strategy has benefited from its diverse origins. It refuses to settle comfortably into a single classification, preferring instead to traverse several categories that would otherwise attempt to stabilise notions of chromosomal sex, gender and sexuality.

Relevance: 20.00%

Abstract:

Purpose: Students with low vision may be disadvantaged when compared with their normally sighted peers, as they frequently work at very short working distances and need to use low vision devices. The aim of this study was to examine the sustained reading rates of students with low vision and compare them with their peers with normal vision. The effects of visual acuity, acuity reserve and age on reading rate were also examined. Method: Fifty-six students (10 to 16 years of age), 26 with low vision and 30 with normal vision, were required to read text continuously for 30 minutes. Their position in the text was recorded at two-minute intervals. Distance and near visual acuity, working distance, cause of low vision, reading rates and reading habits were recorded. Results: A total of 80.7 per cent of the students with low vision maintained a constant reading rate during the 30 minutes of reading, although they read at approximately half the rate (104 wpm) of their normally sighted peers (195 wpm). Only four of the low vision subjects could not complete the reading task. Reading rates increased significantly with acuity reserve and with distance and near visual acuity, but there was no significant relationship between age and sustained reading rate. Conclusions: The majority of students with low vision were able to maintain appropriate reading rates to cope in integrated educational settings. Surprisingly, only relatively few subjects (16 per cent) used their prescribed low vision devices, even though the average accommodative demand was 9 D, and they generally revealed a greater dislike of reading compared to students with normal vision.
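To see why the accommodative demand figure is so high: demand in diopters is the reciprocal of the working distance in meters, so a 9 D demand corresponds to reading at roughly 11 cm. A one-line helper makes the conversion explicit:

```python
def accommodative_demand(working_distance_cm: float) -> float:
    """Accommodative demand in diopters: D = 1 / distance(m) = 100 / distance(cm)."""
    return 100.0 / working_distance_cm

# A normal-vision reading distance of 25 cm demands only 4 D,
# versus ~9 D at the ~11 cm distances typical of the low vision group.
```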

Relevance: 20.00%

Abstract:

This study is the first to investigate the effect of prolonged reading on reading performance and visual functions in students with low vision. The study focuses on one of the most common modes of achieving adequate magnification for reading by students with low vision, their close reading distance (proximal or relative distance magnification). Close reading distances impose high demands on near visual functions, such as accommodation and convergence. Previous research on accommodation in children with low vision shows that their accommodative responses are reduced compared to normal vision. In addition, there is an increased lag of accommodation for higher stimulus levels as may occur at close reading distance. Reduced accommodative responses in low vision and higher lag of accommodation at close reading distances together could impact on reading performance of students with low vision especially during prolonged reading tasks. The presence of convergence anomalies could further affect reading performance. Therefore, the aims of the present study were 1) To investigate the effect of prolonged reading on reading performance in students with low vision 2) To investigate the effect of prolonged reading on visual functions in students with low vision. This study was conducted as cross-sectional research on 42 students with low vision and a comparison group of 20 students with normal vision, aged 7 to 20 years. The students with low vision had vision impairments arising from a range of causes and represented a typical group of students with low vision, with no significant developmental delays, attending school in Brisbane, Australia. All participants underwent a battery of clinical tests before and after a prolonged reading task. 
An initial reading-specific history and pre-task measurements, which included Bailey-Lovie distance and near visual acuities, Pelli-Robson contrast sensitivity, ocular deviations, sensory fusion, ocular motility, near point of accommodation (pull-away method), accuracy of accommodation (Monocular Estimation Method (MEM) retinoscopy) and Near Point of Convergence (NPC) (push-up method), were recorded for all participants. Reading performance measures were Maximum Oral Reading Rates (MORR), Near Text Visual Acuity (NTVA) and acuity reserves using Bailey-Lovie text charts. Symptoms of visual fatigue were assessed using the Convergence Insufficiency Symptom Survey (CISS) for all participants. Pre-task measurements of reading performance, accuracy of accommodation and NPC were compared with post-task measurements to test for any effects of prolonged reading. The prolonged reading task involved reading a storybook silently for at least 30 minutes. The task was controlled for print size, contrast, difficulty level and content of the reading material. Silent Reading Rate (SRR) was recorded every 2 minutes during prolonged reading. Symptom scores and visual fatigue scores were also obtained for all participants. A visual fatigue analogue scale (VAS) was used to assess visual fatigue during the task, once at the beginning, once at the middle and once at the end of the task. In addition to the subjective assessments of visual fatigue, tonic accommodation was monitored using a photorefractor (PlusoptiX CR03™) every 6 minutes during the task, as an objective assessment of visual fatigue. Reading measures were made at the habitual reading distance of students with low vision and at 25 cm for students with normal vision. The initial history showed that the students with low vision read for significantly shorter periods at home compared to the students with normal vision.
The working distances of participants with low vision ranged from 3 to 25 cm, and half of them were not using any optical devices for magnification. Nearly half of the participants with low vision were able to resolve 8-point print (1M) at 25 cm. Half of the participants in the low vision group had ocular deviations and suppression at near. Reading rates were significantly reduced in students with low vision compared to those of students with normal vision. In addition, a significantly larger number of participants in the low vision group could not sustain the 30-minute task compared to the normal vision group. However, there were no significant changes in reading rates during or following prolonged reading in either the low vision or normal vision groups. Individual changes in reading rates were independent of baseline reading rates, indicating that changes in reading rates during prolonged reading cannot be predicted from a typical clinical assessment of reading using brief reading tasks. Contrary to previous reports, the silent reading rates of the students with low vision were significantly lower than their oral reading rates, although oral and silent reading were assessed using different methods. Although visual acuity, contrast sensitivity, near point of convergence and accuracy of accommodation were significantly poorer for the low vision group compared to those of the normal vision group, there were no significant changes in any of these visual functions following prolonged reading in either group. Interestingly, a few students with low vision (n=10) were found to be reading at a distance closer than their near point of accommodation, suggesting a decreased sensitivity to blur.
Further evaluation revealed that the equivalent intrinsic refractive errors (an estimate of the spherical dioptric defocus which would be expected to yield a patient’s visual acuity in normal subjects) were significantly larger for the low vision group compared to those of the normal vision group. As expected, accommodative responses were significantly reduced for the low vision group compared to the expected norms, which is consistent with their close reading distances, reduced visual acuity and contrast sensitivity. For those in the low vision group who had an accommodative error exceeding their equivalent intrinsic refractive error, a significant decrease in MORR was found following prolonged reading. The silent reading rates, however, were not significantly affected by accommodative errors in the present study. Suppression also had a significant impact on the changes in reading rates during prolonged reading. The participants who did not have suppression at near showed significant decreases in silent reading rates during and following prolonged reading. This impact of binocular vision at near on prolonged reading was possibly due to the high demands on convergence. The significant predictors of MORR in the low vision group were age, NTVA, reading interest and reading comprehension, accounting for 61.7% of the variance in MORR. SRR was not significantly influenced by any factor except the duration of the reading task sustained; participants with higher reading rates were able to sustain a longer reading duration. In students with normal vision, age was the only predictor of MORR. Participants with low vision also reported significantly greater visual fatigue compared to the normal vision group. Measures of tonic accommodation, however, were little influenced by visual fatigue in the present study. Visual fatigue analogue scores were found to be significantly associated with reading rates in students with both low vision and normal vision.
However, the patterns of association between visual fatigue and reading rates were different for SRR and MORR. The participants with low vision with higher symptom scores had lower SRRs and participants with higher visual fatigue had lower MORRs. As hypothesized, visual functions such as accuracy of accommodation and convergence did have an impact on prolonged reading in students with low vision, for students whose accommodative errors were greater than their equivalent intrinsic refractive errors, and for those who did not suppress one eye. Those students with low vision who have accommodative errors higher than their equivalent intrinsic refractive errors might significantly benefit from reading glasses. Similarly, considering prisms or occlusion for those without suppression might reduce the convergence demands in these students while using their close reading distances. The impact of these prescriptions on reading rates, reading interest and visual fatigue is an area of promising future research. Most importantly, it is evident from the present study that a combination of factors such as accommodative errors, near point of convergence and suppression should be considered when prescribing reading devices for students with low vision. Considering these factors would also assist rehabilitation specialists in identifying those students who are likely to experience difficulty in prolonged reading, which is otherwise not reflected during typical clinical reading assessments.

Relevance: 20.00%

Abstract:

Objective: To assess the effect of graded increases in exercise-induced energy expenditure (EE) on appetite, energy intake (EI), total daily EE and body weight in men living in their normal environment and consuming their usual diets. Design: Within-subject, repeated measures design. Six men (mean (s.d.) age 31.0 (5.0) y; weight 75.1 (15.96) kg; height 1.79 (0.10) m; body mass index (BMI) 23.3 (2.4) kg/m²) were each studied three times during a 9 day protocol, corresponding to prescriptions of no exercise (control) (Nex; 0 MJ/day), medium exercise level (Mex; ~1.6 MJ/day) and high exercise level (Hex; ~3.2 MJ/day). On days 1-2 subjects were given a medium fat (MF) maintenance diet (1.6 × resting metabolic rate (RMR)). Measurements: On days 3-9 subjects self-recorded dietary intake using a food diary and self-weighed intake. EE was assessed by continual heart rate (HR) monitoring, using the modified FLEX method. Subjects' HR was individually calibrated against submaximal VO2 during incremental exercise tests at the beginning and end of each 9 day study period. Respiratory exchange was measured by indirect calorimetry. Subjects completed hourly hunger ratings during waking hours to record subjective sensations of hunger and appetite. Body weight was measured daily. Results: EE amounted to 11.7, 12.9 and 16.8 MJ/day (F(2,10)=48.26; P<0.001 (s.e.d.=0.55)) on the Nex, Mex and Hex treatments, respectively. The corresponding values for EI were 11.6, 11.8 and 11.8 MJ/day (F(2,10)=0.10; P=0.910 (s.e.d.=0.10)), respectively. There were no significant treatment effects on hunger, appetite or body weight, but there was evidence of weight loss on the Hex treatment. Conclusion: Increasing EE did not lead to compensation of EI over 7 days. However, total daily EE tended to decrease over time on the two exercise treatments. Lean men appear able to tolerate a considerable negative energy balance, induced by exercise, over 7 days without invoking compensatory increases in EI.
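The reported group means imply a substantial uncompensated deficit on the high-exercise treatment. A toy calculation using the values quoted in the abstract makes the magnitude explicit:

```python
def daily_energy_balance(intake_mj: float, expenditure_mj: float) -> float:
    """Energy balance in MJ/day: positive = surplus, negative = deficit."""
    return intake_mj - expenditure_mj

# (EI, EE) group means in MJ/day from the abstract.
treatments = {
    "Nex": (11.6, 11.7),  # no exercise (control)
    "Mex": (11.8, 12.9),  # medium exercise
    "Hex": (11.8, 16.8),  # high exercise
}
balances = {name: daily_energy_balance(*vals) for name, vals in treatments.items()}
# Hex shows a deficit of about 5 MJ/day with no compensatory rise in EI.
```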

Relevance: 20.00%

Abstract:

ROLE OF LOW AFFINITY β1-ADRENERGIC RECEPTOR IN NORMAL AND DISEASED HEARTS Background: The β1-adrenergic receptor (AR) has at least two binding sites, β1HAR and β1LAR (the high- and low-affinity sites of the β1AR, respectively), both of which cause cardiostimulation. Some β-blockers, for example (-)-pindolol and (-)-CGP 12177, can activate β1LAR at higher concentrations than those required to block β1HAR. While β1HAR can be blocked by all clinically used β-blockers, β1LAR is relatively resistant to blockade. Thus, chronic β1LAR activation may occur in the setting of β-blocker therapy, thereby mediating persistent βAR signaling, and it is important to determine the potential significance of β1LAR in vivo, particularly in disease settings. Methods and results: C57Bl/6 male mice were used. Chronic (4 weeks) β1LAR activation was achieved by treatment with (-)-CGP12177 via osmotic minipump. Cardiac function was assessed by echocardiography and catheterization. (-)-CGP12177 treatment in healthy mice increased heart rate and left ventricular (LV) contractility without detectable LV remodelling or hypertrophy. In mice subjected to an 8-week period of aorta banding, (-)-CGP12177 treatment given during weeks 4-8 led to a positive inotropic effect. (-)-CGP12177 treatment exacerbated LV remodelling, indicated by a worsening of LV hypertrophy by ??% (estimated by weight, wall thickness, cardiomyocyte size) and interstitial/perivascular fibrosis (by histology). Importantly, (-)-CGP12177 treatment of aorta-banded mice exacerbated cardiac expression of hypertrophic, fibrogenic and inflammatory genes (all p<0.05 vs. non-treated control with aorta banding). Conclusion: β1LAR activation provides functional support to the heart, in both normal and diseased (pressure overload) settings. Sustained β1LAR activation in the diseased heart exacerbates LV remodelling and therefore may promote disease progression from compensatory hypertrophy to heart failure.

Relevance: 20.00%

Abstract:

Objective. Previous studies have shown the influence of subchondral bone osteoblasts (SBOs) on phenotypical changes of articular cartilage chondrocytes (ACCs) during the development of osteoarthritis (OA). The molecular mechanisms involved during this process remain elusive, in particular, the signal transduction pathways. The aim of this study was to investigate the in vitro effects of OA SBOs on the phenotypical changes in normal ACCs and to unveil the potential involvement of MAPK signaling pathways during this process. Methods. Normal and arthritic cartilage and bone samples were collected for isolation of ACCs and SBOs. Direct and indirect coculture models were applied to study chondrocyte hypertrophy under the influence of OA SBOs. MAPKs in the regulation of the cell–cell interactions were monitored by phosphorylated antibodies and relevant inhibitors. Results. OA SBOs led to increased hypertrophic gene expression and matrix calcification in ACCs by means of both direct and indirect cell–cell interactions. In this study, we demonstrated for the first time that OA SBOs suppressed p38 phosphorylation and induced ERK-1/2 signal phosphorylation in cocultured ACCs. The ERK-1/2 pathway inhibitor PD98059 significantly attenuated the hypertrophic changes induced by conditioned medium from OA SBOs, and the p38 inhibitor SB203580 resulted in the up-regulation of hypertrophic genes in ACCs. Conclusion. The findings of this study suggest that the pathologic interaction of OA SBOs and ACCs is mediated via the activation of ERK-1/2 phosphorylation and deactivation of p38 phosphorylation, resulting in hypertrophic differentiation of ACCs.

Relevance: 20.00%

Abstract:

Executive summary. Objective: The aims of this study were to identify the impact of Pandemic (H1N1) 2009 Influenza on Australian Emergency Departments (EDs) and their staff, and to inform planning, preparedness, and response management arrangements for future pandemics, as well as managing infectious patients presenting to EDs in everyday practice. Methods: This study involved three elements: 1. The first element was an examination of published material, including published statistics. Standard literature research methods were used to identify relevant published articles. In addition, data about ED demand were obtained from Australian Government Department of Health and Ageing (DoHA) publications, with several state health departments providing more detailed data. 2. The second element was a survey of Directors of Emergency Medicine identified with the assistance of the Australasian College for Emergency Medicine (ACEM). This survey retrieved data about demand for ED services and elicited qualitative comments on the impact of the pandemic on ED management. 3. The third element was a survey of ED staff. A questionnaire was emailed to members of three professional colleges: the ACEM; the Australian College of Emergency Nursing (ACEN); and the College of Emergency Nursing Australasia (CENA). The overall response rate for the survey was 18.4%, with 618 usable responses from 3355 distributed questionnaires. Topics covered by the survey included ED conditions during the (H1N1) 2009 influenza pandemic; information received about Pandemic (H1N1) 2009 Influenza; pandemic plans; the impact of the pandemic on ED staff with respect to stress; illness prevention measures; support received from others in the work role; staff and others’ illness during the pandemic; other factors causing ED staff to miss work during the pandemic; and vaccination against Pandemic (H1N1) 2009 Influenza.
Both qualitative and quantitative data were collected and analysed. Results: The results obtained from Directors of Emergency Medicine quantifying the impact of the pandemic were too limited for interpretation. Data sourced from health departments and published sources demonstrated an increase in influenza-like illness (ILI) presentations of between one and a half and three times the normal level. Directors of Emergency Medicine reported a reasonable level of preparation for the pandemic, with most reporting the use of pandemic plans that translated into relatively effective operational infection control responses. Directors reported a highly significant impact on EDs and their staff from the pandemic. Growth in demand and related ED congestion were highly significant factors causing distress within the departments. Most (64%) respondents established a ‘flu clinic’, either as part of the ED operations or external to it. They did not note a significantly higher rate of sick leave than usual. Responses relating to the impact on staff were proportional to the size of the colleges. Most respondents felt strongly that Pandemic (H1N1) 2009 Influenza had a significant impact on demand in their ED, with most patients having low levels of clinical urgency. Most respondents felt that the pandemic had a negative impact on the care of other patients, and 94% revealed some increase in stress due to lack of space for patients, increased demand, and filling staff deficits. Levels of concern about themselves or their family members contracting the illness were less significant than expected. Nurses displayed significantly higher levels of stress overall, particularly in relation to skill-mix requirements, lack of supplies and equipment, and aggression from patients and patients’ families. More than one-third of respondents became ill with an ILI.
Whilst respondents themselves reported taking low levels of sick leave, they cited difficulties with replacing absent staff. Ranked from highest to lowest, respondents gained useful support from ED colleagues, ED administration, their hospital occupational health department, hospital administration, professional colleges, state health departments, and their unions. Respondents were generally positive about the information they received overall; however, the volume of information was considered excessive and sometimes inconsistent. The media was criticised as scaremongering and sensationalist, and as the cause of many unnecessary presentations to EDs. Of concern to the investigators was that a large proportion (43%) of respondents did not know whether a pandemic plan existed for their department or hospital. A small number of staff reported being redeployed from their usual workplace for personal risk factors or operational reasons. At the time of the survey (29 October – 18 December 2009), 26% of ED staff reported being vaccinated against Pandemic (H1N1) 2009 Influenza. Of those not vaccinated, half indicated they would ‘definitely’ or ‘probably’ not get vaccinated, with the main reasons being that the vaccine was ‘rushed into production’, ‘not properly tested’ or ‘came out too late’, or that it was not needed due to prior infection or exposure, or due to the mildness of the disease. Conclusion: Pandemic (H1N1) 2009 Influenza had a significant impact on Australian Emergency Departments. The pandemic exposed problems in existing plans, particularly a lack of guidelines, general information overload, and confusion due to the lack of a single authoritative information source. Of concern was the high proportion of respondents who did not know if their hospital or department had a pandemic plan. Nationally, the pandemic communication strategy needs a detailed review, with more engagement with media networks to encourage responsible and consistent reporting.
Also of concern was the low level of immunisation, and the low level of intention to accept vaccination. This is a problem seen in many previous studies relating to seasonal influenza and health care workers. The design of EDs needs to be addressed to better manage infectious patients. Significant workforce issues were confronted in this pandemic, including maintaining appropriate staffing levels; staff exposure to illness; access to, and appropriate use of, personal protective equipment (PPE); and the difficulties associated with working in PPE for prolonged periods. An administrative issue of note was the reporting requirement, which created considerable additional stress for staff within EDs. Peer and local support strategies helped ensure staff felt their needs were provided for, creating resilience, dependability, and stability in the ED workforce. Policies regarding the establishment of flu clinics need to be reviewed. The ability to create surge capacity within EDs by considering staffing, equipment, physical space, and stores is of primary importance for future pandemics.

Relevance: 20.00%

Abstract:

PURPOSE: To determine whether participants with normal visual acuity, no ophthalmoscopic signs of age-related maculopathy (ARM) in either eye, and who carry the CFH, LOC387715 and HTRA1 high-risk genotypes (“gene-positive”) have impaired rod- and cone-mediated mesopic visual function compared to persons who do not carry the risk genotypes (“gene-negative”).

METHODS: Fifty-three Caucasian study participants (mean age 55.8 ± 6.1 years) were genotyped for CFH, LOC387715/ARMS2 and HTRA1 polymorphisms. We genotyped single nucleotide polymorphisms (SNPs) in the CFH (rs380390), LOC387715/ARMS2 (rs10490924) and HTRA1 (rs11200638) genes using Applied Biosystems optimised TaqMan assays. We determined the critical fusion frequency (CFF) mediated by cones alone (Long, Middle and Short wavelength sensitive cones; LMS) and by the combined activities of cones and rods (LMSR). The stimuli were generated using a 4-primary photostimulator that provides independent control of photoreceptor excitation under mesopic light levels. Visual function was further assessed using standard clinical tests, flicker perimetry and microperimetry.

RESULTS: The mesopic CFF mediated by rods and cones (LMSR) was significantly reduced in gene-positive compared to gene-negative participants after correction for age (p=0.03). Cone-mediated CFF (LMS) was not significantly different between gene-positive and gene-negative participants. There were no significant associations between flicker perimetry or microperimetry and genotype.

CONCLUSIONS: This is the first study to relate ARM risk genotypes to mesopic visual function in clinically normal persons. These preliminary results could become clinically important, as mesopic vision may be used to document sub-clinical retinal changes in persons with risk genotypes and to determine whether those persons progress to manifest disease.
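The age correction applied to the CFF comparison can be sketched as a simple linear residualization; this is an assumption for illustration, since the study's actual statistical model is not specified in the abstract.

```python
import numpy as np

def age_corrected_difference(cff, age, gene_positive):
    """Mean CFF difference (gene-positive minus gene-negative)
    after removing a linear age trend fitted to all participants."""
    slope, intercept = np.polyfit(age, cff, 1)      # pooled linear age trend
    residual = cff - (slope * age + intercept)       # age-adjusted CFF values
    return residual[gene_positive].mean() - residual[~gene_positive].mean()
```

With age distributions balanced across groups, the residual difference recovers the genotype-related offset in CFF free of the shared age decline.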