862 results for Weights and measures, Arab.
Abstract:
K. Rasmani and Q. Shen. Modifying weighted fuzzy subsethood-based rule models with fuzzy quantifiers. Proceedings of the 13th International Conference on Fuzzy Systems, pages 1679-1684, 2004
Abstract:
Rubinstein, William, et al., The Jews in the Modern World: A History Since 1750 (London: Hodder and Arnold, 2002), pp. xiv+449. RAE2008
Abstract:
The aim of this study is to analyse the assumptions of the Republic of Poland's cyberspace protection policy as presented in the document Polityka ochrony cyberprzestrzeni Rzeczypospolitej Polskiej (Cyberspace Protection Policy of the Republic of Poland), published in 2013 by the Ministry of Administration and Digitization and the Internal Security Agency. The article analyses the postulates and guidelines set out there and confronts these assumptions with the elements of the Republic of Poland's cyberspace protection system. One must agree with the authors of this strategy that a state of complete ICT security is impossible to achieve; one can speak only of reaching a certain acceptable level of it. Achieving that goal should be substantially advanced by implementing the priorities of the cyberspace protection policy, in particular: defining the competences of the entities responsible for cyberspace security; creating and implementing a cyberspace security management system common to all government administration bodies, and establishing guidelines in this area for non-public entities; creating a durable system of coordination and information exchange between the entities responsible for cyberspace security and cyberspace users; and increasing cyberspace users' awareness of security methods and measures.
Abstract:
Many real-world image analysis problems, such as face recognition and hand pose estimation, involve recognizing a large number of classes of objects or shapes. Large margin methods, such as AdaBoost and Support Vector Machines (SVMs), often provide competitive accuracy rates, but at the cost of evaluating a large number of binary classifiers, thus making it difficult to apply such methods when thousands or millions of classes need to be recognized. This thesis proposes a filter-and-refine framework whereby, given a test pattern, a small number of candidate classes is identified efficiently at the filter step, and computationally expensive large margin classifiers are used to evaluate these candidates at the refine step. Two different filtering methods are proposed, ClassMap and OVA-VS (One-vs.-All classification using Vector Search). ClassMap is an embedding-based method that works for both boosted classifiers and SVMs and tends to map patterns and their associated classes close to each other in a vector space. OVA-VS maps OVA classifiers and test patterns to vectors based on the weights and outputs of the weak classifiers of the boosting scheme. At runtime, finding the strongest-responding OVA classifier becomes a classical vector search problem, where well-known methods can be used to gain efficiency. In our experiments, the proposed methods achieve significant speed-ups, in some cases up to two orders of magnitude, compared to exhaustive evaluation of all OVA classifiers. This was achieved in hand pose recognition and face recognition systems where the number of classes ranges from 535 to 48,600.
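The two-stage idea described above can be sketched generically. In this minimal sketch the `cheap_score` and `exact_score` functions are hypothetical stand-ins, not the thesis's ClassMap or OVA-VS embeddings:

```python
import heapq

def filter_and_refine(x, cheap_score, exact_score, classes, k=10):
    """Filter: rank all classes with a cheap score and keep the top k.
    Refine: evaluate the expensive classifier only on those k candidates."""
    candidates = heapq.nlargest(k, classes, key=lambda c: cheap_score(c, x))
    return max(candidates, key=lambda c: exact_score(c, x))
```

The speed-up comes from calling `exact_score` only k times instead of once per class; accuracy is preserved as long as the cheap score usually ranks the true class within the top k.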
Abstract:
A new family of neural network architectures is presented. This family of architectures solves the problem of constructing and training minimal neural network classification expert systems by using switching theory. The primary insight that leads to the use of switching theory is that the problem of minimizing the number of rules and the number of IF statements (antecedents) per rule in a neural network expert system can be recast into the problem of minimizing the number of digital gates and the number of connections between digital gates in a Very Large Scale Integrated (VLSI) circuit. The rules that the neural network generates to perform a task are readily extractable from the network's weights and topology. Analysis and simulations on the Mushroom database illustrate the system's performance.
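The neuron/gate correspondence the abstract relies on is easy to make concrete. The toy threshold unit below (an illustration, not code from the paper) shows how a rule can be read directly off weights and a threshold:

```python
def threshold_unit(inputs, weights, threshold):
    """Binary threshold neuron: fires iff the weighted input sum
    reaches the threshold -- the same abstraction as a digital gate."""
    return sum(w * x for w, x in zip(weights, inputs)) >= threshold

# With weights (1, 1) and threshold 2 the unit realises AND, i.e. the
# extractable rule "IF x1 AND x2 THEN fire"; with threshold 1 it
# realises OR ("IF x1 OR x2 THEN fire").
```

Minimizing the number of such units and their interconnections is then formally the same problem as minimizing gates and wires in a VLSI circuit, which is the recasting the abstract describes.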
Abstract:
Introduction: There is accumulating evidence of an increased risk of cardiovascular morbidity and mortality in rheumatoid arthritis patients. A combination of both traditional cardiovascular risks and rheumatoid-specific factors appears to be responsible for driving this phenomenon. Rheumatoid arthritis has been an orphan of cardiologists in the past, and rheumatologists themselves are not good at CVD screening. Identifying the extent of preclinical atherosclerosis in RA patients will help us to appreciate the magnitude of this serious problem in an Irish population. Methods: We undertook a cross-sectional study of 63 RA patients and 48 OA controls and compared the two groups with respect to 1) traditional CV risk factors, 2) serum biomarkers of inflammation, including CRP, TNFα, IL6 and PAI-1, 3) carotid intima-media thickness (cIMT), carotid plaque and ankle-brachial index (ABI) as markers of pre-clinical atherosclerosis, 4) biochemical and ultrasonic measures of endothelial dysfunction and 5) serum and echocardiographic measures of diastolic dysfunction. Within the RA group, we also investigated associations between markers of inflammation, subclinical atherosclerosis and diastolic dysfunction. Results: The prevalence of traditional CV risks was similar in the RA and OA groups. A number of biomarkers of inflammation were significantly higher in the RA group: CRP, fibrinogen, IL-2, -4, -6 and TNFα. PAI-1, a marker of thrombosis, correlated with disease activity and subclinical atherosclerosis in RA patients. With regard to subclinical atherosclerosis measures, RA patients had a significantly lower ABI than OA patients. Carotid plaque and cIMT readings were similar in RA and OA patients. Assessment of endothelial function revealed that RA patients had significantly higher concentrations of adhesion molecules, in particular sero-positive RA patients and RA smokers. Adhesion molecule concentrations were associated with markers of diastolic dysfunction in RA.
Urine PCR, another marker of endothelial dysfunction, also correlated with diastolic dysfunction in RA. Assessment of endothelial function with flow-mediated dilatation (FMD) found no difference between the RA and OA groups. Disease activity scores in RA patients were associated with endothelial dysfunction, as assessed by FMD. Conclusions: We did not find significant differences in measures of subclinical atherosclerosis, flow-mediated dilatation or diastolic function between RA and OA patients. This is most likely due in part to increasing evidence that OA has an inflammatory component to its pathogenesis and is associated with metabolic syndrome and increased CV risk. We reported a significant association between urinary PCR and measures of diastolic dysfunction. Urinary PCR may be a useful screening tool for diastolic dysfunction in RA. The association between RA disease activity and measures of vascular function supports the theory that the excess cardiovascular burden in RA is linked to uncontrolled inflammation.
Abstract:
We demonstrate that when the future path of the discount rate is uncertain and highly correlated, the distant future should be discounted at significantly lower rates than suggested by the current rate. We then use two centuries of US interest rate data to quantify this effect. Using both random walk and mean-reverting models, we compute the "certainty-equivalent rate" that summarizes the effect of uncertainty and measures the appropriate forward rate of discount in the future. Under the random walk model we find that the certainty-equivalent rate falls continuously from 4% to 2% after 100 years, 1% after 200 years, and 0.5% after 300 years. At horizons of 400 years, the discounted value increases by a factor of over 40,000 relative to conventional discounting. Applied to climate change mitigation, we find that incorporating discount rate uncertainty almost doubles the expected present value of mitigation benefits. © 2003 Elsevier Science (USA). All rights reserved.
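The declining certainty-equivalent rate can be reproduced in a toy Monte Carlo simulation. The sketch below uses illustrative random-walk parameters (initial rate 4%, step volatility 0.2% per year), not the paper's fitted values:

```python
import math
import random

def certainty_equivalent_rate(horizon, r0=0.04, sigma=0.002,
                              n_paths=5000, seed=1):
    """Certainty-equivalent discount rate R(T) solving
    exp(-R*T) = E[exp(-sum of simulated short rates)],
    where the short rate follows a random walk."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        r, cum = r0, 0.0
        for _ in range(horizon):
            cum += r                      # accumulate the discount exponent
            r += rng.gauss(0.0, sigma)    # random-walk step in the rate
        total += math.exp(-cum)           # discount factor along this path
    return -math.log(total / n_paths) / horizon
```

Because discount *factors*, not rates, are averaged, Jensen's inequality lets the low-rate paths dominate at long horizons, so R(T) falls below the initial 4% and keeps declining as the horizon grows, which is the effect the abstract quantifies.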
Abstract:
The Perceived Health Competence Scale (PHCS) is a measure of self-efficacy regarding general health-related behaviour. This brief paper examines the psychometric properties of the PHCS in a UK context. Questionnaires containing the PHCS, the SF-36 and questions about perceived health needs were posted to 486 patients randomly selected from a GP practice list. Complete questionnaires were returned by 320 patients. Analyses of these responses provide strong evidence for the validity of the PHCS in this setting. Consequently, we conclude that the PHCS is a useful addition to measures of global self-efficacy and measures of self-efficacy regarding specific behaviours in the toolkit of health psychologists. This range of self-efficacy assessment tools will ensure that psychologists can match the level of specificity of the measure of expectancy beliefs to the level of specificity of the outcome of interest.
Abstract:
This study tested the psychometric properties of a questionnaire measuring sources of distress and eustress, or good stress, in nursing students. The Transactional model of stress construes stress in these different ways and is frequently used to understand sources of stress, coping and stress responses. Limited research has attempted to measure sources of distress and eustress, i.e. sources that can potentially enhance performance and well-being. A volunteer sample of final-year nursing students (n = 120) was surveyed in the United Kingdom in 2007. The questionnaire measured sources of stress, and measures of psychological well-being were taken to test construct validity. Validity was tested in two ways. First, an exploratory factor analysis reduced the questionnaire from 49 to 29 items and suggested three factors: learning and teaching, placement-related, and course organization. Second, the assumptions of the Transactional model, on which the questionnaire was based, were tested. In line with these assumptions, measures of distress related to adverse well-being, and measures of eustress related to healthier well-being responses. The test–retest reliability estimate was 0.8. While certain programme issues were associated with distress, placement-related experiences were the most important source of eustress.
Abstract:
Introduction: Centenarians are reservoirs of genetic and environmental information on successful ageing, and local centenarian groups may help us to understand some of the factors that contribute to longevity. The current centenarian cohort in Belfast survived the 1970s epidemic of death from coronary heart disease in Northern Ireland, when cardiovascular mortality there was almost the highest in the world. These centenarians provided an opportunity to assess biological and genetic factors important in cardiovascular risk and ageing. Methods: Thirty-five (27 female, 8 male) centenarians, participants in the Belfast Elderly Longitudinal Free-living Ageing STudy (BELFAST), were community-living and of good cognition at enrolment. Results: Centenarians showed a median Body Mass Index (BMI) of 25.7, systolic blood pressure of 140 mmHg, diastolic blood pressure of 90 mmHg and fasting glucose of 5.54 mmol/l, with no sex-related difference. Lipoproteins showed median cholesterol 5.3, High Density Lipoprotein (HDL) 1.10 and Low Density Lipoprotein (LDL) 3.47 mmol/l respectively. Centenarian smokers showed no different blood pressure or lipid measurements compared with non-smokers. Malondialdehyde, a measure of lipid peroxidation, was low at 1.19 µmol/l, and measures of antioxidant status were varied. Male centenarians did not carry any of the vascular risk genotypes studied (ApoE4 for Apolipoprotein E (ApoE), DD for Angiotensin Converting Enzyme (ACE) and tt for 5,10-methylenetetrahydrofolate reductase (MTHFR)), though this was not true for female centenarians. Conclusions: This small local study shows that Belfast centenarians carry a reasonably favourable risk profile, except for age, with respect to cardiovascular disease. There is also some evidence that vascular risk factors and genotypes may be tolerated differently between the male and female centenarians.
Maintaining a favourable cardiovascular risk profile seems likely to improve the chance of becoming a centenarian, especially for males.
Abstract:
Predictable and controlled degradation is not only central to the accurate delivery of bioactive agents and drugs, but also plays a vital role in key aspects of bone tissue engineering. The work addressed in this paper investigates the use of e-beam irradiation to achieve a controlled (surface) degradation profile. This study focuses on the modification of commercially and clinically relevant materials, namely poly(L-lactic acid) (PLLA), poly(L-lactide-hydroxyapatite) (PLLA-HA), poly(L-lactide-glycolide) co-polymer (PLG) and poly(L-lactide-DL-lactide) co-polymer (PLDL). Samples were subjected to irradiation treatments using a 0.5 MeV electron beam with delivered surface doses of 150 and 500 kGy. In addition, an acrylic attenuation shield was used for selected samples to control the penetration of the e-beam. E-beam irradiation induced chain scission in all polymers, as characterized by reduced molecular weights and glass transition temperatures (T-g). Irradiation not only produced changes in the physical properties of the polymers but also had associated effects on surface erosion of the materials during hydrolytic degradation. Moreover, the extent of both mechanical and hydrolytic degradation corresponded to the estimated penetration of the beam (as controlled by the use of an attenuation shield). (C) 2010 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Abstract:
Evidence for acquiescence (yea-saying) in interviews with people who have mental retardation is reviewed, and the different ways it has been assessed are discussed. We argue that acquiescence is caused by many factors, each of which is detected differentially by these methods. Evidence on the likely causes of acquiescence is reviewed, and we suggest that although researchers often stress a desire to please or increased submissiveness as the most important factor, acquiescence should also be seen as a response to questions that are too complex, either grammatically or in the type of judgments they request. Strategies to reduce acquiescence in interviews are reviewed, and measures that can be taken to increase the inclusiveness of interviews and self-report scales in this population are suggested.
Abstract:
Obestatin (OB(1-23)) is a 23 amino acid peptide encoded on the preproghrelin gene, originally reported to have metabolic actions related to food intake, gastric emptying and body weight. The biological instability of OB(1-23) has recently been highlighted by studies demonstrating its rapid enzymatic cleavage in a number of biological matrices. We assessed the stability of both OB(1-23) and an N-terminally PEGylated analogue (PEG-OB(1-23)) before conducting chronic in vivo studies. Peptides were incubated in rat liver homogenate and degradation monitored by LC-MS. PEG-OB(1-23) was approximately three times more stable than OB(1-23). Following a 14 day infusion of Sprague Dawley rats with 50 mol/kg/day of OB(1-23) or the N-terminally PEGylated analogue (PEG-OB(1-23)), we found no changes in food/fluid intake, body weight and plasma glucose or cholesterol between groups. Furthermore, morphometric liver, muscle and white adipose tissue (WAT) weights and tissue triglyceride concentrations remained unaltered between groups. However, with stabilised PEG-OB(1-23) we observed a 40% reduction in plasma triglycerides. These findings indicate that PEG-OB(1-23) is an OB(1-23) analogue with significantly enhanced stability, and suggest that obestatin could play a role in modulating physiological lipid metabolism, although it does not appear to be involved in regulation of food/fluid intake, body weight or fat deposition.
Abstract:
The quick, easy way to master all the statistics you'll ever need. The bad news first: if you want a psychology degree you'll need to know statistics. Now for the good news: Psychology Statistics For Dummies. Featuring jargon-free explanations, step-by-step instructions and dozens of real-life examples, Psychology Statistics For Dummies makes the knotty world of statistics a lot less baffling. Rather than padding the text with concepts and procedures irrelevant to the task, the authors focus only on the statistics psychology students need to know. As an alternative to typical, lead-heavy statistics texts or supplements to assigned course reading, this is one book psychology students won't want to be without.
- Ease into statistics: start out with an introduction to how statistics are used by psychologists, including the types of variables they use and how they measure them
- Get your feet wet: quickly learn the basics of descriptive statistics, such as central tendency and measures of dispersion, along with common ways of graphically depicting information
- Meet your new best friend: learn the ins and outs of SPSS, the most popular statistics software package among psychology students, including how to input, manipulate and analyse data
- Analyse this: get up to speed on statistical analysis core concepts, such as probability and inference, hypothesis testing, distributions, Z-scores and effect sizes
- Correlate that: get the lowdown on common procedures for defining relationships between variables, including linear regressions, associations between categorical data and more
- Analyse by inference: master key methods in inferential statistics, including techniques for analysing independent groups designs and repeated-measures research designs
Open the book and find:
- Ways to describe statistical data
- How to use SPSS statistical software
- Probability theory and statistical inference
- Descriptive statistics basics
- How to test hypotheses
- Correlations and other relationships between variables
- Core concepts in statistical analysis for psychology
- Analysing research designs
Learn to:
- Use SPSS to analyse data
- Master statistical methods and procedures using psychology-based explanations and examples
- Create better reports
- Identify key concepts and pass your course
Abstract:
Biodiversity is not a commodity, nor a service (ecosystem or otherwise); it is a scientific measure of the complexity of a biological system. Rather than directly valuing biodiversity, economists have tended to value its services, more often the services of 'key' species. This is understandable given the confusion of definitions and measures of biodiversity, but weakly justified if biodiversity is not substitutable. We provide a quantitative and comprehensive definition of biodiversity and propose a framework for examining its substitutability as the first step towards valuation. We define biodiversity as a measure of semiotic information. It is equated with biocomplexity and measured by Algorithmic Information Content (AIC). We argue that the potentially valuable component of this is functional information content (FIC), which determines biological fitness and supports ecosystem services. Inspired by recent extensions to the Noah's Ark problem, we show how FIC/AIC can be calculated to measure the degree of substitutability within an ecological community. From this, we derive a way to rank whole communities by Indirect Use Value, through quantifying the relation between system complexity and production rate of ecosystem services. Understanding biodiversity as information evidently serves as a practical interface between economics and ecological science.
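Algorithmic Information Content is uncomputable exactly, but compressed length gives a standard upper-bound proxy for it. The toy sketch below (a zlib-based stand-in, not the paper's FIC/AIC calculation) shows how a more diverse community description carries more algorithmic information than a monoculture:

```python
import zlib

def aic_estimate(description: str) -> int:
    """Crude upper bound on Algorithmic Information Content: the byte
    length of the compressed description. Compressed length is a
    standard, if rough, proxy for Kolmogorov complexity."""
    return len(zlib.compress(description.encode("utf-8"), 9))
```

A monoculture listed fifty times compresses to almost nothing, while fifty distinct species names do not; on this proxy the diverse community has the higher complexity, matching the intuition that biodiversity measures system complexity rather than abundance.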