Abstract:
In this paper, a model is presented that describes the pressure drop of gas-liquid Taylor flow in round capillaries with a channel diameter typically less than 1 mm. The analysis of Bretherton (J Fluid Mech 10:166-188, 1961) for the pressure drop over a single gas bubble for vanishing liquid film thickness is extended to include a non-negligible liquid film thickness using the analysis of Aussillous and Quéré (Phys Fluids 12(10):2367-2371, 2000). This result is combined with the Hagen-Poiseuille equation for liquid flow using a mass balance-based Taylor flow model previously developed by the authors (Warnier et al. in Chem Eng J 135S:S153-S158, 2007). The model presented in this paper includes the effect of the liquid slug length on the pressure drop similar to the model of Kreutzer et al. (AIChE J 51(9):2428-2440, 2005). Additionally, the gas bubble velocity is taken into account, thereby increasing the accuracy of the pressure drop predictions compared to those of the model of Kreutzer et al. Experimental data were obtained for nitrogen-water Taylor flow in a round glass channel with an inner diameter of 250 µm. The capillary number Ca_gl varied between 2.3 × 10^-3 and 8.8 × 10^-3 and the Reynolds number Re_gl varied between 41 and 159. The presented model describes the experimental results with an accuracy of ±4% of the measured values.
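The structure of such a unit-cell pressure drop model can be sketched as a Hagen-Poiseuille term for the liquid slugs plus a Bretherton-type interface term per bubble. This is an illustrative sketch only, not the authors' model: the function name is invented, and applying Bretherton's thin-film coefficient 7.16(3Ca)^(2/3) directly to finite film thickness is an assumption (the paper's contribution is precisely to correct for that).

```python
import math

def taylor_flow_dp(mu, sigma, d, L_liquid, u_bubble, u_liquid, n_bubbles):
    """Sketch of a Taylor-flow pressure drop: Hagen-Poiseuille over the
    total liquid length plus a Bretherton-type term per gas bubble.
    The 7.16*(3*Ca)**(2/3) prefactor is Bretherton's vanishing-film
    result, used here as a stand-in; the paper extends it to finite films."""
    ca = mu * u_bubble / sigma                          # capillary number based on bubble velocity
    dp_slugs = 32.0 * mu * u_liquid * L_liquid / d**2   # Hagen-Poiseuille, circular channel
    dp_bubbles = n_bubbles * 7.16 * (3.0 * ca) ** (2.0 / 3.0) * sigma / d
    return dp_slugs + dp_bubbles
```

With water-like properties (mu = 1e-3 Pa s, sigma = 0.072 N/m) and a bubble velocity of 0.3 m/s in a 250 µm channel, the capillary number is about 4.2 × 10^-3, inside the experimental range reported above.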
Abstract:
Many different immunochemical platforms exist for the screening of naturally occurring contaminants in food, from low-cost enzyme-linked immunosorbent assays (ELISA) to expensive instruments such as optical biosensors based on the phenomenon of surface plasmon resonance (SPR). The primary aim of this study was to evaluate and compare a number of these platforms to assess their accuracy and precision when applied to naturally contaminated samples containing HT-2/T-2 mycotoxins. Other important factors considered were the speed of analysis, ease of use (sample preparation techniques and use of the equipment) and ultimately the cost implications. The three screening procedures compared were an SPR biosensor assay, a commercially available ELISA and an enzyme-linked immunomagnetic electrochemical array (ELIME array). The qualitative data for all methods demonstrated very good overall agreements with each other; however, on comparison with mass spectrometry confirmatory results, the ELISA and SPR assay performed slightly better than the ELIME array, exhibiting an overall agreement of 95.8% compared to 91.7%. Currently, SPR is more costly than the other two platforms and can only be used in the laboratory, whereas in theory both the ELISA and ELIME array are portable and can be used in the field, but ultimately this is dependent on the sample preparation techniques employed. Sample preparation techniques varied across the methods evaluated: the ELISA was the simplest to perform, followed by the SPR method. The ELIME array involved an additional clean-up step, thereby increasing both the time and cost of analysis. Therefore, in the current format, field use would not be an option for the ELIME array. In relation to speed of analysis, the ELISA outperformed the other methods.
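The overall-agreement figures quoted above are simple percent agreement between each screening method's calls and the mass spectrometry confirmatory calls. A minimal sketch (the function name is invented; the sample count is not stated in the abstract, though 23/24 and 22/24 would reproduce 95.8% and 91.7%):

```python
def overall_agreement(screen_calls, confirm_calls):
    """Percent of samples where the screening call (e.g. positive/negative)
    matches the confirmatory mass spectrometry call."""
    assert len(screen_calls) == len(confirm_calls)
    matches = sum(s == c for s, c in zip(screen_calls, confirm_calls))
    return 100.0 * matches / len(screen_calls)
```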
Abstract:
As a diagnostic of high-intensity laser interactions (> 10¹⁹ W cm⁻²), the detection of radioactive isotopes is regularly used for the characterization of proton, neutron, ion, and photon beams. This involves sample removal from the interaction chamber and time-consuming post-shot analysis using NaI coincidence counting or Ge detectors. This letter describes the use of in situ detectors to measure laser-driven (p,n) reactions in Al-27 as an almost real-time diagnostic for proton acceleration. The produced Si-27 isotope decays with a 4.16 s half-life, predominantly by β+ emission, producing a strong 511 keV annihilation peak. (c) 2006 American Institute of Physics.
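The short 4.16 s half-life is what makes an almost real-time diagnostic feasible: the activity decays within seconds of the shot. The standard decay relations can be sketched as follows (illustrative only; the constant is the half-life quoted above):

```python
import math

SI27_HALF_LIFE_S = 4.16  # Si-27 half-life from the text, in seconds

def remaining_fraction(t_seconds):
    """Fraction of Si-27 nuclei remaining after t seconds: N(t)/N0 = 2^(-t/T_half)."""
    return 2.0 ** (-t_seconds / SI27_HALF_LIFE_S)

def decay_constant():
    """Decay constant lambda = ln(2) / T_half, in s^-1."""
    return math.log(2.0) / SI27_HALF_LIFE_S
```

After one half-life exactly half the nuclei remain; after roughly 30 s (about seven half-lives) less than 1% of the initial activity is left, so counting must start promptly.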
Abstract:
Since the late 1980s, there has been a significant and progressive movement away from traditional Public Administration (PA) systems, in favour of New Public Management (NPM)-type accounting tools and ideas inspired by the private sector. More recently, a new focus on governance systems, under the banner Public Governance (PG), has emerged. In this paper it is argued that reforms are not isolated events, but are embedded in more global discourses of modernisation and influenced by the institutional pressures present in a certain field at certain points in time. Using extensive document analysis in three countries with different administrative regimes (the UK, Italy and Austria), we examine public sector accounting and budgeting reforms and the underlying discourses put forward in order to support the change. We investigate the extent to which the actual content of the reforms and the discourses they are embedded within are connected over time; that is, whether, and to what degree, the reform “talk” matches the “decisions”. The research shows that in both the UK and Italy there is consistency between the debates and the decided changes, although the dominant discourse in each country differs, while in Austria changes are decided gradually, and only after they have been announced well in advance in the political debate. We find that in all three countries the new ideas and concepts layer and sediment above the existing ones, rather than replacing them. Although all three countries underwent similar accounting and budgeting reforms and relied on similar institutional discourses, each made its own specific translation of the ideas and concepts and is characterised by a specific formation of sedimentations. In addition, the findings suggest that, at present in the three countries, the PG discourse is used to supplement, rather than supplant, other prevailing discourses.
Abstract:
N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. A key issue with N-gram analysis is feature selection amidst the explosion of features that occurs when N is increased. The experiments within this paper represent programs as operational code (opcode) density histograms obtained through dynamic analysis. A support vector machine is used to create a reference model, which is used to evaluate two methods of feature reduction: 'area of intersect' and 'subspace analysis using eigenvectors'. The findings show that the relationships between features are complex, and simple statistical filtering does not provide a viable approach. However, eigenvector subspace analysis produces a suitable filter.
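The core of an eigenvector subspace filter is to find the dominant eigenvectors of the feature covariance matrix and rank opcode features by their loadings. A minimal stdlib-only sketch using power iteration (this is a generic illustration of the idea, not the paper's specific filter; all names are invented):

```python
import math

def covariance(rows):
    """Sample covariance matrix of a list of feature vectors (rows)."""
    n, d = len(rows), len(rows[0])
    mean = [sum(r[i] for r in rows) / n for i in range(d)]
    return [[sum((r[i] - mean[i]) * (r[j] - mean[j]) for r in rows) / (n - 1)
             for j in range(d)] for i in range(d)]

def leading_eigenvector(cov, iters=200):
    """Power iteration: unit-norm dominant eigenvector of a symmetric matrix.
    Feature loadings on this vector can be used to rank/filter features."""
    n = len(cov)
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v
```

For strongly correlated feature pairs the dominant eigenvector loads them almost equally, which is exactly the kind of inter-feature relationship that simple per-feature statistics miss.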
Abstract:
In 1997 the Irish government adopted the National Anti-Poverty Strategy (NAPS), a global target for the reduction of poverty that illuminates a range of issues relating to official poverty targets. The Irish target is framed in terms of a relative poverty measure incorporating both relative income and direct measures of deprivation, based on data on the extent of poverty from 1994. Since 1994 Ireland has experienced an unprecedented period of economic growth that makes it particularly important to assess whether the target has been achieved, but in doing so we cannot avoid asking some underlying questions about how poverty should be measured and monitored over time. After briefly outlining the nature of the NAPS measure, this article examines trends in poverty in Ireland between 1987 and 1997. Results show that the relative income and deprivation components of the NAPS measure reveal differential trends, with increasing relative income poverty but decreasing deprivation. However, this differential could be due to the fact that the direct measures of deprivation upon which NAPS is based have not been updated to take account of changes in real living standards and increasing expectations. To test whether this is so, we examine the extent to which expectations about living standards and the structure of deprivation have changed over time using confirmatory factor analysis and tests of criterion validity using different definitions of deprivation. Results show that the combined income and deprivation measure, as originally constituted, continues to identify a set of households experiencing generalised deprivation resulting from a lack of resources.
Abstract:
Potentially toxic elements (PTEs) including nickel and chromium are often present in soils overlying basalt at concentrations above regulatory guidance values due to the presence of these elements in the underlying geology. Oral bioaccessibility testing allows the risk posed by PTEs to human health to be assessed; however, bioaccessibility is controlled by factors including mineralogy, particle size, solid-phase speciation and encapsulation. X-ray diffraction was used to characterise the mineralogy of 12 soil samples overlying Palaeogene basalt lavas in Northern Ireland, and non-specific sequential extraction coupled with chemometric analysis was used to determine the distribution of elements amongst soil components in 3 of these samples. The data obtained were related to total concentration and oral bioaccessible concentration to determine whether a relationship exists between the overall concentrations of PTEs, their bioaccessibility and the soils' mineralogy and geochemistry. Gastric phase bioaccessible fraction (BAF %) ranged from 0.4 to 5.4 % for chromium in soils overlying basalt, and bioaccessible and total chromium concentrations are positively correlated. In contrast, the range of gastric phase BAF for nickel was greater (1.4–43.8 %), while no significant correlation was observed between bioaccessible and total nickel concentrations. However, nickel BAF was inversely correlated with total concentration. Solid-phase fractionation information showed that bioaccessible nickel was associated with calcium carbonate, aluminium oxide, iron oxide and clay-related components, while bioaccessible chromium was associated with clay-related components. This suggests that weathering significantly affects nickel bioaccessibility, but does not have the same effect on the bioaccessibility of chromium.
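The two quantities compared throughout the study, the bioaccessible fraction and the correlation between bioaccessible and total concentrations, can be sketched as follows (illustrative helper functions with invented names, using the standard Pearson formula):

```python
import math

def baf_percent(bioaccessible_mg_kg, total_mg_kg):
    """Bioaccessible fraction: bioaccessible concentration as % of total."""
    return 100.0 * bioaccessible_mg_kg / total_mg_kg

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A positive correlation (as for chromium) means bioaccessible concentration tracks total concentration; the inverse correlation of nickel BAF with total concentration is the signature of the weathering effect discussed above.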
Abstract:
Background: Angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin II receptor blockers (ARBs) are commonly prescribed to the growing number of cancer patients (more than two million in the UK alone) often to treat hypertension. However, increased fatal cancer in ARB users in a randomized trial and increased breast cancer recurrence rates in ACEI users in a recent observational study have raised concerns about their safety in cancer patients. We investigated whether ACEI or ARB use after breast, colorectal or prostate cancer diagnosis was associated with increased risk of cancer-specific mortality.
Methods: Population-based cohorts of 9,814 breast, 4,762 colorectal and 6,339 prostate cancer patients newly diagnosed from 1998 to 2006 were identified in the UK Clinical Practice Research Datalink and confirmed by cancer registry linkage. Cancer-specific and all-cause mortality were identified from Office of National Statistics mortality data in 2011 (allowing up to 13 years of follow-up). A nested case–control analysis was conducted to compare ACEI/ARB use (from general practitioner prescription records) in cancer patients dying from cancer with up to five controls (not dying from cancer). Conditional logistic regression estimated the risk of cancer-specific, and all-cause, death in ACEI/ARB users compared with non-users.
Results: The main analysis included 1,435 breast, 1,511 colorectal and 1,184 prostate cancer-specific deaths (and 7,106 breast, 7,291 colorectal and 5,849 prostate cancer controls). There was no increase in cancer-specific mortality in patients using ARBs after diagnosis of breast (adjusted odds ratio (OR) = 1.06 95% confidence interval (CI) 0.84, 1.35), colorectal (adjusted OR = 0.82 95% CI 0.64, 1.07) or prostate cancer (adjusted OR = 0.79 95% CI 0.61, 1.03). There was also no evidence of increases in cancer-specific mortality with ACEI use for breast (adjusted OR = 1.06 95% CI 0.89, 1.27), colorectal (adjusted OR = 0.78 95% CI 0.66, 0.92) or prostate cancer (adjusted OR = 0.78 95% CI 0.66, 0.92).
Conclusions: Overall, we found no evidence of increased risks of cancer-specific mortality in breast, colorectal or prostate cancer patients who used ACEI or ARBs after diagnosis. These results provide some reassurance that these medications are safe in patients diagnosed with these cancers.
Keywords: Colorectal cancer; Breast cancer; Prostate cancer; Mortality; Angiotensin-converting enzyme inhibitors and angiotensin II receptor blockers
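The odds ratios reported above come from conditional logistic regression on the matched case-control sets. As a simpler illustration of the quantity being estimated, a crude 2×2 odds ratio with a Woolf 95% confidence interval can be computed as follows (a sketch only; the study's matched, covariate-adjusted analysis is more involved):

```python
import math

def odds_ratio_ci(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Crude odds ratio from a 2x2 table with a Woolf (log-based) 95% CI.
    Unlike conditional logistic regression, this ignores matching and covariates."""
    a, b, c, d = exposed_cases, unexposed_cases, exposed_controls, unexposed_controls
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lo, hi
```

An OR near 1 with a CI spanning 1 (as found for ARBs across all three cancers) indicates no detectable association between exposure and cancer-specific death.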
Abstract:
A significant number of proteins in both eukaryotes and prokaryotes are known to be post-translationally modified by the addition of phosphate, serving as a means of rapidly regulating protein function. Phosphorylation of the amino acids serine, threonine and tyrosine is the focus of the vast majority of studies aimed at elucidating the extent and roles of such modification, yet other amino acids, including histidine and aspartate, are also phosphorylated. Although histidine phosphorylation is known to play extensive roles in signalling in prokaryotes, plants and fungi, roles for phosphohistidine are poorly defined in higher eukaryotes. Characterization of histidine phosphorylation aimed at elucidating such information is problematic due to the acid-labile nature of the phosphoramidate bond, essential for many of its biological functions. Although MS-based strategies have proven extremely useful in the analysis of other types of phosphorylated peptides, the chromatographic procedures essential for such approaches promote rapid hydrolysis of phosphohistidine-containing peptides. Phosphate transfer to non-biologically relevant aspartate residues during MS analysis further complicates the scenario. © 2013 Biochemical Society.
Abstract:
Obesity has been posited as an independent risk factor for diabetic kidney disease (DKD), but establishing causality from observational data is problematic. We aimed to test whether obesity is causally related to DKD using Mendelian randomization, which exploits the random assortment of genes during meiosis. In 6,049 subjects with type 1 diabetes, we used a weighted genetic risk score (GRS) comprising 32 validated BMI loci as an instrument to test the relationship of BMI with macroalbuminuria, end-stage renal disease (ESRD), or DKD defined as presence of macroalbuminuria or ESRD. We compared these results with cross-sectional and longitudinal observational associations. Longitudinal analysis demonstrated a U-shaped relationship of BMI with development of macroalbuminuria, ESRD, or DKD over time. Cross-sectional observational analysis showed no association with overall DKD, higher odds of macroalbuminuria (for every 1 kg/m(2) higher BMI, odds ratio [OR] 1.05, 95% CI 1.03-1.07, P < 0.001), and lower odds of ESRD (OR 0.95, 95% CI 0.93-0.97, P < 0.001). Mendelian randomization analysis showed a 1 kg/m(2) higher BMI conferring an increased risk of macroalbuminuria (OR 1.28, 95% CI 1.11-1.45, P = 0.001), ESRD (OR 1.43, 95% CI 1.20-1.72, P < 0.001), and DKD (OR 1.33, 95% CI 1.17-1.51, P < 0.001). Our results provide genetic evidence for a causal link between obesity and DKD in type 1 diabetes. As obesity prevalence rises, this finding predicts an increase in DKD prevalence unless intervention occurs.
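The instrument described above, a weighted GRS, is simply the sum of each subject's risk-allele counts weighted by published per-SNP effect sizes, and in one-sample Mendelian randomization the causal effect is often estimated as the ratio of the instrument-outcome to instrument-exposure associations (the Wald ratio). A hedged sketch of both ideas (function names and example numbers are invented; the study's actual estimation may differ):

```python
def weighted_grs(risk_allele_counts, snp_weights):
    """Weighted genetic risk score: sum over SNPs of the risk-allele
    count (0, 1, or 2) times the published per-SNP effect weight."""
    assert len(risk_allele_counts) == len(snp_weights)
    return sum(g * w for g, w in zip(risk_allele_counts, snp_weights))

def wald_ratio(beta_grs_outcome, beta_grs_exposure):
    """Wald ratio estimator: causal effect of exposure on outcome
    = (GRS-outcome association) / (GRS-exposure association)."""
    return beta_grs_outcome / beta_grs_exposure
```

Because alleles are assigned at conception, the GRS is unconfounded by lifestyle factors that bias the cross-sectional associations, which is why the MR and observational estimates above can legitimately diverge.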
Abstract:
N-gram analysis is an approach that investigates the structure of a program using bytes, characters or text strings. This research uses dynamic analysis to investigate malware detection using a classification approach based on N-gram analysis. The motivation for this research is to find a subset of N-gram features that makes a robust indicator of malware. The experiments within this paper represent programs as N-gram density histograms, gained through dynamic analysis. A Support Vector Machine (SVM) is used as the program classifier to determine the ability of N-grams to correctly determine the presence of malicious software. The preliminary findings show that N-gram sizes of N=3 and N=4 present the best avenues for further analysis.
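An N-gram density histogram over a dynamic trace is the count of each length-N sliding window, normalised by the total number of windows. A minimal sketch of this feature extraction step (illustrative; the function name and example opcodes are invented, not the paper's pipeline):

```python
from collections import Counter

def ngram_density(opcode_trace, n):
    """Density histogram of opcode n-grams from a dynamic trace:
    each n-gram's count divided by the total number of n-grams,
    so the densities sum to 1 for a non-empty trace."""
    grams = [tuple(opcode_trace[i:i + n]) for i in range(len(opcode_trace) - n + 1)]
    counts = Counter(grams)
    total = len(grams)
    return {g: c / total for g, c in counts.items()}
```

The resulting fixed-meaning feature vectors (one density per observed n-gram) are what a classifier such as an SVM would consume; the feature count grows rapidly with N, which is the feature-explosion problem noted above.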
Abstract:
This paper presents a study on concrete fracture and the associated mesh sensitivity using the finite element (FE) method with a local concrete model in both tension (Mode I) and compression. To enable the incorporation of dynamic loading, the FE model is developed using the transient dynamic analysis code LS-DYNA Explicit. A series of investigations has been conducted on typical fracture scenarios to evaluate the model's performance and calibrate the relevant parameters. The K&C damage model was adopted because it is a comprehensive local concrete model that allows the user to change the crack band width, fracture energy and rate dependency of the material. Modelling of compressive localisation is also discussed in detail. Finally, an impact test specimen is modelled.
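The reason crack band width and fracture energy matter for mesh sensitivity is the crack band concept: the softening law is rescaled with element size so that the energy dissipated per unit crack area stays equal to the fracture energy regardless of the mesh. A generic sketch of that regularisation for linear softening (this illustrates the general crack band idea, not the K&C model's specific implementation; the function name is invented):

```python
def ultimate_crack_strain(G_f, f_t, h):
    """Crack band regularisation for linear tensile softening:
    the strain at zero stress is scaled with crack band width h so that
    the triangular stress-strain area times h equals the fracture
    energy G_f (J/m^2), for tensile strength f_t (Pa), independent of mesh size."""
    return 2.0 * G_f / (f_t * h)
```

For any element size h, the dissipated energy per unit area, 0.5 * f_t * eps_ultimate * h, recovers G_f exactly, which is why a local model without this scaling shows the mesh sensitivity studied in the paper.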
Abstract:
As a newly invented parallel kinematic machine (PKM), Exechon has attracted intensive attention from both academic and industrial fields due to its conceptual high performance. Nevertheless, the dynamic behaviors of Exechon PKM have not been thoroughly investigated because of its structural and kinematic complexities. To identify the dynamic characteristics of Exechon PKM, an elastodynamic model is proposed with the substructure synthesis technique in this paper. The Exechon PKM is divided into a moving platform subsystem, a fixed base subsystem and three limb subsystems according to its structural features. Differential equations of motion for the limb subsystem are derived through finite element (FE) formulations by modeling the complex limb structure as a spatial beam with corresponding geometric cross sections. Meanwhile, revolute, universal, and spherical joints are simplified into virtual lumped springs associated with equivalent stiffnesses and mass at their geometric centers. Differential equations of motion for the moving platform are derived with Newton's second law after treating the platform as a rigid body due to its comparatively high rigidity. After introducing the deformation compatibility conditions between the platform and the limbs, governing differential equations of motion for the Exechon PKM are derived. The solution of the characteristic equations leads to natural frequencies and corresponding modal shapes of the PKM at any typical configuration. In order to predict the dynamic behaviors in a quick manner, an algorithm is proposed to numerically compute the distributions of natural frequencies throughout the workspace. Simulation results reveal that the lower natural frequencies are strongly position-dependent and distributed axisymmetrically due to the structural symmetry of the limbs.
At the last stage, a parametric analysis is carried out to identify the effects of structural, dimensional, and stiffness parameters on the system's dynamic characteristics with the purpose of providing useful information for optimal design and performance improvement of the Exechon PKM. The elastodynamic modeling methodology and dynamic analysis procedure can be well extended to other overconstrained PKMs with minor modifications.
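The natural frequencies described above come from the undamped characteristic equation det(K - ω²M) = 0 for the assembled stiffness and mass matrices. A toy stand-in for that eigenproblem, solved analytically for a 2-DOF system (illustrative only; the PKM's FE model has many more DOFs and would use a numerical eigensolver):

```python
import math

def natural_frequencies_2dof(K, M):
    """Undamped natural frequencies (rad/s) of a 2-DOF system from
    det(K - w^2 M) = 0, expanded as a quadratic in lambda = w^2."""
    a = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    b = -(K[0][0] * M[1][1] + K[1][1] * M[0][0]
          - K[0][1] * M[1][0] - K[1][0] * M[0][1])
    c = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    disc = math.sqrt(b * b - 4.0 * a * c)
    lams = sorted([(-b - disc) / (2.0 * a), (-b + disc) / (2.0 * a)])
    return [math.sqrt(lam) for lam in lams]
```

Sweeping such a computation over sampled platform positions, with K reassembled at each configuration, is the essence of the workspace frequency-distribution algorithm mentioned above.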
Abstract:
Over the past few decades, the early medieval Easter controversy has increasingly been portrayed as a conflict between the ‘Celtic’ and the ‘Roman’ churches, limiting the geographical extent of this most vibrant debate to Britain and Ireland (with the exception of the disputes caused by Columbanus’ appearance on the Continent). Neither is the case. Before c.AD 800, there was no unanimity within the ‘Roman’ cause. Two ‘Roman’ Easter reckonings existed, which could not be reconciled: one invented by Victorius of Aquitaine in AD 457, the other being the Alexandrian system as translated into Latin by Dionysius Exiguus in AD 525. The conflict between followers of Victorius and adherents of Dionysius occurred in Visigothic Spain first, reached Ireland in the second half of the 7th century, and finally dominated the intellectual debate in Francia in the 8th century. This article will focus on the Irish dimension of this controversy. It is argued that the southern Irish clergy introduced the Victorian reckoning in the AD 630s and strictly adhered to that system until the end of the 7th century. When Adomnan, the abbot of Iona, converted to Dionysius in the late AD 680s and convinced most of the northern Irish churches to follow his example, this caused considerable tension with southern Irish followers of Victorius, as is impressively witnessed by the computistical literature of the time, especially the texts produced in AD 689. From this literature, the issues debated at the time are reconstructed. This analysis has serious consequences for how we read Irish history towards the end of the 7th century; rather than bringing the formerly ‘Celtic’ northern Irish clergy in line with southern Irish ‘Roman’ practice, Adomnan added a new dimension to the conflict.
Abstract:
The research presented investigates the optimal set of operational codes (opcodes) that creates a robust indicator of malicious software (malware), and also determines the program execution duration required for accurate classification of benign and malicious software. The features extracted from the dataset are opcode density histograms, obtained during program execution. The classifier used is a support vector machine, configured to select the features that produce the optimal classification of malware over different program run lengths. The findings demonstrate that malware can be detected using dynamic analysis with relatively few opcodes.