993 results for binary analysis


Relevance:

30.00%

Publisher:

Abstract:

We analyze the far-field intensity distribution of binary phase gratings whose strips present a certain randomness in their height. A statistical analysis based on the mutual coherence function is performed in the plane just after the grating. The mutual coherence function is then propagated to the far field and the intensity distribution is obtained. In general, the intensity of the diffraction orders decreases in comparison to that of the ideal perfect grating. Several important limit cases, such as low- and high-randomness perturbed gratings, are analyzed. In the high-randomness limit, the phase grating is equivalent to an amplitude grating plus a “halo.” Although these structures are not purely periodic, they behave approximately as a diffraction grating.
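The order-intensity decrease can be illustrated numerically. The sketch below is an assumption for illustration (not the authors' model): it builds a binary π-phase grating whose phase strips carry Gaussian height errors, and reads the diffraction-order intensities off an FFT of the transmitted field. The function name and parameters are hypothetical.

```python
import numpy as np

def order_intensities(phase=np.pi, sigma=0.0, periods=64, pts=32, seed=0):
    """Far-field order intensities of a binary phase grating whose
    phase strips are perturbed by Gaussian height randomness."""
    rng = np.random.default_rng(seed)
    t = []
    for _ in range(periods):
        dphi = rng.normal(0.0, sigma)            # random height error per strip
        t.extend([1.0] * (pts // 2))             # flat half of the period
        t.extend([np.exp(1j * (phase + dphi))] * (pts // 2))  # phase strip
    t = np.asarray(t)
    spec = np.fft.fft(t) / t.size                # far field (Fraunhofer) via FFT
    intensity = np.abs(spec) ** 2
    # diffraction orders sit every `periods` bins in the FFT
    return {m: intensity[(m * periods) % t.size] for m in (-3, -1, 0, 1, 3)}
```

For the unperturbed π grating the first-order intensity is close to the textbook value (2/π)² and the zero order vanishes; adding height randomness lowers the order intensities, as the abstract describes.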

Relevance:

30.00%

Publisher:

Abstract:

Primary hyperparathyroidism (PHPT) is a common endocrine neoplastic disorder caused by a failure of calcium sensing secondary to tumour development in one or more of the parathyroid glands. Parathyroid adenomas are composed of distinct cellular subpopulations of variable clonal status that exhibit differing degrees of calcium responsiveness. To gain a clearer understanding of the relationship among cellular identity, tumour composition and clinical biochemistry in PHPT, we developed a novel single cell platform for quantitative evaluation of calcium sensing behaviour in freshly resected human parathyroid tumour cells. Live-cell intracellular calcium flux was visualized through Fluo-4-AM epifluorescence, followed by in situ immunofluorescence detection of the calcium sensing receptor (CASR), a central component in the extracellular calcium signalling pathway. The reactivity of individual parathyroid tumour cells to extracellular calcium stimulus was highly variable, with discrete kinetic response patterns observed both between and among parathyroid tumour samples. CASR abundance was not an obligate determinant of calcium responsiveness. Calcium EC50 values from a series of parathyroid adenomas revealed that the tumours segregated into two distinct categories. One group manifested a mean EC50 of 2.40 mM (95% CI: 2.37-2.41), closely aligned to the established normal range. The second group was less responsive to calcium stimulus, with a mean EC50 of 3.61 mM (95% CI: 3.45-3.95). This binary distribution indicates the existence of a previously unappreciated biochemical sub-classification of PHPT tumours, possibly reflecting distinct etiological mechanisms. Recognition of quantitative differences in calcium sensing could have important implications for the clinical management of PHPT.
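EC50 estimation of the kind reported here is commonly done with a four-parameter Hill (dose-response) fit. The sketch below is a generic stand-in (SciPy on synthetic data), not the study's actual pipeline; `hill` and `fit_ec50` are hypothetical names.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(ca, bottom, top, ec50, n):
    """Four-parameter Hill (dose-response) curve over extracellular [Ca2+] (mM)."""
    return bottom + (top - bottom) / (1.0 + (ec50 / ca) ** n)

def fit_ec50(ca, response):
    """Return the fitted EC50 (mM) for one cell's responses to a calcium ramp."""
    p0 = [float(response.min()), float(response.max()), float(np.median(ca)), 4.0]
    popt, _ = curve_fit(hill, ca, response, p0=p0, maxfev=10000)
    return popt[2]
```

On a synthetic response curve with a true EC50 of 2.40 mM (the mean of the study's normally responsive group), the fit recovers the input value.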

Relevance:

30.00%

Publisher:

Abstract:

Object-oriented design and object-oriented languages support the development of independent software components such as class libraries. When using such components, versioning becomes a key issue. While various ad-hoc techniques and coding idioms have been used to provide versioning, all of these techniques have deficiencies: ambiguity, the necessity of recompilation or re-coding, or the loss of binary compatibility of programs. Components from different software vendors are versioned at different times. Maintaining compatibility between versions must be consciously engineered. New technologies such as distributed objects further complicate libraries by requiring multiple implementations of a type simultaneously in a program. This paper describes a new C++ object model called the Shared Object Model for C++ users and a new implementation model called the Object Binary Interface for C++ implementors. These techniques provide a mechanism for allowing multiple implementations of an object in a program. Early analysis of this approach has shown its performance to be broadly comparable to conventional implementations.

Relevance:

30.00%

Publisher:

Abstract:

Two direct sampling correlator-type receivers for differential chaos shift keying (DCSK) communication systems under frequency non-selective fading channels are proposed. These receivers operate on the same hardware platform with different architectures. In the first scheme, namely the sum-delay-sum (SDS) receiver, the sum of all samples in a chip period is correlated with its delayed version. The correlation value obtained in each bit period is then compared with a fixed threshold to decide the binary value of the recovered bit at the output. In the second scheme, namely the delay-sum-sum (DSS) receiver, the correlation value of all samples with their delayed versions is calculated in each chip period. The sum of correlation values in each bit period is then compared with the threshold to recover the data. The conventional DCSK transmitter, the frequency non-selective Rayleigh fading channel, and the two proposed receivers are mathematically modelled in the discrete-time domain. The authors evaluated the bit error rate performance of the receivers by means of both theoretical analysis and numerical simulation. The performance comparison shows that the two proposed receivers perform well under the studied channel, that performance improves as the number of paths increases, and that the DSS receiver outperforms the SDS one.
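The DCSK correlator decision rule can be sketched in discrete time. The noise-free, baseband sketch below is an illustrative assumption, not the authors' model: it uses a Chebyshev chaotic map, one reference half-frame and one data half-frame per bit, and a zero threshold (the natural fixed threshold for antipodal data).

```python
import numpy as np

def chebyshev_map(n, x0=0.7):
    """Chaotic reference samples from the Chebyshev map x -> 1 - 2x^2 on [-1, 1],
    a common choice of chaos generator for DCSK."""
    x = np.empty(n)
    for i in range(n):
        x0 = 1.0 - 2.0 * x0 * x0
        x[i] = x0
    return x

def dcsk_transmit(bits, m):
    """Each bit maps to a frame: [reference chips | +/- reference chips]."""
    frames = []
    for b in bits:
        ref = chebyshev_map(m, x0=np.random.uniform(0.1, 0.9))
        frames.append(np.concatenate([ref, (1 if b else -1) * ref]))
    return np.concatenate(frames)

def dcsk_receive(signal, m):
    """Correlate the data half of each frame with its delayed reference half,
    then threshold at zero to recover the bit."""
    bits = []
    for k in range(0, len(signal), 2 * m):
        ref, data = signal[k:k + m], signal[k + m:k + 2 * m]
        bits.append(int(np.dot(ref, data) > 0.0))
    return bits
```

With no channel distortion the correlation equals ±Σref², so every bit is recovered exactly; fading and multipath, as studied in the paper, perturb this statistic.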

Relevance:

30.00%

Publisher:

Abstract:

This study describes the design and characterisation of the rheological and mechanical properties of binary polymeric systems composed of 2-Hydroxypropylcellulose and ɩ-carrageenan, intended for use as ophthalmic viscoelastic devices (OVDs). Platforms were characterised using dilute-solution, flow and oscillatory rheometry and texture profile analysis. Rheological synergy between the two polymers was observed in both the dilute and gel states. All platforms exhibited pseudoplastic flow. Increasing polymer concentrations significantly decreased the loss tangent and rate index yet increased the storage and loss moduli, consistency, gel hardness, compressibility and adhesiveness, the latter being related to the in-vivo retention properties of the platforms. Binary polymeric platforms exhibited unique physicochemical properties that could not be engineered using mono-polymeric platforms. Using characterisation methods that provide information relevant to their clinical performance, low-cost binary platforms (3% hydroxypropylcellulose and either 1% or 2% ɩ-carrageenan) were identified that exhibited rheological, textural and viscoelastic properties advantageous for use as OVDs.
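The "consistency" and "rate index" reported for pseudoplastic flow are the parameters of the Ostwald-de Waele power-law model. As a minimal sketch (assumed model, synthetic data, hypothetical function name), the parameters can be recovered by a straight-line fit in log-log space:

```python
import numpy as np

def fit_power_law(shear_rate, viscosity):
    """Fit eta = K * gamma_dot**(n - 1) (Ostwald-de Waele) in log-log space.
    Returns (K, n): K is the consistency, n the rate index;
    n < 1 indicates pseudoplastic (shear-thinning) flow."""
    slope, intercept = np.polyfit(np.log(shear_rate), np.log(viscosity), 1)
    return float(np.exp(intercept)), float(slope + 1.0)
```

On synthetic data generated with K = 5 and n = 0.4 the fit returns the inputs, and n < 1 flags the shear-thinning behaviour the abstract describes.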

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to determine if a high-Tg polymer (Eudragit® S100) could be used to stabilize amorphous domains of polyethylene oxide (PEO) and hence improve the stability of binary polymer systems containing celecoxib (CX). We propose a novel method of stabilizing the amorphous PEO solid dispersion through inclusion of a miscible, high-Tg polymer that can form strong inter-polymer interactions. The effects of inter-polymer interactions and miscibility between PEO and Eudragit S100 are considered. Polymer blends were first manufactured via hot-melt extrusion at different PEO/S100 ratios (70/30, 50/50, and 30/70 wt/wt). Differential scanning calorimetry and dynamic mechanical thermal analysis data suggested good miscibility between PEO and S100 polymer blends, particularly at the 50/50 ratio. To further evaluate the system, CX/PEO/S100 ternary mixtures were extruded. Immediately after hot-melt extrusion, a single Tg that increased with increasing S100 content (anti-plasticization) was observed in all ternary systems. The absence of crystalline Bragg peaks in powder X-ray diffractometry also suggested amorphization of CX. Upon storage (40°C/75% relative humidity), the formulation containing PEO/S100 at a ratio of 50:50 was shown to be the most stable. Fourier transform infrared studies confirmed the presence of hydrogen bonding between Eudragit S100 and PEO, suggesting this was the principal reason for stabilization of the amorphous CX/PEO solid dispersion system.
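A single, composition-dependent Tg in a miscible blend is often summarised with the Gordon-Taylor equation. The sketch below is illustrative only: the Tg inputs are assumed literature-style values (PEO ≈ 213 K, Eudragit S100 ≈ 433 K), not the study's measurements, and the Gordon-Taylor constant k is a free fitting parameter.

```python
def gordon_taylor(w_peo, tg_peo, tg_s100, k):
    """Gordon-Taylor prediction of a miscible PEO/S100 blend's single Tg (K).
    w_peo is the PEO weight fraction; k is the fitted Gordon-Taylor constant."""
    w_s100 = 1.0 - w_peo
    return (w_peo * tg_peo + k * w_s100 * tg_s100) / (w_peo + k * w_s100)
```

Because Tg(S100) > Tg(PEO), the predicted blend Tg rises monotonically with S100 content, mirroring the anti-plasticization trend observed after extrusion.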

Relevance:

30.00%

Publisher:

Abstract:

Unflattering representations of salesmanship in mass media exist in abundance. In order to gauge the depiction of selling in mass media, this article explores the nature and public perceptions of salesmanship using editorial cartoons. A theory of cartooning suggests that editorial cartoons reflect public sentiment toward events and issues and therefore provide a useful way of measuring and tracking such sentiment over time. The criteria of narrative, location, binary struggle, normative transference, and metaphor were used as a framework to analyze 286 cartoons over a 30-year period from 1983 to 2013. The results suggest that while representations of the characteristics and behaviors of salespeople shifted very little across time periods, changes in public perceptions of seller–buyer conflict, the role of the customer, and selling techniques were observed, thus indicating that cartoons are sensitive enough to measure the portrayal of selling.

Relevance:

30.00%

Publisher:

Abstract:

The basic objective of this work is to evaluate the durability of self-compacting concrete (SCC) produced in binary and ternary mixes using fly ash (FA) and limestone filler (LF) as partial replacement of cement. The main characteristics that set SCC apart from conventional concrete (fundamentally its fresh-state behaviour) essentially depend on the greater or lesser content of various constituents, namely: greater mortar volume (more ultrafine material in the form of cement and mineral additions); proper control of the maximum size of the coarse aggregate; use of admixtures such as superplasticizers. Significant amounts of mineral additions are thus incorporated to partially replace cement, in order to improve the workability of the concrete. These mineral additions necessarily affect the concrete's microstructure and its durability. Therefore, notwithstanding the many well-documented and acknowledged advantages of SCC, a better understanding of its behaviour is still required, in particular when its composition includes significant amounts of mineral additions. An ambitious working plan was devised: first, the SCC's microstructure was studied and characterized; afterwards, the main transport and degradation mechanisms of the SCC produced were studied and characterized by means of SEM image analysis, chloride migration, electrical resistivity, and carbonation tests. It was then possible to draw conclusions about the SCC's durability. The properties studied are strongly affected by the type and content of the additions. Also, the use of ternary mixes proved to be extremely favourable, confirming the expected beneficial effect of the synergy between LF and FA.

Relevance:

30.00%

Publisher:

Abstract:

SQL Injection Attack (SQLIA) remains a technique used by computer network intruders to pilfer an organisation's confidential data. An intruder re-crafts a web form's input and the query strings used in web requests with malicious intent to compromise the security of the organisation's confidential data stored in the back-end database. The database is the most valuable data source, and thus intruders are unrelenting in constantly evolving new techniques to bypass the signature-based solutions currently provided in Web Application Firewalls (WAF) to mitigate SQLIA. There is therefore a need for an automated, scalable methodology for pre-processing SQLIA features fit for a supervised learning model. However, obtaining a ready-made, scalable dataset that is feature-engineered with numerical attributes to train Artificial Neural Network (ANN) and Machine Learning (ML) models is a known issue in applying artificial intelligence to effectively address ever-evolving novel SQLIA signatures. The proposed approach applies a numerical-attributes encoding ontology to encode features (both legitimate web requests and SQLIA) as numerical data items, so as to extract a scalable dataset for input to a supervised learning model, moving towards an ML SQLIA detection and prevention model. In the numerical encoding of features, the proposed model explores a hybrid of static and dynamic pattern matching by implementing a Non-Deterministic Finite Automaton (NFA). This is combined with a proxy and a SQL parser Application Programming Interface (API) to intercept and parse web requests in transit to the back-end database. In developing a solution to address SQLIA, this model allows web requests processed at the proxy and deemed to contain an injected query string to be excluded from reaching the target back-end database.
This paper evaluates the performance metrics of a dataset obtained by numerical encoding of the features ontology in Microsoft Azure Machine Learning (MAML) studio using a Two-Class Support Vector Machine (TCSVM) binary classifier. This methodology then forms the subject of the empirical evaluation.
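A minimal sketch of the numerical-encoding idea: raw query strings are mapped to numeric feature vectors and fed to a two-class linear SVM. Everything here is a stand-in for illustration only: regex token counts replace the paper's NFA-based pattern matching, scikit-learn's `SVC` replaces MAML's Two-Class SVM, and the patterns and toy requests are invented.

```python
import re
from sklearn.svm import SVC

# hypothetical suspicious tokens; the paper's real encoding uses an NFA
PATTERNS = [r"'", r"--", r";", r"\bor\b", r"\bunion\b", r"\bselect\b",
            r"=", r"\bdrop\b", r"/\*"]

def encode(request: str):
    """Map a raw query string to a numeric vector: one count per pattern,
    plus the overall string length."""
    q = request.lower()
    return [len(re.findall(p, q)) for p in PATTERNS] + [len(q)]

# toy training set: (request, label) with 1 = injection attempt
train = [
    ("id=42", 0), ("name=alice", 0), ("page=3&sort=asc", 0), ("q=shoes", 0),
    ("id=1' or '1'='1", 1), ("u=x'; drop table users;--", 1),
    ("id=0 union select pass from users", 1), ("v=1'/*", 1),
]
X = [encode(r) for r, _ in train]
y = [label for _, label in train]
clf = SVC(kernel="linear").fit(X, y)
```

Once trained, the classifier flags unseen injected strings while passing benign parameters, which is the proxy-side exclusion decision the model describes.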

Relevance:

30.00%

Publisher:

Abstract:

Many existing encrypted Internet protocols leak information through packet sizes and timing. Though seemingly innocuous, prior work has shown that such leakage can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of such critical services as e-commerce, remote login, and anonymity networks and the increasing feasibility of attacks on these services represent a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements. Furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is our use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect TrafficMimic. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable, and, in some cases, more efficient than generic constant-rate defenses. 
We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate parameters for an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, we use this model to better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that we can use to match the cover traffic to the real traffic while simultaneously bounding external information leakage. We study these bias methods using simulation and evaluate their security using a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic analysis and defense detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide 3-5x improvement in bandwidth for bulk transfers and 2.5-9.5x speedup for Web browsing over tunneling without biasing.
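The packet-size side channel this work defends against can be shown in a few lines: encryption hides payloads but not lengths, so a simple (packet count, total bytes) fingerprint often identifies which object was fetched. The page names and size profiles below are invented for illustration.

```python
def fingerprint(packet_sizes):
    """Summarize an encrypted transfer by its packet count and total bytes,
    the features that leak despite encryption."""
    return (len(packet_sizes), sum(packet_sizes))

# hypothetical size profiles previously observed for two pages
PROFILES = {"login": (4, 5260), "dashboard": (11, 15824)}

def identify(observed_sizes):
    """Nearest-profile guess at which page an encrypted transfer carried."""
    n, total = fingerprint(observed_sizes)
    return min(PROFILES,
               key=lambda p: abs(PROFILES[p][0] - n) + abs(PROFILES[p][1] - total))
```

Cover traffic of the kind TrafficMimic generates works by making these observable fingerprints follow a realistic model that is independent of the real payload.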

Relevance:

30.00%

Publisher:

Abstract:

In this work the split-field finite-difference time-domain method (SF-FDTD) has been extended for the analysis of two-dimensionally periodic structures with third-order nonlinear media. The accuracy of the method is verified by comparison with the nonlinear Fourier Modal Method (FMM). Once the formalism has been validated, examples of one- and two-dimensional nonlinear gratings are analysed. In the 2D case, the shifting in resonant waveguides is corroborated. Not only the scalar Kerr effect is considered: the tensorial nature of the third-order nonlinear susceptibility is also included. The use of nonlinear materials in this kind of device makes it possible to design tunable devices such as variable band filters. However, the third-order nonlinear susceptibility is usually small, and high intensities are needed to trigger the nonlinear effect. Here, a one-dimensional CBG is analysed in both the linear and nonlinear regimes, and the shifting of the resonance peaks in both TE and TM is obtained numerically. Applying a numerical method based on the finite-difference time-domain method makes it possible to analyse this issue in the time domain; thus bistability curves are also computed by means of the numerical method. These curves show how the nonlinear effect modifies the properties of the structure as a function of a variable input pump field. When the nonlinear behaviour is taken into account, the estimation of the electric field components becomes more challenging. In this paper, we present a set of acceleration strategies based on parallel software and hardware solutions.
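The resonance shifting described here can be pictured with two textbook relations: the scalar Kerr index n = n0 + n2·I and the first-order Bragg condition λ_B = 2nΛ. The sketch below uses illustrative numbers (index, n2, intensity, period), not values from the paper, and is far simpler than the SF-FDTD computation it stands in for.

```python
def kerr_index(n0, n2, intensity):
    """Intensity-dependent refractive index under the scalar Kerr effect."""
    return n0 + n2 * intensity

def bragg_wavelength(n_eff, period_nm):
    """First-order Bragg resonance of a 1D grating (CBG): lambda = 2 * n * Lambda."""
    return 2.0 * n_eff * period_nm

# illustrative: a strong pump raises the index and red-shifts the resonance
low = bragg_wavelength(kerr_index(1.50, 0.0, 0.0), 500.0)       # no pump
high = bragg_wavelength(kerr_index(1.50, 1e-18, 5e15), 500.0)   # intense pump
```

The small n2 multiplied by a large pump intensity is what produces a measurable peak shift, which is why high intensities are needed to trigger the effect.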

Relevance:

30.00%

Publisher:

Abstract:

Background Many acute stroke trials have given neutral results. Sub-optimal statistical analyses may be failing to detect efficacy. Methods which take account of the ordinal nature of functional outcome data are more efficient. We compare sample size calculations for dichotomous and ordinal outcomes for use in stroke trials. Methods Data from stroke trials studying the effects of interventions known to positively or negatively alter functional outcome – Rankin Scale and Barthel Index – were assessed. Sample size was calculated using comparisons of proportions, means, medians (according to Payne), and ordinal data (according to Whitehead). The sample sizes gained from each method were compared using Friedman two-way ANOVA. Results Fifty-five comparisons (54 173 patients) of active vs. control treatment were assessed. Estimated sample sizes differed significantly depending on the method of calculation (P < 0.0001). The ordering of the methods showed that the ordinal method of Whitehead and the comparison of means produced significantly lower sample sizes than the other methods. The ordinal data method on average reduced sample size by 28% (inter-quartile range 14–53%) compared with the comparison of proportions; however, a 22% increase in sample size was seen with the ordinal method for trials assessing thrombolysis. The comparison of medians method of Payne gave the largest sample sizes. Conclusions Choosing an ordinal rather than a binary method of analysis allows most trials to be, on average, smaller by approximately 28% for a given statistical power. Smaller trial sample sizes may help by reducing time to completion, complexity, and financial expense. However, ordinal methods may not be optimal for interventions which both improve functional outcome
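Whitehead's ordinal sample-size formula mentioned in the Methods can be sketched directly. The formula is standard for proportional-odds outcomes; the category proportions and odds ratio below are made-up inputs for illustration.

```python
import math
from statistics import NormalDist

def whitehead_total_n(mean_props, odds_ratio, alpha=0.05, power=0.9):
    """Total sample size (two equal arms) for an ordinal outcome under the
    proportional-odds assumption (Whitehead):
        N = 6 * (z_{1-a/2} + z_{power})**2 / ((ln OR)**2 * (1 - sum(p_k**3)))
    where p_k are the mean category proportions across both arms."""
    z = NormalDist().inv_cdf
    za, zb = z(1 - alpha / 2), z(power)
    denom = math.log(odds_ratio) ** 2 * (1 - sum(p ** 3 for p in mean_props))
    return 6 * (za + zb) ** 2 / denom
```

More (finer) ordinal categories shrink the sum of cubed proportions and hence the required N, which is the efficiency gain over dichotomous analysis that the abstract quantifies.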

Relevance:

30.00%

Publisher:

Abstract:

Background and Purpose—Vascular prevention trials mostly count “yes/no” (binary) outcome events, e.g., stroke/no stroke. Analysis of ordered categorical vascular events (e.g., fatal stroke/nonfatal stroke/no stroke) is clinically relevant and could be more powerful statistically. Although this is not a novel idea in the statistical community, ordinal outcomes have not been applied to stroke prevention trials in the past. Methods—Summary data on stroke, myocardial infarction, combined vascular events, and bleeding were obtained by treatment group from published vascular prevention trials. Data were analyzed using 10 statistical approaches which allow comparison of 2 ordinal or binary treatment groups. The results for each statistical test for each trial were then compared using Friedman 2-way analysis of variance with multiple comparison procedures. Results—Across 85 trials (335 305 subjects) the test results differed substantially, so that approaches which used the ordinal nature of stroke events (fatal/nonfatal/no stroke) were more efficient than those which combined the data to form 2 groups (P < 0.0001). The most efficient tests were bootstrapping the difference in mean rank, the Mann–Whitney U test, and ordinal logistic regression; 4- and 5-level data were more efficient still. Similar findings were obtained for myocardial infarction, combined vascular outcomes, and bleeding. The findings were consistent across different types, designs and sizes of trial, and for the different types of intervention. Conclusions—When analyzing vascular events from prevention trials, statistical tests which use ordered categorical data are more efficient and are more likely to yield reliable results than binary tests. This approach gives additional information on treatment effects by severity of event and will allow trials to be smaller. (Stroke. 2008;39:000-000.)
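The efficiency gain can be demonstrated on invented three-level outcome data: a Mann–Whitney U test on fatal/nonfatal/no-stroke codes typically yields a smaller P value than a binary test on the collapsed stroke/no-stroke table. The counts below are made up for illustration and are not from any of the 85 trials.

```python
from scipy.stats import mannwhitneyu, fisher_exact

# per-subject ordered outcomes: 0 = no stroke, 1 = nonfatal, 2 = fatal (invented)
treated = [0] * 180 + [1] * 18 + [2] * 2
control = [0] * 165 + [1] * 21 + [2] * 14

# ordinal analysis keeps the severity ordering
p_ordinal = mannwhitneyu(treated, control, alternative="two-sided").pvalue

# binary analysis collapses fatal and nonfatal into "any stroke"
events_t, none_t = sum(x > 0 for x in treated), sum(x == 0 for x in treated)
events_c, none_c = sum(x > 0 for x in control), sum(x == 0 for x in control)
p_binary = fisher_exact([[events_t, none_t], [events_c, none_c]])[1]
```

Both tests see the same 20 vs. 35 event counts, but only the ordinal test also credits the shift from fatal to nonfatal strokes, so it reaches significance on less extreme data.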

Relevance:

30.00%

Publisher:

Abstract:

Background and Aim: Maternal morbidity and mortality statistics remain unacceptably high in Malawi. Prominent among the risk factors in the country is anaemia in pregnancy, which generally results from nutritional inadequacy (particularly iron deficiency) and malaria, among other factors. This warrants concerted efforts to increase iron intake among reproductive-age women. This study, among women in Malawi, examined factors determining intake of supplemental iron for at least 90 days during pregnancy. Methods: A weighted sample of 10,750 women (46.7%), drawn from the 23,020 respondents of the 2010 Malawi Demographic and Health Survey (MDHS), was used for the study. Univariate, bivariate, and regression techniques were employed. While univariate analysis revealed the percent distributions of all variables, bivariate analysis was used to examine the relationships between individual independent variables and adherence to iron supplementation. Chi-square tests of independence were conducted for categorical variables, with the significance level set at P < 0.05. Two binary logistic regression models were used to evaluate the net effect of independent variables on iron supplementation adherence. Results: Thirty-seven percent of the women adhered to the iron supplementation recommendations during pregnancy. Multivariate analysis indicated that younger age, urban residence, higher education, higher wealth status, and attending antenatal care during the first trimester were significantly associated with increased odds of taking iron supplementation for 90 days or more during pregnancy (P < 0.01). Conclusions: The results indicate low adherence to the World Health Organization’s iron supplementation recommendations among pregnant women in Malawi, and this contributes to negative health outcomes for both mothers and children.
Education interventions targeting populations with low rates of iron supplement intake, including campaigns to increase the number of women who attend antenatal care clinics in the first trimester, are recommended to increase adherence to iron supplementation recommendations.
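The binary logistic models in the Methods can be sketched generically. Everything below is a simulated stand-in: the covariates, effect sizes, and data are invented, and scikit-learn's unweighted `LogisticRegression` replaces the survey-weighted models fitted to the MDHS data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
# hypothetical covariates mirroring the MDHS analysis (all invented)
urban  = rng.integers(0, 2, n)   # 1 = urban residence
educ   = rng.integers(0, 4, n)   # education level: 0 none .. 3 higher
anc_t1 = rng.integers(0, 2, n)   # 1 = antenatal care visit in first trimester

# simulate adherence (>= 90 days of iron) with positive effects for each factor
logit = -1.5 + 0.6 * urban + 0.4 * educ + 0.8 * anc_t1
adhere = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([urban, educ, anc_t1])
model = LogisticRegression().fit(X, adhere)
odds_ratios = np.exp(model.coef_[0])   # exponentiated coefficients
```

Exponentiating the fitted coefficients gives odds ratios; values above 1 reproduce the direction of the reported associations (urban residence, higher education, early antenatal care) on this simulated data.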