919 results for Class analysis


Relevance: 30.00%

Abstract:

This paper introduces a new stochastic clustering methodology devised for the analysis of categorized or sorted data. The methodology reveals consumers' common category knowledge as well as individual differences in using this knowledge to classify brands in a designated product class. A small study involving the categorization of 28 brands of U.S. automobiles is presented, in which the results of the proposed methodology are compared with those obtained from K-means clustering. Finally, directions for future research are discussed.
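
As a concrete illustration of the K-means baseline mentioned above (not the paper's stochastic methodology), the sketch below builds a brand-by-brand co-occurrence matrix from hypothetical sorting data and clusters it with scikit-learn; all data and names are invented.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical sorting data: each consumer partitions the same set of brands
# into self-defined categories (lists of brand indices).
n_brands = 6
sorts = [
    [[0, 1, 2], [3, 4, 5]],          # consumer 1
    [[0, 1], [2, 3], [4, 5]],        # consumer 2
    [[0, 1, 2, 3], [4, 5]],          # consumer 3
]

# Brand-by-brand co-occurrence matrix: how often two brands are sorted together.
cooc = np.zeros((n_brands, n_brands))
for sort in sorts:
    for group in sort:
        for i in group:
            for j in group:
                cooc[i, j] += 1
cooc /= len(sorts)  # proportion of consumers grouping each pair together

# Baseline: K-means on the rows of the co-occurrence matrix
# (each brand is described by its grouping profile with every other brand).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(cooc)
print("brand cluster labels:", labels)
```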

Relevance: 30.00%

Abstract:

Background: Digital forensics is a rapidly expanding field, due to the continuing advances in computer technology and increases in the data storage capabilities of devices. However, the tools supporting digital forensics investigations have not kept pace with this evolution, often leaving the investigator to analyse large volumes of textual data and rely heavily on their own intuition and experience. Aim: This research proposes that, given the ability of information visualisation to provide an end user with an intuitive way to rapidly analyse large volumes of complex data, such approaches could be applied to digital forensics datasets. Such methods are investigated, supported by a review of the literature on the use of these techniques in other fields. The hypothesis of this body of research is that by utilising exploratory information visualisation techniques in a tool to support digital forensic investigations, gains in investigative effectiveness can be realised. Method: To test the hypothesis, this research examines three case studies which look at different forms of information visualisation and their implementation with a digital forensic dataset. Two of these case studies take the form of prototype tools developed by the researcher, and one case study utilises a tool created by a third-party research group. A pilot study by the researcher is conducted on these cases, with the strengths and weaknesses of each carried into the next case study. The culmination of these case studies is a prototype tool that presents a timeline visualisation of user behaviour on a device. This tool was subjected to an experiment involving a class of university digital forensics students who were given a number of questions about a synthetic digital forensic dataset. Approximately half were given the prototype tool, named Insight, to use, and the others a common open-source tool. The assessed metrics were: how long the participants took to complete all tasks, how accurate their answers to the tasks were, and how easy the participants found the tasks to complete. Participants were also asked for feedback at multiple points throughout the task. Results: The results showed a statistically significant increase in accuracy for one of the six tasks for the participants using the Insight prototype tool. Participants also found completing two of the six tasks significantly easier when using the prototype tool. There was no statistically significant difference between the completion times of the two participant groups, and no statistically significant differences in the accuracy of participant answers for five of the six tasks. Conclusions: The results from this body of research provide evidence that there is potential for gains in investigative effectiveness when information visualisation techniques are applied to a digital forensic dataset. Specifically, in some scenarios, the investigator can draw conclusions which are more accurate than those drawn when using primarily textual tools. There is also evidence to suggest that investigators reach these conclusions significantly more easily when using a tool with a visual format. None of the scenarios left the investigators at a significant disadvantage in terms of accuracy or usability when using the prototype visual tool rather than the textual tool.
It is noted that this research did not show that the use of information visualisation techniques leads to any statistically significant difference in the time taken to complete a digital forensics investigation.
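
As an illustration of the kind of between-group comparison such an experiment relies on, a nonparametric Mann-Whitney U test on per-participant scores is a common choice; the sketch below uses invented data and is not the thesis's actual analysis.

```python
from scipy.stats import mannwhitneyu

# Hypothetical per-participant accuracy scores (0-1) on one task, for the
# visual prototype group ("Insight") and the textual-tool control group.
insight_scores = [0.9, 0.8, 1.0, 0.7, 0.9, 0.8, 1.0, 0.9]
control_scores = [0.6, 0.7, 0.5, 0.8, 0.6, 0.7, 0.5, 0.6]

# Two-sided Mann-Whitney U test: do the two groups differ in accuracy?
stat, p = mannwhitneyu(insight_scores, control_scores, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")  # p < 0.05 would indicate a significant difference
```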

Relevance: 30.00%

Abstract:

Carbonic anhydrases are enzymes found ubiquitously across organisms; they catalyze the hydration of carbon dioxide to bicarbonate and a proton, and the reverse reaction. They are crucial in respiration, bone resorption, pH regulation, ion transport, and photosynthesis in plants. Of the five classes of carbonic anhydrase (α, β, γ, δ and ζ), this study focused on the α carbonic anhydrases. This class of CAs comprises 16 subfamilies in mammals, including three catalytically inactive enzymes known as Carbonic Anhydrase Related Proteins. Their inactivity is due to the loss of one or more histidine residues in the active site. This thesis aimed at an evolutionary analysis of carbonic anhydrase sequences from organisms whose lineages span back to the Cambrian period. The work was carried out in two phases. The first phase was sequence collection, drawing on many biological sequence databases as sources; it included sequence alignments and both manual and automated sequence analysis incorporating several analysis tools. The second phase was phylogenetic analysis and exploration of the subcellular location of the proteins, which was key to the evolutionary analysis. The methods applied in these two phases achieved the desired results. Several thought-provoking sequences were encountered and analyzed thoroughly, and the phylogenetic analysis produced interesting results that bolster previous findings as well as new ones, laying the groundwork for future, more intensive studies.
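
A minimal sketch of an alignment-to-tree step of the kind the two phases describe, using Biopython; the input file name is hypothetical, and the identity-distance/neighbour-joining choices are illustrative rather than the thesis's actual pipeline.

```python
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

# Hypothetical multiple sequence alignment of alpha carbonic anhydrase sequences.
alignment = AlignIO.read("ca_alpha_alignment.fasta", "fasta")

calculator = DistanceCalculator("identity")      # simple identity-based distances
distance_matrix = calculator.get_distance(alignment)

constructor = DistanceTreeConstructor()
tree = constructor.nj(distance_matrix)           # neighbour-joining tree

Phylo.draw_ascii(tree)                           # quick text rendering of the tree
```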

Relevance: 30.00%

Abstract:

This paper presents the conceptualization and use of a virtual classroom in the course EIF-200 Fundamentos de Informática, the first course in the Information Systems Engineering degree programme of the Universidad Nacional of Costa Rica. The virtual classroom is seen as a complement to the class and is conceived as a space that makes it possible to centralize teaching resources, thereby promoting the democratization of knowledge among students and teachers. Furthermore, this concept of the virtual classroom helps to reduce the culture of individualism often present in university teaching practices, and contributes to creating new opportunities to learn from colleagues within a culture of reflection, analysis and respectful dialogue aimed at improving teaching practice.

Relevance: 30.00%

Abstract:

SQL Injection Attack (SQLIA) remains a technique used by intruders to pilfer an organisation's confidential data. This is done by re-crafting web form inputs and query strings used in web requests, with malicious intent, to compromise the security of confidential data stored in the back-end database. The database is the most valuable data source, and intruders are therefore unrelenting in evolving new techniques to bypass the signature-based solutions currently provided in Web Application Firewalls (WAFs) to mitigate SQLIA. There is therefore a need for an automated, scalable methodology for pre-processing SQLIA features so that they are fit for a supervised learning model. However, obtaining a ready-made, scalable dataset whose items are feature-engineered into numerical attributes for training Artificial Neural Network (ANN) and Machine Learning (ML) models is a known obstacle to applying artificial intelligence to ever-evolving SQLIA signatures. The proposed approach applies a numerical-attribute encoding ontology to encode features (both legitimate web requests and SQLIA) as numerical data items, so as to extract a scalable dataset for input to a supervised learning model, as a step towards an ML-based SQLIA detection and prevention model. In the numerical encoding of features, the proposed model explores a hybrid of static and dynamic pattern matching by implementing a Non-Deterministic Finite Automaton (NFA), combined with a proxy and a SQL parser Application Programming Interface (API) to intercept and parse web requests in transit to the back-end database. In developing a solution to address SQLIA, this model allows web requests processed at the proxy and deemed to contain an injected query string to be blocked from reaching the target back-end database. This paper evaluates the performance metrics of a dataset obtained by the numerical feature-encoding ontology in Microsoft Azure Machine Learning (MAML) Studio, using a Two-Class Support Vector Machine (TCSVM) binary classifier. This methodology then forms the subject of the empirical evaluation.
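
The sketch below illustrates the general idea of encoding web requests as numerical attributes and training a two-class SVM; scikit-learn stands in for the MAML Studio TCSVM module, and the hand-crafted features are hypothetical, not the paper's encoding ontology or NFA.

```python
import re
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Crude, illustrative indicators of SQL injection syntax in a query string.
SQL_TOKENS = ["'", "--", ";", " or ", " union ", " select ", "/*"]

def encode(request: str) -> list:
    """Map a raw query string to a few numerical attributes (hypothetical features)."""
    lowered = request.lower()
    return [len(request),
            sum(lowered.count(tok) for tok in SQL_TOKENS),
            len(re.findall(r"[^\w\s]", request))]   # count of special characters

requests = ["id=42", "name=alice", "id=1' OR '1'='1", "q=1; DROP TABLE users--"]
labels = [0, 0, 1, 1]   # 0 = legitimate, 1 = SQLIA

X = np.array([encode(r) for r in requests])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.5, random_state=0, stratify=labels)

clf = SVC(kernel="rbf").fit(X_train, y_train)   # two-class SVM classifier
print("test accuracy:", clf.score(X_test, y_test))
```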

Relevance: 30.00%

Abstract:

New psychoactive substances (NPSs) have appeared on the recreational drug market at an unprecedented rate in recent years. Many are not new drugs but failed products of the pharmaceutical industry. The speed and variety of drugs entering the market pose a complex new challenge for the forensic toxicology community. The detection of these substances in biological matrices can be difficult, as the exact compounds of interest may not be known. Many NPSs are sold under the same brand name, and therefore users themselves may not know what substances they have ingested. The majority of analytical methods for the detection of NPSs focus on a specific class of compounds rather than a wide variety. In response to this, a robust and sensitive method was developed for the analysis of various NPSs by solid-phase extraction (SPE) with gas chromatography-mass spectrometry (GC-MS). Sample preparation and derivatisation were optimised by testing a range of SPE cartridges and derivatising agents, as well as derivatisation incubation time and temperature. The final GC-MS method was validated in accordance with SWGTOX 2013 guidelines over a wide concentration range for both blood and urine, for 23 and 25 analytes respectively. This included the validation of 8 NBOMe compounds in blood and 10 NBOMe compounds in urine. The GC-MS method was then applied to 8 authentic samples, with concentrations compared to those originally identified by NMS laboratories. The rapid influx of NPSs has resulted in the re-analysis of samples, and thus the stability of these substances is crucial information. The stability of mephedrone was investigated, examining the effect that storage temperatures and preservatives had on analyte stability, daily for 1 week and then weekly for 10 weeks. Several laboratories identify NPS use through the cross-reactivity of these substances with existing screening protocols such as ELISA. The application of Immunalysis ketamine, methamphetamine and amphetamine ELISA kits for the detection of NPSs was evaluated. The aim of this work was to determine whether any cross-reactivity from NPS substances was observed, and whether these existing kits would identify NPS use within biological samples. The cross-reactivity of methoxetamine, 3-MeO-PCE and 3-MeO-PCP with different commercial point-of-care tests (POCT) was also assessed for urine. One of the newest groups of compounds to appear on the NPS market is the NBOMe series. These drugs pose a serious threat to public health due to their high potency, with fatalities already reported in the literature. These compounds are falsely marketed as LSD, which increases the chance of adverse effects due to the potency differences between the two substances. A liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was validated in accordance with SWGTOX 2013 guidelines for the detection of 25B-, 25C- and 25I-NBOMe in urine and hair. Long-Evans rats were administered 25B-, 25C- and 25I-NBOMe at doses ranging from 30-300 µg/kg over a period of 10 days. Tail-flick tests were then carried out on the rats in order to determine whether any analgesic effects resulted from dosing. Rats were also shaved prior to their first dose and re-shaved after the 10-day period. Hair was separated by colour (black and white) and analysed using the validated LC-MS/MS method, assessing the impact hair colour has on the incorporation of these drugs.
Urine was collected from the rats, analysed using the validated LC-MS/MS method and screened for potential metabolites using both LC-MS/MS and quadrupole time of flight (QToF) instrumentation.
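
As a generic illustration of the linearity assessment that underlies validating a quantitative method over a concentration range (not the SWGTOX-specific procedure), a calibration line and its coefficient of determination can be computed as follows; all values are invented.

```python
import numpy as np

# Invented calibration data: spiked concentrations (ng/mL) vs. instrument response
# (e.g. peak-area ratio to internal standard).
conc = np.array([10, 25, 50, 100, 250, 500])
response = np.array([0.021, 0.049, 0.103, 0.198, 0.51, 0.99])

slope, intercept = np.polyfit(conc, response, 1)     # least-squares calibration line
predicted = slope * conc + intercept
ss_res = np.sum((response - predicted) ** 2)
ss_tot = np.sum((response - response.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"slope={slope:.5f}, intercept={intercept:.4f}, R^2={r_squared:.4f}")
```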

Relevance: 30.00%

Abstract:

We develop the a-posteriori error analysis of hp-version interior-penalty discontinuous Galerkin finite element methods for a class of second-order quasilinear elliptic partial differential equations. Computable upper and lower bounds on the error are derived in terms of a natural (mesh-dependent) energy norm. The bounds are explicit in the local mesh size and the local degree of the approximating polynomial. The performance of the proposed estimators within an automatic hp-adaptive refinement procedure is studied through numerical experiments.
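
For orientation, estimators of the kind described typically take a residual-based form; a schematic version (not the paper's exact bound or constants) for a quasilinear operator \(-\nabla\cdot(\mu(|\nabla u|)\nabla u) = f\) is

\[
\|u - u_h\|_{\mathrm{DG}}^2 \;\lesssim\; \sum_{K\in\mathcal{T}_h}\eta_K^2,
\qquad
\eta_K^2 \;=\; \frac{h_K^2}{p_K^2}\,\bigl\|f + \nabla\cdot\bigl(\mu(|\nabla u_h|)\nabla u_h\bigr)\bigr\|_{L^2(K)}^2
\;+\; \frac{h_K}{p_K}\,\bigl\|[\![\mu(|\nabla u_h|)\nabla u_h]\!]\bigr\|_{L^2(\partial K\setminus\partial\Omega)}^2
\;+\; \frac{p_K^2}{h_K}\,\bigl\|[\![u_h]\!]\bigr\|_{L^2(\partial K)}^2,
\]

where \(h_K\) and \(p_K\) are the local mesh size and polynomial degree and \([\![\cdot]\!]\) denotes the inter-element jump; the explicit dependence on \(h_K\) and \(p_K\) is what makes such bounds usable inside an hp-adaptive refinement loop.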

Relevance: 30.00%

Abstract:

The last decades of the 20th century saw the advent of genetic engineering, culminating in the development of techniques such as PCR and Sanger sequencing. This permitted the appearance of new techniques for sequencing whole genomes, known as next-generation sequencing. One of the many applications of these techniques is the in silico search for new secondary metabolites synthesized by microorganisms and exhibiting antimicrobial properties. Peptide antibiotics can be classified into two classes according to their biosynthesis: ribosomal and nonribosomal peptides. Lanthipeptides are the most studied ribosomal peptides and are characterized by the presence of lanthionine and methyllanthionine residues that result from posttranslational modifications. Lanthipeptides are divided into four classes, depending on their biosynthetic machinery. In class I, a LanB enzyme dehydrates serine and threonine residues in the C-terminal part of the precursor peptide. These residues then undergo a cyclization step performed by a LanC enzyme, forming the lanthionine rings. The cleavage and transport of the peptide are achieved by the LanP and LanT enzymes, respectively. In class II, by contrast, a single enzyme, LanM, is responsible for both the dehydration and cyclization steps, and a single enzyme, LanT, performs both cleavage and transport. Pedobacter sp. NL19 is a Gram-negative bacterium isolated from sludge of an abandoned uranium mine in Viseu (Portugal). In vitro antibacterial activity was detected against several Gram-positive and Gram-negative bacteria. Sequencing and in silico analysis of the NL19 genome revealed the presence of 21 biosynthetic clusters for secondary metabolites, including nonribosomal and ribosomal peptide biosynthetic clusters. Four lanthipeptide clusters were predicted, comprising the precursor peptides, the modifying enzymes (LanB and LanC), and a bifunctional LanT. This revealed the hybrid nature of the clusters, which combine characteristics of two distinct classes, a situation poorly described in the literature. Phylogenetic analysis of their enzymes showed that they cluster within the Bacteroidetes clade. Furthermore, hybrid gene clusters were also found in other species of this phylum, revealing that this is a common characteristic of the group. Finally, analysis of NL19 colonies by MALDI-TOF MS allowed the identification of a 3180 Da mass that corresponds to the predicted mass of a lanthipeptide encoded in one of the clusters. However, this result is not fully conclusive, and further experiments are needed to understand the full potential of the compounds encoded in this type of cluster. In conclusion, the NL19 strain has the potential to produce diverse secondary metabolites, including lanthipeptides that have not been functionally characterized so far.
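
As a toy illustration of how predicted clusters might be flagged as hybrid from their annotated lan genes (class I modifying enzymes LanB/LanC together with a class II-style bifunctional LanT), consider the sketch below; the gene lists are hypothetical and this is not the actual annotation pipeline.

```python
# Rough classification of predicted lanthipeptide clusters by their lan gene content.
CLASS_I_ENZYMES = {"lanB", "lanC"}

def classify_cluster(genes):
    """Return a rough class label from the lan genes annotated in a cluster."""
    present = set(genes)
    has_class_i = CLASS_I_ENZYMES <= present
    has_class_ii = "lanM" in present
    if has_class_i and "lanT" in present and not has_class_ii:
        return "hybrid-like (class I enzymes + bifunctional LanT)"
    if has_class_i:
        return "class I"
    if has_class_ii:
        return "class II"
    return "unclassified"

# Hypothetical gene content of two predicted clusters.
clusters = {
    "cluster_1": ["lanA", "lanB", "lanC", "lanT"],
    "cluster_2": ["lanA", "lanM", "lanT"],
}
for name, genes in clusters.items():
    print(name, "->", classify_cluster(genes))
```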

Relevance: 30.00%

Abstract:

There are many different designs for audio amplifiers. Class-D, or switching, amplifiers generate their output signal in the form of a high-frequency square wave of variable duty cycle (ratio of on time to off time). The square-wave nature of the output allows a particularly efficient output stage, with minimal losses. The output is ultimately filtered to remove components of the spectrum above the audio range. Mathematical models are derived here for a variety of related class-D amplifier designs that use negative feedback. These models use an asymptotic expansion in powers of a small parameter related to the ratio of typical audio frequencies to the switching frequency to develop a power series for the output component in the audio spectrum. These models confirm that there is a form of distortion intrinsic to such amplifier designs. The models also explain why two approaches used commercially succeed in largely eliminating this distortion; a new means of overcoming the intrinsic distortion is revealed by the analysis. Copyright (2006) Society for Industrial and Applied Mathematics
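
The sketch below illustrates the underlying class-D principle in open loop: a sine-wave input compared against a high-frequency triangular carrier yields a variable-duty-cycle square wave, and low-pass filtering recovers the audio-band component. It deliberately omits the negative feedback loop that the paper's asymptotic models analyse; all parameter values are illustrative.

```python
import numpy as np
from scipy.signal import butter, filtfilt, sawtooth

fs = 5_000_000          # simulation sample rate, Hz
f_audio = 1_000         # audio test tone, Hz
f_switch = 200_000      # switching frequency, Hz
t = np.arange(0, 0.005, 1 / fs)

audio = 0.8 * np.sin(2 * np.pi * f_audio * t)             # audio input, |s| < 1
carrier = sawtooth(2 * np.pi * f_switch * t, width=0.5)    # triangular carrier
pwm = np.where(audio > carrier, 1.0, -1.0)                 # square wave, variable duty cycle

# Low-pass filter: cutoff well above the audio band, well below the switching frequency.
b, a = butter(4, 20_000 / (fs / 2))
recovered = filtfilt(b, a, pwm)

print("peak error of recovered audio vs. input:", np.max(np.abs(recovered - audio)))
```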

Relevance: 30.00%

Abstract:

Human operators are unique in their decision-making capability, judgment and nondeterminism. Their sense of judgment, unpredictable decision procedures, and susceptibility to environmental elements can cause them to erroneously execute a given task description when operating a computer system. Usually, a computer system is protected against some erroneous human behaviors by having the necessary safeguard mechanisms in place, but some erroneous human operator behaviors can lead to severe or even fatal consequences, especially in safety-critical systems. A generalized methodology for modeling and analyzing the interactions between computer systems and human operators, in which the operators are allowed to deviate from their prescribed behaviors, provides a formal understanding of a computer system's robustness against possible aberrant behaviors by its human operators. We provide several methodologies for assisting in modeling and analyzing human behaviors exhibited while operating computer systems. Every human operator is usually given a specific recommended set of guidelines for operating a system. We first present a process-algebraic methodology for modeling and verifying recommended human task execution behavior. We then show how one can perform runtime monitoring of a computer system being operated by a human operator to check for violations of temporal safety properties. We consider the concept of a protection envelope, which gives a wider class of behaviors than those strictly prescribed by a human task that can be tolerated by a system, and provide a framework for determining whether a computer system can maintain its guarantees if the human operators operate within their protection envelopes. This framework also helps to determine the robustness of the computer system under weakening of the protection envelopes. In this regard, we present a tool called Tutela that assists in implementing the framework. We then examine the ability of a system to remain safe under broad classes of variations of the prescribed human task. We develop a framework addressing two issues. The first issue is: given a human task specification and a protection envelope, will the protection envelope properties still hold under standard erroneous executions of that task by the human operators? In other words, how robust is the protection envelope? The second issue is: in the absence of a protection envelope, can we approximate a protection envelope encompassing those standard erroneous human behaviors that can be safely endured by the system? We present an extension of Tutela that implements this framework. The two frameworks mentioned above use Concurrent Game Structures (CGS) as models for both computer systems and their human operators. However, this formalism has some shortcomings for our uses. We add incomplete-information concepts to CGSs to achieve better modularity for the players, and introduce nondeterminism both in the transition system and in the strategies of players used to model human operators and computer systems. Nondeterministic action strategies for players in an incomplete-information Nondeterministic CGS (iNCGS) give a more precise formalism for modeling human behaviors exhibited while operating a computer system. We show how to reason about a human behavior satisfying a guarantee by providing a semantics of Alternating-Time Temporal Logic based on iNCGS player strategies.
In a nutshell, this dissertation provides a formal methodology for modeling and analyzing system robustness against both expected and erroneous human operator behaviors.
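
A toy illustration (unrelated to the Tutela implementation) of the kind of question these frameworks ask: given a nondeterministic model of an operator task that includes one erroneous action, can any behaviour drive the system into an unsafe state? The states, actions and transition relation below are invented.

```python
from collections import deque

# States: idle -> armed -> running -> done; "fault" is the unsafe state.
# Transitions map (state, operator action) to a set of possible successor states.
transitions = {
    ("idle", "arm"): {"armed"},
    ("armed", "start"): {"running"},
    ("running", "stop"): {"done"},
    # Erroneous operator action: starting without arming may nondeterministically
    # leave the system idle or push it into the unsafe state.
    ("idle", "start"): {"idle", "fault"},
}

def unsafe_reachable(initial="idle", unsafe="fault"):
    """Breadth-first search over all action choices and nondeterministic outcomes."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for (src, _action), targets in transitions.items():
            if src != state:
                continue
            for nxt in targets:
                if nxt == unsafe:
                    return True
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return False

print("unsafe state reachable under erroneous behaviour:", unsafe_reachable())
```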

Relevance: 30.00%

Abstract:

The folding and targeting of membrane proteins pose a major challenge to the cell, as these proteins must remain insertion-competent while their highly hydrophobic transmembrane (TM) domains are transferred from the ribosome, through the aqueous cytosol and into the lipid bilayer. The biogenesis of a mature membrane protein takes place through its insertion into and integration within the lipid bilayer. A number of TM proteins have been shown to gain some degree of secondary structure within the ribosome tunnel and to retain this conformation throughout maturation. Although studies into the folding and targeting of a number of membrane proteins have been carried out to date, there is little information on one of the largest classes of eukaryotic membrane proteins: the G-protein-coupled receptors (GPCRs). This project studies the early folding events of the human ortholog of GPR35. To analyse the structure of the 1st TM domain, intermediates were generated and assessed by the biochemical method of pegylation (PEG-MAL). A structurally similar microbial opsin (bacterioopsin) was also used to investigate differences in early protein folding between eukaryotic and prokaryotic translation systems. Results showed that neither the 1st TM domain of GPR35 nor bacterioopsin was capable of compacting in the ribosome tunnel before its N-terminus reached the ribosome exit point. The results for this assay remained consistent whether the proteins were translated in a eukaryotic or a prokaryotic translation system. To examine the communication mechanism between the ribosome, the nascent chain and the protein-targeting pathway, crosslinking experiments were carried out using the homobifunctional lysine cross-linker BS3. Specifically, the data generated here show that the nascent chain of GPR35 reaches the ribosomal protein uL23 in an extended conformation and interacts with the SRP protein as it exits the ribosome tunnel, confirming the role of SRP in the co-translational targeting of GPR35. Using these methods, insights into the early folding of GPCRs have been obtained. Further experiments using site-directed mutagenesis to reduce hydrophobicity in the 1st TM domain of GPR35 highlighted the mechanisms by which GPCRs are targeted to the endoplasmic reticulum, confirming that hydrophobicity within the signal-anchor sequence is essential for SRP-dependent targeting. Following the successful interaction of nascent GPR35 and SRP, GPR35 is targeted to ER membranes, represented here by dog pancreas microsomes (DPMs). Glycosylation of the GPR35 N-terminus was used to determine nascent chain structure as it is inserted into the ER membrane; these glycosylation experiments confirm that TM1 has attained its compacted state while residing in the translocon. Finally, a site-specific cross-linking approach using the homobifunctional cysteine cross-linker BMH was used to study the lateral integration of GPR35 into the ER. Cross-linking of GPR35 TM1 and TM2 could be detected adjacent to a protein of ~45 kDa, believed to be Sec61α. The loss of this adduct as the nascent chain extends showed that the lateral movement of GPR35 TM1 from the translocon was dependent on the subsequent synthesis of TM2.

Relevance: 30.00%

Abstract:

Schistosomiasis is still endemic in many regions, with 250 million people infected with Schistosoma and about 500,000 deaths per year. Praziquantel (PZQ) is the drug of choice for schistosomiasis treatment; however, it is classified as Class II in the Biopharmaceutics Classification System, as its low solubility hinders its performance in biological systems. The use of cyclodextrins is a useful tool to increase the solubility and bioavailability of drugs. The aim of this work was to prepare an inclusion compound of PZQ and methyl-beta-cyclodextrin (MeCD), perform its physico-chemical characterization, and explore its in vitro cytotoxicity. SEM showed a change in the morphological characteristics of the PZQ:MeCD crystals, and IR data supported this finding, with changes after interaction with MeCD including effects on the C-H of the aromatic ring observed at 758 cm⁻¹. Differential scanning calorimetry measurements revealed that complexation occurred in a 1:1 molar ratio, as evidenced by the lack of a PZQ transition temperature after inclusion into the MeCD cavity. In solution, the UV spectrum of PZQ in the presence of MeCD was comparable to the PZQ spectrum in a hydrophobic solvent. Phase-solubility diagrams showed a 5.5-fold increase in PZQ solubility and were indicative of a type A_L isotherm, which was used to determine an association constant (K_a) of 140.8 M⁻¹. No cytotoxicity of the PZQ:MeCD inclusion compound was observed in tests using 3T3 cells. The results suggest that the association of PZQ with MeCD could be a good alternative for the treatment of schistosomiasis.
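
For context, with a linear (type A_L) phase-solubility diagram and 1:1 stoichiometry the association constant is commonly computed from the Higuchi-Connors relation, where S_0 is the intrinsic solubility of PZQ and the slope is that of the linear diagram:

\[
K_a = \frac{\text{slope}}{S_0\,(1 - \text{slope})},
\]

presumably the calculation behind the reported value of 140.8 M⁻¹.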

Relevance: 30.00%

Abstract:

We study a climatologically important interaction of two of the main components of the geophysical system by adding an energy balance model for the averaged atmospheric temperature as a dynamic boundary condition to a diagnostic ocean model having an additional spatial dimension. In this work, we give deeper insight than previous papers in the literature, mainly with respect to the pioneering 1990 model by Watts and Morantine. We take into consideration the latent heat of the two-phase ocean as well as a possible delay term. Non-uniqueness for the initial-boundary value problem, uniqueness under a non-degeneracy condition and the existence of multiple stationary solutions are proved here. These multiplicity results suggest that an S-shaped bifurcation diagram should be expected to occur in this class of models, which generalizes previous energy balance models. The numerical method applied to the model is based on a finite volume scheme with nonlinear weighted essentially non-oscillatory (WENO) reconstruction and total-variation-diminishing Runge–Kutta time integration.
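
For reference, the time integrator named here is a TVD (strong-stability-preserving) Runge–Kutta scheme; the classical third-order Shu–Osher form, commonly paired with WENO reconstructions, reads as follows for a semi-discretization \(\mathrm{d}u/\mathrm{d}t = L(u)\) (the paper may use a different order):

\[
\begin{aligned}
u^{(1)} &= u^{n} + \Delta t\,L(u^{n}),\\
u^{(2)} &= \tfrac{3}{4}u^{n} + \tfrac{1}{4}u^{(1)} + \tfrac{1}{4}\Delta t\,L(u^{(1)}),\\
u^{n+1} &= \tfrac{1}{3}u^{n} + \tfrac{2}{3}u^{(2)} + \tfrac{2}{3}\Delta t\,L(u^{(2)}).
\end{aligned}
\]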

Relevance: 30.00%

Abstract:

The literature on preferences for redistribution has paid little attention to the effect of social mobility on the demand for redistribution, in contrast with the literature on class voting, where studies of the effect of social mobility have been very common. Some works have addressed this issue, but no systematic test of the hypotheses connecting social mobility and preferences has been carried out. In this paper we use diagonal reference models to estimate the effect of class of origin and class of destination on preferences for redistribution in a sample of European countries, using data from the European Social Survey. Our findings indicate that social origin matters only to a small extent in explaining preferences, as newcomers tend to adopt the preferences of the destination class. Moreover, we find only limited evidence supporting the acculturation hypothesis and no support for the status maximization hypothesis. Furthermore, the effect of social origin varies considerably between countries. In a second step of the analysis we investigate which national factors explain this variation. The empirical evidence we present leads us to conclude that high rates of upward social mobility sharply reduce the effect of social origin on preferences for redistribution.
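
For readers unfamiliar with diagonal reference models, their schematic form is as follows: the expected preference of a respondent with class of origin i and destination j is a weighted average of the mean preferences of the non-mobile ("diagonal") members of those classes, with covariates entering additively (the exact specification used in the paper may differ):

\[
E(y_{ij}) = w\,\mu_{ii} + (1 - w)\,\mu_{jj} + \sum_{k}\beta_k x_k, \qquad 0 \le w \le 1.
\]

A small estimated origin weight \(w\) corresponds to the finding that newcomers largely adopt the preferences of the destination class.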

Relevance: 30.00%

Abstract:

This study examined the relationship between normal weight, overweight and obesity class I and II+, and the risk of disability, which is defined as impairment in activities of daily living (ADL). Systematic searching of the literature identified eight cross-sectional studies and four longitudinal studies that were comparable for meta-analysis. An additional four cross-sectional studies and one longitudinal study were included for qualitative review. Results from the meta-analysis of cross-sectional studies revealed a graded increase in the risk of ADL limitations from overweight (1.04, 95% confidence interval [CI] 1.00-1.08), class I obesity (1.16, 95% CI 1.11-1.21) and class II+ obesity (1.76, 95% CI 1.28-2.41), relative to normal weight. Meta-analyses of longitudinal studies revealed a similar graded relationship; however, the magnitude of this relationship was slightly greater for all body mass index categories. Qualitative analysis of studies that met the inclusion criteria but were not compatible for meta-analysis supported the pooled results. No studies identified met all of the pre-defined quality criteria, and subgroup analysis was inhibited due to insufficient comparable studies. We conclude that increasing body weight increases the risk of disability in a graded manner, but also emphasize the need for additional studies using contemporary longitudinal cohorts with large numbers of obese class III individuals, a range of ages and with measured height and weight, and incident ADL questions.
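
For reference, pooled estimates and confidence intervals of the kind reported above are conventionally obtained by inverse-variance weighting of the study-level log risk estimates (shown here in fixed-effect form; a random-effects variant adds a between-study variance component):

\[
\hat\theta = \frac{\sum_i w_i\,\theta_i}{\sum_i w_i}, \qquad
w_i = \frac{1}{\mathrm{SE}(\theta_i)^2}, \qquad
\mathrm{SE}(\hat\theta) = \frac{1}{\sqrt{\sum_i w_i}},
\]

where \(\theta_i\) is the log of study i's risk estimate; the 95% CI is then \(\exp\bigl(\hat\theta \pm 1.96\,\mathrm{SE}(\hat\theta)\bigr)\).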