955 results for "time measurement"


Relevance: 30.00%

Publisher:

Abstract:

The Caco-2 cell line has been used as a model to predict the in vitro permeability of the human intestinal barrier. The predictive potential of the assay relies on an appropriate in-house validation of the method. The objective of the present study was to develop a single HPLC-UV method for the identification and quantitation of marker drugs and to determine the suitability of the Caco-2 cell permeability assay. A simple chromatographic method was developed for the simultaneous determination of both passively (propranolol, carbamazepine, acyclovir, and hydrochlorothiazide) and actively transported drugs (vinblastine and verapamil). Separation was achieved on a C18 column with step-gradient elution (acetonitrile and aqueous solution of ammonium acetate, pH 3.0) at a flow rate of 1.0 mL/min and UV detection at 275 nm during the total run time of 35 min. The method was validated and found to be specific, linear, precise, and accurate. This chromatographic system can be readily used on a routine basis and its utilization can be extended to other permeability models. The results obtained in the Caco-2 bi-directional transport experiments confirmed the validity of the assay, given that high and low permeability profiles were identified, and P-glycoprotein functionality was established.
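
For readers who want to see how such bidirectional transport data are typically reduced, the sketch below computes the apparent permeability and the efflux ratio from assumed receiver-side appearance rates. The relation Papp = (dQ/dt)/(A·C0) is the standard one, but the numerical values, the insert area and the function names are illustrative assumptions, not values from this study.

```python
# Hedged sketch: standard reduction of bidirectional Caco-2 data to apparent
# permeability (Papp) and an efflux ratio. All numbers below are illustrative.
def apparent_permeability(dq_dt_ug_per_s: float, area_cm2: float, c0_ug_per_ml: float) -> float:
    """Papp = (dQ/dt) / (A * C0), returned in cm/s (1 mL == 1 cm^3)."""
    return dq_dt_ug_per_s / (area_cm2 * c0_ug_per_ml)

def efflux_ratio(papp_b_to_a: float, papp_a_to_b: float) -> float:
    """Ratios well above ~2 are commonly read as carrier-mediated (e.g. P-gp) efflux."""
    return papp_b_to_a / papp_a_to_b

# illustrative numbers only (e.g. for a P-gp substrate such as verapamil or vinblastine)
p_ab = apparent_permeability(dq_dt_ug_per_s=2.0e-4, area_cm2=1.12, c0_ug_per_ml=10.0)
p_ba = apparent_permeability(dq_dt_ug_per_s=1.4e-3, area_cm2=1.12, c0_ug_per_ml=10.0)
print(f"Papp A->B = {p_ab:.2e} cm/s, efflux ratio = {efflux_ratio(p_ba, p_ab):.1f}")
```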

Relevance: 30.00%

Publisher:

Abstract:

The value that a customer perceives from a supplier's offering affects the customer's decision making and willingness to pay at the time of purchase, as well as overall satisfaction. For a business supplier, it is therefore critical to understand its customers' value perceptions. The objective of this thesis is to understand what customers value in a measurement and monitoring system, by examining their key purchasing criteria and perceived benefits. The theoretical part of the study reviews the relevant literature on organizational buying behavior and customer perceived value. The study employs a qualitative interview research method; the empirical part consisted of 20 in-depth interviews with life science customers in the USA and in Europe. Quality and technical features are the most important purchasing criteria, while product-related benefits appear to be the most important perceived benefits. In marketing the system, the emphasis should be on the regulations with which the system complies, references documenting the supplier's prior experience, the reliability and usability of the system, and total costs. The benefits to emphasize are better control of the customer's process and the proof of the customer's product quality.

Relevance: 30.00%

Publisher:

Abstract:

Time series analysis has gone through several developmental stages before arriving at the current modern approaches. These can be broadly categorized as classical time series analysis and modern time series analysis. In the classical approach, the basic aim is to describe the major behaviour of the series without necessarily dealing with its underlying structure. The modern approaches, in contrast, strive to summarize the behaviour of the series by modelling its underlying structure explicitly; in other words, they study the series structurally. The components that make up each observation, such as the trend, seasonality, regression effects and disturbance terms, are modelled explicitly before everything is combined into a single state space model, which gives a natural interpretation of the series. The aim of this diploma work is to apply the modern approach to time series analysis known as the state space approach, more specifically the dynamic linear model, to trend analysis of Ionosonde measurement data. The data are a time series of the peak height of the F2 layer, denoted hmF2, which is the height of maximum electron density. In addition, the work investigates the connection between solar activity and the peak height of the F2 layer. Based on the results, the peak height of the F2 layer decreased during the observation period and shows a nonlinear positive correlation with solar activity.
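
A minimal sketch of such a dynamic linear model, under assumed column names (hmF2 for the peak height, F107 for a solar-activity index such as the F10.7 flux): a local linear trend plus a regression component, estimated by Kalman filtering and smoothing with statsmodels.

```python
# Hedged sketch of a structural (state space) trend model for hmF2 with a
# solar-activity regressor; column names and data layout are assumptions.
import pandas as pd
import statsmodels.api as sm

def fit_hmf2_trend(df: pd.DataFrame):
    """df: time-indexed frame with columns 'hmF2' and 'F107'."""
    model = sm.tsa.UnobservedComponents(
        df["hmF2"],
        level="local linear trend",  # level and slope evolve as random walks
        exog=df[["F107"]],           # regression component for solar activity
    )
    res = model.fit(disp=False)
    smoothed_trend = res.level.smoothed   # Kalman-smoothed level of hmF2
    print(res.summary())                  # regression coefficient shows the hmF2-F107 link
    return res, smoothed_trend
```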

Relevance: 30.00%

Publisher:

Abstract:

Marketing and finance both face challenges in a constantly changing business environment. Finance is challenged to change its role from cost control to value-adding business partner, while marketing needs to be able to demonstrate its accountability, that is, how it contributes to firm performance. Finance is the key partner for marketing in proving this impact, by helping marketing measure its actions; in doing so, finance can also reinforce its business-partner role. Little research has been conducted on the relationship between marketing and finance departments. The aim of this study is to investigate how the professional differences between marketing and finance and their forms of cooperation affect marketing performance measurement. The literature on the marketing and finance disciplines, their cooperation, the performance implications of their interface, and the roles of marketing performance measurement, performance measurement systems and measures was reviewed. The research was conducted as a qualitative case study among the senior marketing and finance management of a sporting goods company. The data, collected through semi-structured interviews, participant observation and secondary sources, were described and classified, and connections were drawn. The results show that the nature of the marketing and finance disciplines has many effects on their cooperation and on performance measurement. Due to the ambiguous nature of marketing, measuring its performance is still seen as a challenge, although digitalization is helping the measurement. Marketing and finance professionals need different skill sets to perform their roles effectively, and thus cooperation is needed. Marketing performance needs to be measured with both financial and nonfinancial measures; both the marketing and the finance interviewees highlighted the importance of marketing measures over financial measures. Measuring marketing performance comprehensively is seen as a challenge, since marketing and finance cooperation is still shaped by cost control and budget management rather than performance measurement. We recognized three constraints affecting this cooperation and performance measurement: people, time and software. If marketing and finance developed deeper cooperation, they could create a comprehensive performance measurement system that improves organizational performance.

Relevance: 30.00%

Publisher:

Abstract:

The present study investigates the usefulness of a multi-method approach to the measurement of reading motivation and achievement. A sample of 127 elementary and middle-school children aged 10 to 14 responded to measures of motivation, attributions, and achievement both longitudinally and in a challenging reading context. Novel measures of motivation and attributions were constructed, validated, and used to examine the relationship between motivation, attributions, and achievement over a one-year period (Study I). The impact of classroom contexts and instructional practices was also explored through a study of the influence of topic interest and challenge on motivation, attributions, and persistence (Study II), as well as through interviews with children regarding motivation and reading in the classroom (Study III). Creation and validation of the novel measures supported the use of a self-report measure of motivation in situation-specific contexts and confirmed a three-factor structure of attributions for reading performance in both hypothetical and situation-specific contexts. A one-year follow-up of children's motivation and reading achievement showed declines in all components of motivation beginning at ages 10 through 12, and particularly strong decreases in motivation with the transition to middle school. Past perceived competence for reading predicted current achievement after controlling for past achievement, and showed the strongest relationships with reading-related skills in both elementary and middle school. Motivation and attributions were strongly related, and children with higher motivation displayed more adaptive attributions for reading success and failure. In the context of a developmentally inappropriate, challenging reading task, children's motivation for reading, especially their perceived competence, was threatened. However, interest in the story buffered some of the negative impacts of challenge, sustaining children's motivation, adaptive attributions, and reading persistence. Finally, children's responses during interviews outlined several emotions, perceptions, and aspects of reading tasks and contexts that influence reading motivation and achievement. The findings revealed that children with comparable motivation and achievement profiles respond similarly to particular reading situations, such as excessive challenge, but also that motivation is dynamic and individualistic and can change over time and across contexts. Overall, the present study highlights the importance of motivation and adaptive attributions for reading success, and the necessity of integrating multiple methodologies to study the dynamic construct of achievement motivation.

Relevance: 30.00%

Publisher:

Abstract:

Lichenologists and users of lichenometry have long used calipers or photogrammetry to measure the growth of crustose lichens. Now, digital photography and popular computer software provide methodological alternatives. This thesis developed and tested a new methodology for tracking change and growth of the lichen Rhizocarpon geographicum. Adobe Photoshop CS3 Extended software and a photographic time series (1996, 2003, 2006 and 2007) were used to measure thallus diameter, area, prothallus width and areolae area in 115 small R. geographicum thalli (0.53–1049.88 mm²). Measures of 8 diameters per thallus showed that change in diameter was highly variable and is a weak index of growth. Thallus area was a reliable measure of growth (power correlation, R² = 0.89). Rapid, highly irregular growth occurred in small thalli (<30 mm²), and steady, uniform growth occurred in larger thalli (>30 mm²). This new methodology is tedious but can potentially generate accurate and precise measures for even the tiniest of lichens.
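
A minimal sketch of the pixel-counting idea behind the area measure, assuming a calibrated scale (mm per pixel) and a simple intensity threshold; in the thesis the selections were made manually in Adobe Photoshop CS3 Extended, so this is only an automated stand-in with illustrative parameter values.

```python
# Hedged sketch: thallus area from a thresholded grayscale photograph, plus the mean
# annual relative growth between two photographs. Threshold and scale are assumptions.
import numpy as np

def thallus_area_mm2(gray_image: np.ndarray, mm_per_pixel: float, threshold: int = 90) -> float:
    """gray_image: 2-D uint8 array in which the dark thallus lies below `threshold`."""
    thallus_pixels = np.count_nonzero(gray_image < threshold)
    return thallus_pixels * mm_per_pixel ** 2

def relative_growth(area_t0: float, area_t1: float, years: float) -> float:
    """Mean annual relative area growth between two dated photographs of one thallus."""
    return (area_t1 / area_t0) ** (1.0 / years) - 1.0
```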

Relevance: 30.00%

Publisher:

Abstract:

We examine the measurement of individual poverty in an intertemporal context. In contrast to earlier contributions, we assign importance to the persistence in a state of poverty and we characterize a class of individual intertemporal poverty measures reflecting this feature. In addition, we axiomatize an aggregation procedure to obtain intertemporal poverty measures for entire societies and we illustrate our new indices with an application to EU countries.
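
As a purely illustrative example (not the class axiomatized in the paper) of how persistence in poverty can enter an individual intertemporal measure, one may weight each period's normalized poverty gap by the length of the ongoing poverty spell:

\[
P_i \;=\; \frac{1}{T} \sum_{t=1}^{T} s_{it}^{\,\beta}\, g_{it}^{\,\alpha},
\qquad
g_{it} = \max\!\left(0,\; \frac{z - y_{it}}{z}\right),
\]

where \(y_{it}\) is individual \(i\)'s income in period \(t\), \(z\) the poverty line, \(s_{it}\) the number of consecutive poor periods up to and including \(t\), and \(\alpha, \beta \ge 1\) tune the sensitivity to depth and persistence; a societal index can then be obtained by aggregating the individual \(P_i\), for instance as a population mean.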

Relevance: 30.00%

Publisher:

Abstract:

In operation since 2008, the ATLAS experiment is the largest of all the experiments at the LHC. The ATLAS-MPX (MPX) detectors installed in ATLAS are based on the Medipix2 silicon pixel detector, developed by the Medipix collaboration at CERN for real-time imaging. The MPX detectors can be used to measure luminosity. They were installed at sixteen different locations in the experimental and technical areas of ATLAS in 2008. The MPX network successfully collected data independently of the ATLAS data-acquisition chain from 2008 to 2013. Each MPX detector provides measurements of the integrated luminosity of the LHC. This thesis describes the method used to calibrate the absolute luminosity measured with the MPX detectors and the performance of the MPX detectors on the 2012 luminosity data. A luminosity calibration constant was determined. The calibration is based on the van der Meer (vdM) technique, which measures the size of the two overlapping beams in the vertical and horizontal planes at the ATLAS interaction point (IP1). Determining the absolute luminosity requires precise knowledge of the beam intensities and of the number of particle bunches. The three calibration scans were analyzed and the results obtained with the MPX detectors were compared to those of the other ATLAS detectors dedicated specifically to luminosity measurement. The luminosity obtained from the vdM scans was compared to the luminosity of proton-proton collisions before and after the vdM scans. The MPX detector network provides reliable information for determining the luminosity of the ATLAS experiment over a wide range (from 5 × 10^29 cm^-2 s^-1 up to 7 × 10^33 cm^-2 s^-1).
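
For context, the van der Meer method referred to above yields the absolute luminosity from the scan results through the standard relation (the general formula, not anything specific to the MPX analysis)

\[
\mathcal{L} \;=\; \frac{n_b \, f_r \, N_1 N_2}{2\pi \, \Sigma_x \Sigma_y},
\]

where \(n_b\) is the number of colliding bunch pairs, \(f_r\) the LHC revolution frequency, \(N_1\) and \(N_2\) the bunch populations, and \(\Sigma_x\), \(\Sigma_y\) the convolved beam sizes measured in the horizontal and vertical scans; the calibration constant then relates the MPX measurements to this absolute value.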

Relevance: 30.00%

Publisher:

Abstract:

STUDY DESIGN: Concurrent validity between postural indices obtained from digital photographs (two-dimensional [2D]), surface topography imaging (three-dimensional [3D]), and radiographs. OBJECTIVE: To assess the validity of a quantitative clinical postural assessment tool of the trunk based on photographs (2D) as compared with a surface topography system (3D) as well as indices calculated from radiographs. SUMMARY OF BACKGROUND DATA: To monitor progression of scoliosis or change in posture over time in young persons with idiopathic scoliosis (IS), noninvasive and nonionizing methods are recommended. In a clinical setting, posture can be quite easily assessed by calculating key postural indices from photographs. METHODS: Quantitative postural indices of 70 subjects aged 10 to 20 years with IS (Cobb angle, 15°–60°) were measured from photographs and from 3D trunk surface images taken in the standing position. Shoulder, scapula, trunk list, pelvis, scoliosis, and waist angle indices were calculated with specially designed software. Frontal and sagittal Cobb angles and trunk list were also calculated on radiographs. The Pearson correlation coefficient (r) was used to estimate concurrent validity of the 2D clinical postural tool of the trunk with indices extracted from the 3D system and with those obtained from radiographs. RESULTS: The correlation between 2D and 3D indices was good to excellent for shoulder, pelvis, trunk list, and thoracic scoliosis (0.81 < r < 0.97; P < 0.01) but fair to moderate for thoracic kyphosis, lumbar lordosis, and thoracolumbar or lumbar scoliosis (0.30 < r < 0.56; P < 0.05). The correlation between 2D and radiographic spinal indices was fair to good (-0.33 to -0.80 with Cobb angles and 0.76 for trunk list; P < 0.05). CONCLUSION: This tool will facilitate clinical practice by monitoring trunk posture among persons with IS. Further, it may contribute to a reduction in the use of radiographs to monitor scoliosis progression.
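
As a minimal illustration of the concurrent-validity computation, the snippet below correlates a photograph-based (2D) postural index with its surface-topography (3D) counterpart using Pearson's r; the arrays hold made-up angle values, not study data.

```python
# Hedged sketch: Pearson correlation between a 2D (photo-based) index and the
# corresponding 3D (topography-based) index. Values are illustrative only.
import numpy as np
from scipy import stats

shoulder_2d = np.array([3.1, 1.8, 4.6, 2.2, 5.0, 0.9])   # degrees, photo-based
shoulder_3d = np.array([3.4, 1.5, 4.9, 2.0, 5.3, 1.1])   # degrees, topography-based

r, p_value = stats.pearsonr(shoulder_2d, shoulder_3d)
print(f"r = {r:.2f}, p = {p_value:.3f}")   # e.g. r in the 0.81-0.97 band reported for shoulder
```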

Relevance: 30.00%

Publisher:

Abstract:

The photoacoustic technique under heat transmission configuration is used to determine the effect of doping on both the thermal and transport properties of p- and n-type GaAs epitaxial layers grown on GaAs substrate by the molecular beam epitaxial method. Analysis of the data is made on the basis of the theoretical model of Rosencwaig and Gersho. Thermal and transport properties of the epitaxial layers are found by fitting the phase of the experimentally obtained photoacoustic signal with that of the theoretical model. It is observed that both the thermal and transport properties, i.e. thermal diffusivity, diffusion coefficient, surface recombination velocity and nonradiative recombination time, depend on the type of doping in the epitaxial layer. The results clearly show that the photoacoustic technique using heat transmission configuration is an excellent tool to study the thermal and transport properties of epitaxial layers under different doping conditions.
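
As one concrete illustration of the phase-fitting step: in the heat-transmission (rear-side) configuration, a thermally thick, optically opaque layer shows a phase lag that grows approximately as l·sqrt(pi·f/alpha), and fitting that dependence to the measured phase yields the thermal diffusivity. The sketch below uses only this simplified relation; the full Rosencwaig–Gersho phase used in the study additionally carries the transport parameters (carrier diffusion coefficient, surface recombination velocity, recombination time). Thickness, frequencies and phases are synthetic, illustrative values.

```python
# Hedged sketch: extract thermal diffusivity from the photoacoustic phase using the
# thermally-thick approximation  phase(f) = phi0 - l*sqrt(pi*f/alpha).
import numpy as np
from scipy.optimize import curve_fit

L_CM = 0.04  # assumed sample thickness (cm)

def phase_model(f, phi0, b):
    # b = L * sqrt(pi / alpha); fitting b keeps the optimisation well behaved
    return phi0 - b * np.sqrt(f)

freq = np.array([30.0, 60.0, 90.0, 120.0, 150.0])        # chopping frequencies (Hz)
phase = np.array([-0.86, -1.18, -1.42, -1.62, -1.80])    # PA phase (rad), synthetic data

(phi0, b), _ = curve_fit(phase_model, freq, phase)
alpha = np.pi * L_CM**2 / b**2                           # thermal diffusivity, cm^2/s
print(f"alpha ~ {alpha:.3f} cm^2/s")                     # ~0.26 cm^2/s for these numbers
```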

Relevance: 30.00%

Publisher:

Abstract:

Machine tool chatter is an unfavorable phenomenon during metal cutting, which results in heavy vibration of cutting tool. With increase in depth of cut, the cutting regime changes from chatter-free cutting to one with chatter. In this paper, we propose the use of permutation entropy (PE), a conceptually simple and computationally fast measurement to detect the onset of chatter from the time series using sound signal recorded with a unidirectional microphone. PE can efficiently distinguish the regular and complex nature of any signal and extract information about the dynamics of the process by indicating sudden change in its value. Under situations where the data sets are huge and there is no time for preprocessing and fine-tuning, PE can effectively detect dynamical changes of the system. This makes PE an ideal choice for online detection of chatter, which is not possible with other conventional nonlinear methods. In the present study, the variation of PE under two cutting conditions is analyzed. Abrupt variation in the value of PE with increase in depth of cut indicates the onset of chatter vibrations. The results are verified using frequency spectra of the signals and the nonlinear measure, normalized coarse-grained information rate (NCIR).
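
As a concrete illustration of the detection idea, the sketch below implements the Bandt–Pompe permutation entropy and scans a recorded sound signal with a sliding window; an abrupt change in the resulting PE series would mark the onset of chatter. Embedding order, delay and window sizes are illustrative choices, not the study's settings.

```python
# Hedged sketch: permutation entropy (PE) of a sound signal in sliding windows.
import numpy as np
from math import factorial

def permutation_entropy(x, order=5, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # embedded vectors: row t is (x[t], x[t+delay], ..., x[t+(order-1)*delay])
    emb = np.array([x[i * delay: i * delay + n] for i in range(order)]).T
    ranks = np.argsort(emb, axis=1)                          # ordinal pattern of each vector
    codes = (ranks * order ** np.arange(order)).sum(axis=1)  # one integer per pattern
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    return pe / np.log2(factorial(order)) if normalize else pe

def sliding_pe(signal, win=4096, step=1024, **kwargs):
    """PE in consecutive windows; a sudden jump or drop flags a dynamical change."""
    return np.array([permutation_entropy(signal[s:s + win], **kwargs)
                     for s in range(0, len(signal) - win + 1, step)])
```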

Relevance: 30.00%

Publisher:

Abstract:

Measurement is the act, or the result, of a quantitative comparison between a given quantity and a quantity of the same kind chosen as a unit. It is generally agreed that all measurements contain errors. In a measuring system in which a human being takes the measurement with a measuring instrument following a preset process, the measurement error can originate from the instrument, the process, or the human being involved. The first part of the study is devoted to understanding human errors in measurement. For that purpose, selected person-related and work-related factors that could affect measurement errors were identified. Although these factors are well known, the exact extent of the error and the extent to which the different factors affect human errors in measurement are less frequently reported. Human errors in measurement were characterized in an experimental study with multiple subjects, in which the factors were changed one at a time and the measurements made by the subjects were recorded. The pre-experiment survey showed that respondents could not correctly answer questions about the actual extent of human-related measurement errors, confirming the concern that professionals associated with quality lack knowledge about the extent of such errors. In the post-experiment phase of the survey, however, the answers regarding the extent of human-related measurement errors improved significantly, since the answer choices were based on the experimental study. It is hoped that this work will help practitioners of measurement to better understand and manage human-related errors in measurement.

Relevance: 30.00%

Publisher:

Abstract:

The increasing interconnection of information and communication systems leads to a further increase in complexity and thus also to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer protection against intrusion attempts into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyze information from network components and hosts in order to detect unusual behavior and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the vast amount of network data and in the development of an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of normal network behaviour (NNB), and an update model. In OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are further analyzed and converted into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially extended. In this dissertation, different approaches are proposed and discussed: a classification-confidence margin threshold is defined to uncover unknown malicious connections, the stability of the growing topology is increased through novel approaches for initializing the weight vectors and through strengthening of the winner neurons, and a self-adaptive procedure is introduced to keep the model continuously up to date. Furthermore, the main task of the NNB model is to examine the unknown connections detected by the EGHSOM and to check whether they are normal. However, the network traffic data change constantly due to the concept-drift phenomenon, which leads to the generation of non-stationary network data in real time. This phenomenon is better controlled by the update model. The EGHSOM model can effectively detect new anomalies, and the NNB model adapts optimally to changes in the network data. In the experimental evaluation, the framework showed promising results. In the first experiment, the framework was evaluated in offline mode. OptiFilter was evaluated with offline, synthetic, and realistic data. The adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy.
In the second experiment, the framework was installed on a 1 to 10 Gbit network link and evaluated online in real time. OptiFilter successfully converted the enormous amount of network data into structured connection vectors, and the adaptive classifier classified them precisely. A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all other approaches. This can be attributed to the following key points: processing of the collected network data, achieving the best performance (e.g. overall accuracy), detecting unknown connections, and developing a real-time detection model for intrusion attempts.
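
The sketch below illustrates only the classification-confidence margin idea described above, using a plain self-organizing map (the MiniSom library) as a stand-in for the dissertation's EGHSOM: a connection vector whose quantization error exceeds a margin estimated on the training data is flagged as unknown and would be passed on to the normal-network-behaviour (NNB) check. Grid size, iteration count and the percentile threshold are illustrative assumptions.

```python
# Hedged sketch: SOM-based "confidence margin" check on connection vectors.
import numpy as np
from minisom import MiniSom  # plain SOM used only as an illustrative stand-in for EGHSOM

def train_som(connection_vectors: np.ndarray, grid=(10, 10), iters=5000):
    som = MiniSom(grid[0], grid[1], connection_vectors.shape[1],
                  sigma=1.0, learning_rate=0.5, random_seed=42)
    som.random_weights_init(connection_vectors)
    som.train_random(connection_vectors, iters)
    # confidence margin: e.g. the 99th percentile of training quantization errors
    errors = [np.linalg.norm(v - som.get_weights()[som.winner(v)])
              for v in connection_vectors]
    margin = np.percentile(errors, 99)
    return som, margin

def classify(som: MiniSom, margin: float, vector: np.ndarray) -> str:
    bmu = som.winner(vector)
    err = np.linalg.norm(vector - som.get_weights()[bmu])
    return "unknown (send to NNB check)" if err > margin else f"mapped to unit {bmu}"
```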

Relevance: 30.00%

Publisher:

Abstract:

Compositional data, also called multiplicative ipsative data, are common in survey research instruments in areas such as time use, budget expenditure and social networks. Compositional data are usually expressed as proportions of a total, whose sum can only be 1. Owing to their constrained nature, statistical analysis in general, and estimation of measurement quality with a confirmatory factor analysis model for multitrait-multimethod (MTMM) designs in particular, are challenging tasks. Compositional data are highly non-normal, as they range within the 0–1 interval. One component can only increase if some other(s) decrease, which results in spurious negative correlations among components which cannot be accounted for by the MTMM model parameters. In this article we show how researchers can use the correlated uniqueness model for MTMM designs in order to evaluate measurement quality of compositional indicators. We suggest using the additive log ratio transformation of the data, discuss several approaches to deal with zero components and explain how the interpretation of MTMM designs differs from the application to standard unconstrained data. We show an illustration of the method on data of social network composition expressed in percentages of partner, family, friends and other members, in which we conclude that the face-to-face collection mode is generally superior to the telephone mode, although primacy effects are higher in the face-to-face mode. Compositions of strong ties (such as partner) are measured with higher quality than those of weaker ties (such as other network members).
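
A minimal sketch, under assumed column names, of the additive log ratio (alr) transformation the article recommends, together with a simple multiplicative replacement of zero components; the reference component and the replacement value delta are illustrative choices, not the article's settings.

```python
# Hedged sketch: alr transform of compositional shares with multiplicative zero replacement.
import numpy as np
import pandas as pd

def alr_transform(comp: pd.DataFrame, reference: str, delta: float = 0.005) -> pd.DataFrame:
    """comp: rows sum to 1 (e.g. shares of partner, family, friends, other)."""
    x = comp.to_numpy(dtype=float)
    zeros = x == 0
    # multiplicative replacement: set zeros to delta, shrink the remaining components
    x = np.where(zeros, delta, x * (1 - delta * zeros.sum(axis=1, keepdims=True)))
    x = x / x.sum(axis=1, keepdims=True)            # re-close so each row sums to 1
    ref = comp.columns.get_loc(reference)
    others = [c for c in comp.columns if c != reference]
    alr = np.log(x[:, [comp.columns.get_loc(c) for c in others]] / x[:, [ref]])
    return pd.DataFrame(alr, columns=[f"alr_{c}_vs_{reference}" for c in others],
                        index=comp.index)

# usage with illustrative shares
shares = pd.DataFrame({"partner": [0.4, 0.0], "family": [0.3, 0.5],
                       "friends": [0.2, 0.3], "other": [0.1, 0.2]})
print(alr_transform(shares, reference="other"))
```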