866 results for Characteristic Initial Value Problem


Relevance: 30.00%

Abstract:

Highly swellable polymer films doped with Ag nanoparticle aggregates (poly-SERS films) have been used to record very high signal-to-noise ratio, reproducible surface-enhanced (resonance) Raman (SER(R)S) spectra of in situ dried ink lines and their constituent dyes using both 633 and 785 nm excitation. These allowed the chemical origins of differences in the SERRS spectra of different inks to be determined. Initial investigation of pure samples of the 10 most common blue dyes showed that dyes with very similar chemical structures, such as Patent Blue V and Patent Blue VF (which differ only by a single OH group), gave SERRS spectra in which the only indications that the dye structure had been changed were small differences in peak positions or relative intensities of the bands. SERRS studies of 13 gel pen inks were consistent with this observation. In some cases inks from different types of pens could be distinguished even though they were dominated by a single dye, such as Victoria Blue B (Zebra Surari) or Victoria Blue BO (Pilot Acroball), because their predominant dye did not appear in other inks. Conversely, identical spectra were also recorded from different types of pens (Pilot G7, Zebra Z-grip) because they all had the same dominant Brilliant Blue G dye. Finally, some of the inks contained mixtures of dyes which could be separated by TLC and removed from the plate before being analysed with the same poly-SERS films. For example, the Pentel EnerGel ink pen was found to give TLC spots corresponding to Erioglaucine and Brilliant Blue G. Overall, this study has shown that the spectral differences between inks that are based on chemically similar, but nonetheless distinct, dyes are extremely small, so very close matches between SERRS spectra are required for confident identification. Poly-SERS substrates can routinely provide the very stringent reproducibility and sensitivity levels required. This, coupled with an awareness of the reasons underlying the observed differences between similarly coloured inks, allows a more confident assessment of the evidential value of ink SERS and should underpin the adoption of this approach as a routine method for the forensic examination of inks.
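Since confident identification hinges on very close spectral matches, a simple correlation-based comparison conveys the idea; the sketch below uses synthetic spectra and an assumed Pearson-correlation score, not the study's actual matching procedure:

```python
import numpy as np

# Illustrative sketch: score the similarity of two baseline-corrected SERRS
# spectra sampled on a common wavenumber axis. Synthetic data stand in for
# real spectra; the preprocessing and any match threshold are assumptions.

rng = np.random.default_rng(1)
wavenumbers = np.linspace(400, 1800, 1400)
reference = np.exp(-((wavenumbers - 1180) / 8) ** 2)       # mock dye band
questioned = reference + rng.normal(scale=0.01, size=wavenumbers.size)

score = np.corrcoef(reference, questioned)[0, 1]           # Pearson correlation
print(f"match score: {score:.4f}")                         # close to 1 => match
```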

Relevance: 30.00%

Abstract:

Sarcoma metastatic to the brain is uncommon and rarely occurs as the initial manifestation of tumor. Alveolar soft part sarcoma (ASPS) is a rare but well-studied subtype of sarcoma. A 39-year-old man presented with seizures due to a left temporal meningeal-enhancing lesion with striking brain edema on MRI. The patient underwent neurosurgical resection for suspected meningioma. Histology showed large tumor cells clustering and forming small nests, in places with a pseudoalveolar pattern. Diastase-resistant periodic acid-Schiff staining revealed very rare granular and rod-like cytoplasmic inclusions. Immunohistochemistry showed convincing positivity only with vimentin and smooth muscle actin. The histological features were strongly suggestive of ASPS. At the molecular level, RT-PCR and sequencing analysis demonstrated an ASPSCR1-TFE3 fusion, confirming the histological diagnosis of ASPS. There was no evidence of a primary extracranial tumor by physical examination or on chest and abdominal CT scan 11 months after presentation. ASPS typically arises from the soft tissues of the extremities and develops multiple metastatic deposits, usually with a long clinical course. This case may represent primary meningeal ASPS, although a metastatic deposit from an undiscovered primary site cannot be entirely excluded.

Relevance: 30.00%

Abstract:

Traditional heuristic approaches to the Examination Timetabling Problem normally utilize a stochastic method during optimization for the selection of the next examination to be considered for timetabling within the neighbourhood search process. This paper presents a technique whereby the stochastic method is augmented with information from a weighted list gathered during the initial adaptive construction phase, with the purpose of intelligently directing examination selection. In addition, a reinforcement learning technique has been adapted to identify the most effective portions of the weighted list in terms of facilitating the greatest potential for overall solution improvement. The technique is tested against the 2007 International Timetabling Competition datasets, with solutions generated within the time frame specified by the competition organizers. The results generated are better than those of the competition winner on seven of the twelve instances, while being competitive on the remaining five. This paper also shows experimentally how using reinforcement learning has improved upon our previous technique.
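As a rough illustration of the idea, the sketch below combines roulette-wheel selection over a weighted list with a reinforcement-style weight update; the function names and the multiplicative update rule are assumptions for illustration, not the paper's exact mechanism:

```python
import random

def select_exam(exams, weights):
    """Roulette-wheel selection: exams with larger weights (gathered during
    the adaptive construction phase) are more likely to be chosen."""
    total = sum(weights[e] for e in exams)
    r = random.uniform(0, total)
    acc = 0.0
    for e in exams:
        acc += weights[e]
        if acc >= r:
            return e
    return exams[-1]

def reinforce(weights, exam, improved, reward=1.1, penalty=0.95):
    """Reinforcement-style update: strengthen weights of selections that led
    to an improved timetable, weaken those that did not."""
    weights[exam] *= reward if improved else penalty

exams = ["MATH101", "PHYS102", "CHEM103"]          # hypothetical instance
weights = {e: 1.0 for e in exams}
chosen = select_exam(exams, weights)
reinforce(weights, chosen, improved=True)
```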

Relevance: 30.00%

Abstract:

When transporting timber from the forest to the mills, many unforeseen events can occur that disrupt the planned trips (for example, because of weather conditions, forest fires, the arrival of new loads, etc.). When such events only become known during a trip, the truck making that trip must be diverted onto an alternative route. Without information about such a route, the driver is likely to choose an alternative that is unnecessarily long or, worse, one that is itself closed because of an unforeseen event. It is therefore essential to provide drivers with real-time information, in particular suggestions of alternative routes when a planned road turns out to be impassable. The recourse options available when unforeseen events occur depend on the characteristics of the supply chain under study, such as the presence of self-loading trucks and the transport management policy. We present three articles dealing with different application contexts, together with models and solution methods suited to each context. In the first article, the truck drivers have the entire weekly plan for the current week. In this context, every effort must be made to minimize the changes made to the initial plan. Although the truck fleet is homogeneous, there is a priority order among drivers: those with the highest priority receive the largest workloads, and minimizing the changes to their plans is also a priority. Since the consequences of unforeseen events on the transport plan are essentially cancellations and/or delays of certain trips, the proposed approach first handles the cancellation or delay of a single trip and is then generalized to handle more complex events. In this approach, we try to reschedule the affected trips within the same week so that a loader is free when the truck arrives at both the forest site and the mill; in this way, the trips of the other trucks are not modified. This approach provides dispatchers with alternative plans within a few seconds. Better solutions could be obtained if the dispatcher were allowed to make more changes to the initial plan. In the second article, we consider a context where only one trip at a time is communicated to the drivers: the dispatcher waits until the driver completes a trip before revealing the next one. This context is more flexible and offers more recourse options when unforeseen events occur. Moreover, the weekly problem can be divided into daily problems, since demand is daily and the mills are open for limited periods during the day. We use a mathematical programming model based on a time-space network to react to disruptions. Although disruptions can affect the initial transport plan in different ways, a key feature of the proposed model is that it remains valid for handling any unforeseen event, whatever its nature: the impact of these events is captured in the time-space network and in the input parameters rather than in the model itself. The model is solved for the current day each time an unforeseen event is revealed.
In the last article, the truck fleet is heterogeneous and includes trucks with on-board loaders. The route structure of these trucks differs from that of regular trucks because they do not need to be synchronized with the loaders. We use a mathematical model in which the columns can be easily and naturally interpreted as truck routes, and we solve it using column generation. We first relax the integrality of the decision variables and consider only a subset of the feasible routes; routes with the potential to improve the current solution are then added to the model iteratively. A time-space network is used both to represent the impacts of unforeseen events and to generate these routes. The solution obtained is usually fractional, and a branch-and-price algorithm is used to find integer solutions. Several disruption scenarios were developed to test the proposed approach on case studies from the Canadian forest industry, and numerical results are presented for all three contexts.
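As a toy illustration of the re-routing problem that motivates all three articles, the sketch below recomputes a shortest path on a small road network after an arc is closed; the thesis' models operate on richer time-space networks, and the graph and node names here are made-up examples:

```python
import networkx as nx

# Hedged sketch: re-routing a truck when an unforeseen event closes a road
# segment. Edge weights are travel times in minutes (illustrative values).

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("forest", "junction_a", 30),
    ("forest", "junction_b", 45),
    ("junction_a", "mill", 40),
    ("junction_b", "mill", 35),
])

planned = nx.shortest_path(G, "forest", "mill", weight="weight")

# An unforeseen event (e.g. a forest fire) closes the arc junction_a -> mill:
G.remove_edge("junction_a", "mill")

# Suggest an alternative route to the driver in real time.
alternative = nx.shortest_path(G, "forest", "mill", weight="weight")
print(planned, "->", alternative)
```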

Relevance: 30.00%

Abstract:

Measures of the impact of Higher Education have often neglected the Chinese student view, despite the importance of these students to the UK and Chinese economies. This research paper details the findings of a quantitative survey that was purposively distributed to Chinese graduates who enrolled at the University of Worcester on the Business Management degree between 2004 and 2011 (n=49). Analysis has been conducted on their skill development throughout their degree, their skill usage in different employment contexts, the value of their degree, and gender differences in skill development and usage. Discrepancies between skill development and usage, between males and females, and with previous research findings are discussed. Future research directions are also specified.

Relevance: 30.00%

Abstract:

The value premium is well established in empirical asset pricing, but to date there is little understanding as to its fundamental drivers. We use a stochastic earnings valuation model to establish a direct link between the volatility of future earnings growth and firm value. We illustrate that risky earnings growth affects growth and value firms differently. We provide empirical evidence that the volatility of future earnings growth is a significant determinant of the value premium. Using data on individual firms and characteristic-sorted test portfolios, we also find that earnings growth volatility is significant in explaining the cross-sectional variation of stock returns. Our findings imply that the value premium is the rational consequence of accounting for risky earnings growth in the firm valuation process.
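As a schematic of the cross-sectional test described above, the sketch below regresses simulated firm returns on a simulated earnings-growth-volatility characteristic; the data and coefficients are placeholders, not the paper's results:

```python
import numpy as np

# Illustrative sketch only: cross-sectional OLS of stock returns on an
# earnings-growth-volatility characteristic, in the spirit of the test
# described above. All numbers are randomly generated placeholders.

rng = np.random.default_rng(0)
n = 500                                   # number of firms
egv = rng.lognormal(sigma=0.5, size=n)    # earnings growth volatility proxy
ret = 0.02 + 0.03 * egv + rng.normal(scale=0.05, size=n)  # simulated returns

X = np.column_stack([np.ones(n), egv])    # intercept + characteristic
beta, *_ = np.linalg.lstsq(X, ret, rcond=None)
print(f"estimated premium on earnings growth volatility: {beta[1]:.4f}")
```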

Relevance: 30.00%

Abstract:

The main objective of this project was to improve the efficiency of sheet-metal and paint repair services at Caetano Auto Colisão through the application of tools associated with the Lean philosophy. Although lean tools and techniques are well explored in production and manufacturing companies, the same is not true of companies in the service sector. Value Stream Mapping is a lean tool that maps the flow of materials and information needed to carry out the activities (both value-adding and non-value-adding) performed by employees, suppliers and distributors, from the receipt of the customer's order to the final delivery of the service. With this tool it is possible to identify the activities that add no value to the process and to propose improvement measures that eliminate or reduce them. Based on this concept, the sheet-metal and paint service process was mapped and the sources of inefficiency identified. From this analysis, improvements were suggested that aim to reach the proposed future state and to make the process more efficient. Two of these improvements were the implementation of 5S in the paint room and the preparation of an A3 report for the washing centre. The project made it possible to study a real problem in a service company and to propose a set of improvements that, in the medium term, are expected to contribute to improving the efficiency of the sheet-metal and paint repair services.

Relevance: 30.00%

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments.

In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime.

In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing size (progressive samples) for exploratory querying, which provides data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics.

Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
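As a small single-machine illustration of the neighborhood-centric view (not NSCALE's distributed implementation), the sketch below extracts multi-hop ego networks and runs a per-subgraph computation using networkx; the graph and the radius are assumptions:

```python
import networkx as nx

# Minimal sketch: iterate over multi-hop neighborhoods (ego networks) and
# apply a subgraph-level analysis to each, rather than programming at the
# level of single vertices.

G = nx.karate_club_graph()

for v in G.nodes:
    ego = nx.ego_graph(G, v, radius=2)       # 2-hop neighborhood of v
    density = nx.density(ego)                # any per-subgraph computation
    print(f"node {v}: {ego.number_of_nodes()} nodes, density {density:.2f}")
```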

Relevance: 30.00%

Abstract:

In recent years, the luxury market has entered a period of very modest growth, which has been dubbed the ‘new normal’, where varying tourist flows, currency fluctuations, and shifting consumer tastes dictate the terms. The modern luxury consumer is a fickle mistress. Especially millennials – people born in the 1980s and 1990s – are the embodiment of this new form of demanding luxury consumer with particular tastes and values. Modern consumers, and specifically millennials, want experiences and free time, and are interested in a brand’s societal position and environmental impact. The purpose of this thesis is to investigate the luxury value perceptions of millennials in higher education in Europe, seeing as many of the most prominent luxury goods companies in the world originate from Europe. Perceived luxury value is herein examined from the individual’s perspective. As values and value perceptions are complex constructs, using qualitative research methods is justifiable. The data for this thesis were gathered by means of a group interview. The interview participants all study hospitality management at a private college, and each represents a different nationality. Cultural theories and research on luxury and luxury values provide the scientific foundation for this thesis, and a multidimensional luxury value model is used as a theoretical tool for sorting and analyzing the data. The results show that millennials in Europe value much more than simply modern and hard luxury. Functional, financial, individual, and social aspects are all present in perceived luxury value, but some more in a negative sense than others. Conspicuous, status-seeking consumption is mostly frowned upon, as is the consumption of luxury goods for the sake of satisfying social requisites and peer pressure. Most of the positive value perceptions are attributed to the functional dimension, as luxury products are seen to come with a promise of high quality and reliability, which justifies any price premiums. Ecological and ethical aspects of luxury are already a contemporary trend, and are perceived as an even more important characteristic of luxury in the future. Most importantly, having time is fundamental. Depending on who is asked, luxury can mean anything, just as much as it can mean nothing.

Relevance: 30.00%

Abstract:

A Bayesian optimization algorithm for the nurse scheduling problem is presented, which involves choosing a suitable scheduling rule from a set for each nurse's assignment. Unlike our previous work that used GAs to implement implicit learning, the learning in the proposed algorithm is explicit, i.e. eventually we will be able to identify and mix building blocks directly. The Bayesian optimization algorithm implements such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance of each variable is generated using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, a new rule string has been obtained. Another set of rule strings is generated in this way, some of which replace previous strings based on fitness selection. If the stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed approach might be suitable for other scheduling problems.
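A drastically simplified sketch of the explicit learning loop is given below; it replaces the Bayesian network with independent per-nurse rule distributions (a univariate special case) and uses a placeholder fitness function, so it illustrates the estimate-and-sample cycle rather than the authors' full algorithm:

```python
import random

# Simplified sketch of explicit learning: estimate rule probabilities from
# promising solutions, then sample new rule strings from those estimates.
# All names and parameters are illustrative assumptions.

NURSES, RULES = 10, 4
POP, ELITE, GENERATIONS = 50, 10, 100

# probability of each rule for each nurse, initially uniform
probs = [[1.0 / RULES] * RULES for _ in range(NURSES)]

def sample_rule_string():
    return [random.choices(range(RULES), weights=p)[0] for p in probs]

def fitness(rule_string):            # placeholder objective; the real one
    return -sum(rule_string)         # evaluates the resulting nurse schedule

for _ in range(GENERATIONS):
    population = [sample_rule_string() for _ in range(POP)]
    promising = sorted(population, key=fitness, reverse=True)[:ELITE]
    # re-estimate probabilities from the current set of promising strings
    for n in range(NURSES):
        for r in range(RULES):
            probs[n][r] = sum(s[n] == r for s in promising) / ELITE
```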

Relevance: 30.00%

Abstract:

Australian forest industries have a long history of export trade in a wide range of products, from woodchips (for paper manufacturing) and sandalwood (essential oils, carving and incense) to high-value musical instruments, flooring and outdoor furniture. For the high-value group, fluctuating environmental conditions brought on by changes in temperature and relative humidity can lead to performance problems due to consequential swelling, shrinkage and/or distortion of the wood elements. A survey determined the types of value-added products exported, including species and dimensions, the packaging used and the export markets. Data loggers were installed with shipments to monitor temperature and relative humidity conditions. These data were converted to timber equilibrium moisture content values to provide an indication of the environment to which the wood elements would be acclimatising. The results of the initial survey indicated that the primary high-value wood export products included guitars, flooring, decking and outdoor furniture. The destination markets were mainly located in the northern hemisphere, particularly the United States of America, China, Hong Kong, Europe (including the United Kingdom), Japan, Korea and the Middle East. Other regions importing Australian-made wooden articles were south-east Asia, New Zealand and South Africa. Different timber species have differing rates of swelling and shrinkage, so the types of timber were also recorded during the survey. Results from this work determined that the major species were ash-type eucalypts from south-eastern Australia (commonly referred to in the market as Tasmanian oak), jarrah from Western Australia, and spotted gum, hoop pine, white cypress, blackbutt, brush box and Sydney blue gum from Queensland and New South Wales. The environmental conditions data indicated that microclimates in shipping containers can fluctuate extensively during shipping. Conditions at the time of manufacturing were usually between 10 and 12% equilibrium moisture content; however, conditions during shipping could range from 5% (very dry) to 20% (very humid). The packaging systems used were reported to be efficient at protecting the wooden articles from damage during transit. The research highlighted the potential risk for wood components to ‘move’ in response to periods of drier or more humid conditions than those at the time of manufacturing, and the importance of engineering a packaging system that can account for the environmental conditions experienced in shipping containers. Examples of potential dimensional changes in wooden components were calculated based on published unit shrinkage data for key species and the climatic data returned from the logging equipment. The information highlighted the importance of good design to account for possible timber movement during shipping. A timber movement calculator was developed to allow designers to input component species, dimensions, site of manufacture and destination, in order to validate their product design.
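A sketch of the kind of calculation such a timber movement calculator performs is shown below; the unit-shrinkage coefficients are illustrative placeholders rather than published species data:

```python
# Hedged sketch: dimensional change ~ dimension x unit shrinkage x change in
# equilibrium moisture content (EMC). Coefficients below are assumed values
# for illustration only, not published species data.

UNIT_SHRINKAGE = {            # % dimensional change per 1% change in EMC
    "Tasmanian oak": 0.36,    # tangential direction; assumed value
    "jarrah": 0.27,           # assumed value
}

def movement_mm(species: str, width_mm: float,
                emc_origin: float, emc_destination: float) -> float:
    """Estimated width change (mm) for a component shipped between climates."""
    coeff = UNIT_SHRINKAGE[species] / 100.0
    return width_mm * coeff * (emc_destination - emc_origin)

# e.g. an 85 mm board made at 11% EMC arriving in a 5% EMC climate:
print(f"{movement_mm('Tasmanian oak', 85, 11, 5):+.2f} mm")
```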

Relevance: 30.00%

Abstract:

This paper reports a lot-sizing and scheduling problem which minimizes inventory and backlog costs on m parallel machines with sequence-dependent set-up times over t periods. Problem solutions are represented as product subsets, ordered and/or unordered, for each machine m at each period t. The optimal lot sizes are determined by applying a linear program. A genetic algorithm searches either over ordered or over unordered subsets (which are implicitly ordered using a fast ATSP-type heuristic) to identify an overall optimal solution. Initial computational results are presented, comparing the speed and solution quality of the ordered and unordered genetic algorithm approaches.
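To illustrate the role of the fast ATSP-type heuristic that implicitly orders an unordered subset, the sketch below applies a nearest-neighbour rule to a made-up sequence-dependent set-up time matrix; the data and function names are assumptions:

```python
# Illustrative sketch: greedily order an unordered product subset so that
# consecutive products have small sequence-dependent set-up times.

setup = {                     # setup[a][b] = set-up time switching from a to b
    "A": {"B": 3, "C": 7},
    "B": {"A": 4, "C": 2},
    "C": {"A": 6, "B": 5},
}

def order_subset(products: set, start: str) -> list:
    """Nearest-neighbour ordering: repeatedly append the product with the
    cheapest set-up from the current last product."""
    sequence, remaining = [start], set(products) - {start}
    while remaining:
        last = sequence[-1]
        sequence.append(min(remaining, key=lambda p: setup[last][p]))
        remaining.remove(sequence[-1])
    return sequence

print(order_subset({"A", "B", "C"}, "A"))  # ['A', 'B', 'C']
```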

Relevance: 30.00%

Abstract:

Synthetic cannabinoid receptor agonists, more commonly known as synthetic cannabinoids (SCs), were originally created to obtain the medicinal value of THC, but they are an emerging social problem. SCs are mostly produced coated on herbal materials or in powder form and marketed under a variety of brand names, e.g. “Spice”, “K2”. Despite many SCs becoming controlled under drug legislation, many remain legal in some countries around the world. In Scotland, SCs are controlled under the Misuse of Drugs Act 1971 and the Psychoactive Substances Act 2016, which only cover a few early SCs. In Saudi Arabia, even fewer are controlled. The picture of the SC problem in Scotland is vague due to insufficient prevalence data, particularly data using biological samples. Whilst there is evidence of increasing use of SCs throughout the world, in Saudi Arabia there is currently no data regarding the use of products containing SCs among Saudi people. Several studies indicate that SCs may cause serious toxicity and impairment to health; it is therefore important to understand the scale of use within society. A simple and sensitive method was developed for the simultaneous analysis of 10 parent SCs (JWH-018, JWH-073, JWH-250, JWH-200, AM-1248, UR-144, A-796260, AB-FUBINACA, 5F-AKB-48 and 5F-PB-22) in whole blood and 8 corresponding metabolites (JWH-018 4-OH pentyl, JWH-073 3-OH butyl, JWH-250 4-OH pentyl, AM-2201 4-OH pentyl, JWH-122 5-OH pentyl, JWH-210 5-OH pentyl, 5F-AKB-48 (N-4 OH pentyl), 5F-PB-22 3-carboxyindole) in urine using LLE and LC-MS/MS. The method was validated according to the standard practices for method validation in forensic toxicology (SWGTOX, May 2013). All analytes gave acceptable precision, linearity and recovery for analysing blood and urine samples. The method was applied to 1,496 biological samples, a mixture of whole blood and urine. Blood and/or urine samples were analysed from 114 patients presenting at Accident and Emergency in Glasgow Royal Infirmary in spring 2014 and June to December 2015. 5F-AKB-48, 5F-PB-22 and MDMB-CHMICA were detected in 9, 7 and 9 cases respectively. 904 urine samples from individuals admitted to/liberated from Scottish prisons over November 2013 were tested for the presence of SCs. 5F-AKB-48 (N-4 OH pentyl) was detected in 10 cases and 5F-PB-22 3-carboxyindole in 3 cases. Blood and urine samples from two post-mortem cases in Scotland with suspected ingestion of SCs were analysed; both cases were confirmed positive for 5F-AKB-48. A total of 463 urine samples were collected from personnel who presented to the Security Forces Hospital in Riyadh for workplace drug testing, as a requirement of their job, during July 2014. The analysis found 2 samples to be positive for 5F-PB-22 3-carboxyindole. A further study in Saudi Arabia using a questionnaire was carried out among 3 subpopulations: medical professionals, members of the public in and around smoking cafes, and known drug users. With regard to general awareness of Spice products, 16%, 11% and 22% of the medical professionals, members of the public in and around smoking cafes, and known drug users, respectively, were aware of the existence of SCs or Spice products. Overall, an average of 4.5% of respondents had a friend who used these Spice products. It is clear from the results obtained in both the blood and urine testing and the surveys that SCs are being used in both Scotland and Saudi Arabia.
The extent of their use is not clear, and the data presented here are an initial look into their prevalence. The blood and urine findings suggest changing trends in SC use, moving away from the JWH and AM SCs to the newer 5F-AKB-48, 5F-PB-22 and MDMB-CHMICA compounds worldwide. In both countries 5F-PB-22 was detected. These findings illustrate how the SC phenomenon is a worldwide problem, and how information from one country about which SCs are seized can help others rather than being specific to that country. The analytes included in the method were selected due to their apparent availability in both countries; however, it is possible that some newer analytes have been used and these would not have been detected. For this reason it is important that methods for testing SCs are updated regularly and evolve with the ever-changing availability of these drugs worldwide. In addition, there is little published literature regarding the concentrations of these drugs found in blood and urine samples, and this work goes some way towards understanding these.

Relevance: 30.00%

Abstract:

This work is concerned with the design and analysis of hp-version discontinuous Galerkin (DG) finite element methods for boundary-value problems involving the biharmonic operator. The first part extends the unified approach of Arnold, Brezzi, Cockburn & Marini (SIAM J. Numer. Anal. 39, 5 (2001/02), 1749-1779), developed for the Poisson problem, to the design of DG methods via an appropriate choice of numerical flux functions for fourth-order problems; as an example we retrieve the interior penalty DG method developed by Süli & Mozolevski (Comput. Methods Appl. Mech. Engrg. 196, 13-16 (2007), 1851-1863). The second part of this work is concerned with a new a priori error analysis of the hp-version interior penalty DG method, when the error is measured in terms of both the energy-norm and the $L^2$-norm, as well as certain linear functionals of the solution, for elemental polynomial degrees $p\ge 2$. Moreover, provided that the solution is piecewise analytic in an open neighbourhood of each element, exponential convergence is proven for the p-version of the DG method. The sharpness of the theoretical developments is illustrated by numerical experiments.
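For orientation, a minimal sketch of a symmetric interior penalty DG bilinear form for the biharmonic problem $\Delta^2 u = f$ is given below; the precise flux choices and penalty scalings are those of the cited papers, and this display is a standard schematic variant rather than the authors' exact formulation:

$$ B(u,v) = \sum_{K\in\mathcal{T}_h}\int_K \Delta u\,\Delta v\,\mathrm{d}x + \sum_{F\in\mathcal{F}_h}\int_F \left( \{\!\{\nabla\Delta u\cdot\mathbf{n}\}\!\}\,[\![v]\!] - \{\!\{\Delta u\}\!\}\,[\![\nabla v\cdot\mathbf{n}]\!] + \text{(symmetrising terms)} \right)\mathrm{d}s + \sum_{F\in\mathcal{F}_h}\int_F \left( \sigma\,[\![u]\!]\,[\![v]\!] + \tau\,[\![\nabla u\cdot\mathbf{n}]\!]\,[\![\nabla v\cdot\mathbf{n}]\!] \right)\mathrm{d}s, $$

where $\{\!\{\cdot\}\!\}$ and $[\![\cdot]\!]$ denote face averages and jumps respectively, and the penalty parameters are typically taken to scale as $\sigma \sim p^6/h^3$ and $\tau \sim p^2/h$ to ensure stability of the hp-version.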