974 results for Convexity in Graphs


Relevance: 30.00%

Abstract:

OBJECTIVE: To evaluate implant accuracy and cosmetic outcome of a new intraoperative patient-specific cranioplasty method after convexity meningioma resection. METHODS: The patient's own bone flap served as a template to mold a negative form with the use of polymethyl methacrylate (PMMA). The area of bone invasion was determined and broadly excised under white light illumination with a safety margin of at least 1 cm. The definitive replica was cast within the remaining bone flap frame and the imprint. Clinical and radiologic follow-up examinations were performed 3 months after surgery. RESULTS: Four women and two men (mean age 51.4 ± 12.8 years) underwent reconstruction of bone flap defects after meningioma resection. Mean duration of intraoperative reconstruction of the partial bone flap defects was 19 ± 4 minutes (range 14-24 minutes). Implant sizes ranged from 17 to 35 cm² (mean 22 ± 8 cm²). Radiologic and clinical follow-up examinations revealed excellent implant alignment and favorable cosmesis (visual analogue scale for cosmesis [VASC] = 97 ± 5) in all patients. CONCLUSIONS: Patient-specific reconstruction of partial bone flap defects after convexity meningioma resection using the presented intraoperative PMMA cast method resulted in excellent bony alignment and a favorable cosmetic outcome. Relatively low costs and minimized operation time for adjustment and insertion of the cranioplasty implant justify use of this method in small bony defects as well.

Relevance: 30.00%

Abstract:

A robust, inexpensive, and fully validated CE method for the simultaneous determination of the enantiomers of propafenone (PPF), 5-hydroxy-propafenone (5OH-PPF), and N-despropyl-propafenone (NOR-PPF) in serum and in in vitro media is described. It is based upon liquid-liquid extraction at alkaline pH followed by analysis of the reconstituted extract by CE in the presence of a pH 2.0 running buffer composed of 100 mM sodium phosphate, 19% methanol, and 0.6% highly sulfated beta-CD. For each compound, the S-enantiomers are shown to migrate ahead of their antipodes, and the overall run time is about 30 min. Enantiomer levels between 25 and 1000 ng/mL provide linear calibration graphs, and the LOD for all enantiomers is between 10 and 12 ng/mL. The assay is shown to be suitable for the determination of the enantiomers of PPF and its metabolites in in vitro incubations comprising human liver microsomes or single CYP450 enzymes (SUPERSOMES). Incubations with CYP2D6 SUPERSOMES revealed, for the first time, the simultaneous formation of the enantiomers of 5OH-PPF and NOR-PPF with that enzyme. The CE data can be used to evaluate the enzymatic N-dealkylation and hydroxylation rates.
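
The quantification described above rests on a linear calibration over 25-1000 ng/mL. The sketch below shows, in generic Python, how such a calibration line can be fit and used to back-calculate an unknown concentration; the peak-area values and the quantify helper are invented for illustration and are not data from the study.

```python
# Minimal sketch of a linear calibration fit like the one reported above.
# The peak-area values are hypothetical placeholders, not measurements from the study.
import numpy as np

conc_ng_ml = np.array([25, 50, 100, 250, 500, 1000], dtype=float)   # calibrator levels
peak_area  = np.array([0.9, 1.8, 3.7, 9.1, 18.4, 36.8])             # assumed detector response

slope, intercept = np.polyfit(conc_ng_ml, peak_area, 1)             # least-squares calibration line

def quantify(area: float) -> float:
    """Back-calculate an unknown concentration (ng/mL) from its peak area."""
    return (area - intercept) / slope

print(f"calibration: area = {slope:.4f} * c + {intercept:.4f}")
print(f"unknown with area 7.5 -> {quantify(7.5):.0f} ng/mL")
```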

Relevance: 30.00%

Abstract:

A robust CE method for the simultaneous determination of the enantiomers of ketamine and norketamine in equine plasma is described. It is based upon liquid-liquid extraction of ketamine and norketamine at alkaline pH from 1 mL plasma followed by analysis of the reconstituted extract by CE in the presence of a pH 2.5 Tris-phosphate buffer containing 10 mg/mL highly sulfated beta-CD as chiral selector. Enantiomer plasma levels between 0.04 and 2.5 µg/mL are shown to provide linear calibration graphs. Intraday and interday precisions, evaluated from peak area ratios (n = 5) at the lowest calibrator concentration, are <8% and <14%, respectively. The LOD for all enantiomers is 0.01 µg/mL. After i.v. bolus administration of 2.2 mg/kg racemic ketamine, the assay is demonstrated to provide reliable data for plasma samples of ponies under isoflurane anesthesia, of ponies premedicated with xylazine, and of one horse that received romifidine, L-methadone, guaifenesin, and isoflurane. In animals not premedicated with xylazine, ketamine N-demethylation is demonstrated to be enantioselective: the concentrations of the two ketamine enantiomers in plasma are equal, whereas S-norketamine is found in a larger amount than R-norketamine. In the group receiving xylazine, the data do not reveal this stereoselectivity.
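
The stereoselectivity finding above amounts to comparing S/R concentration ratios of the parent drug and of the metabolite. The short sketch below illustrates that comparison; the plasma concentrations are purely hypothetical and only show the kind of ratio check involved, not values from the study.

```python
# Sketch of the enantiomer-ratio comparison used to flag enantioselective N-demethylation.
# All concentration values (µg/mL) are illustrative, not study data.
samples = [
    {"t_min": 10, "S_ket": 1.20, "R_ket": 1.18, "S_norket": 0.45, "R_norket": 0.30},
    {"t_min": 30, "S_ket": 0.60, "R_ket": 0.61, "S_norket": 0.52, "R_norket": 0.34},
]

for s in samples:
    ket_ratio = s["S_ket"] / s["R_ket"]          # ~1 -> parent enantiomer levels equal
    nor_ratio = s["S_norket"] / s["R_norket"]    # >1 -> S-norketamine formed preferentially
    print(f"t={s['t_min']:>3} min  S/R ketamine={ket_ratio:.2f}  S/R norketamine={nor_ratio:.2f}")
```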

Relevance: 30.00%

Abstract:

This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents the application of graph theory to low-level processing of digital images, such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, or a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a presentation of a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition, and a new graph distance measure to be used for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method to visualize time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
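
As a concrete illustration of the kind of graph comparison involved in monitoring a time series of graphs, the sketch below computes a very simple distance between two graphs on a shared, labelled vertex set: the size of the symmetric difference of their edge sets. This is a generic textbook measure chosen for illustration, not the specific distance measure proposed in the book.

```python
# Simple graph distance on a shared, labelled vertex set: count the edges that appear in
# exactly one of the two graphs. A large jump between consecutive snapshots of a network
# can then be treated as a candidate abnormal event.
def edge_set(adj: dict) -> set:
    """Collect undirected edges from an adjacency-dict representation."""
    return {frozenset((u, v)) for u, nbrs in adj.items() for v in nbrs}

def graph_distance(g1: dict, g2: dict) -> int:
    return len(edge_set(g1) ^ edge_set(g2))   # symmetric difference of edge sets

# Two snapshots of a small, invented "network".
g_t0 = {"a": {"b", "c"}, "b": {"a"}, "c": {"a"}}
g_t1 = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(graph_distance(g_t0, g_t1))   # -> 3
```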

Relevance: 30.00%

Abstract:

The Modeling method of teaching has demonstrated well-documented success in the improvement of student learning. The teacher/researcher in this study was introduced to Modeling through the use of a technique called White Boarding. Without formal training, the researcher began using the White Boarding technique for a limited number of laboratory experiences with his high school physics classes. The question that arose and was investigated in this study is “What specific aspects of the White Boarding process support student understanding?” For the purposes of this study, the White Boarding process was broken down into three aspects – the Analysis of data through the use of Logger Pro software, the Preparation of White Boards, and the Presentations each group gave about their specific lab data. The lab used in this study, an Acceleration of Gravity Lab, was chosen because of the documented difficulties students experience in the graphing of motion. In the lab, students filmed a given motion, utilized Logger Pro software to analyze the motion, prepared a White Board that described the motion with position-time and velocity-time graphs, and then presented their findings to the rest of the class. The Presentation included a class discussion with minimal contribution from the teacher. The three different aspects of the White Boarding experience – Analysis, Preparation, and Presentation – were compared through the use of student learning logs, video analysis of the Presentations, and follow-up interviews with participants. The information and observations gathered were used to determine the level of understanding of each participant during each phase of the lab. The researcher then looked for improvement in the level of student understanding, the number of “aha” moments students had, and the students’ perceptions about which phase was most important to their learning. The results suggest that while all three phases of the White Boarding experience play a part in the learning process for students, the Presentations provided the most significant changes. The implications for instruction are discussed.
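
For readers unfamiliar with the lab, the analysis the students carried out in Logger Pro amounts to extracting the acceleration from the slope of the velocity-time graph. The sketch below reproduces that fit on synthetic free-fall data; the numbers and variable names are illustrative only and are not drawn from the study.

```python
# Illustrative version of the students' analysis: recover g from the slope of the
# velocity-time graph, using synthetic (ideal) free-fall data.
import numpy as np

g_true = 9.81
t = np.linspace(0, 0.6, 13)            # frame times (s)
y = 0.5 * g_true * t**2                # distance fallen (m)
v = np.gradient(y, t)                  # numerical velocity, as video analysis would provide

a_fit, v0_fit = np.polyfit(t, v, 1)    # slope of the velocity-time graph ~ g
print(f"fitted acceleration: {a_fit:.2f} m/s^2 (expected ~{g_true})")
```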

Relevance: 30.00%

Abstract:

Chapter 1 is used to introduce the basic tools and mechanics used within this thesis. Most of the definitions used in the thesis are given there, and we provide a basic survey of topics in graph theory and design theory pertinent to the topics studied in this thesis. In Chapter 2, we are concerned with the study of fixed block configuration group divisible designs, GDD(n, m, k; λ1, λ2). We study those GDDs in which each block has configuration (s, t), that is, GDDs in which each block has exactly s points from one of the two groups and t points from the other. Chapter 2 begins with an overview of previous results and constructions for small group sizes and block sizes 3, 4, and 5. Chapter 2 is largely devoted to presenting constructions and results about GDDs with two groups and block size 6. We show that the necessary conditions are sufficient for the existence of GDD(n, 2, 6; λ1, λ2) with fixed block configuration (3, 3). For configuration (1, 5), we give minimal or near-minimal index constructions for all group sizes n ≥ 5 except n = 10, 15, 160, or 190. For configuration (2, 4), we provide constructions for several families of GDD(n, 2, 6; λ1, λ2)s. Chapter 3 addresses characterizing (3, r)-regular graphs. We begin by providing previous results on the well-studied class of (2, r)-regular graphs and some results on the structure of large (t, r)-regular graphs. In Chapter 3, we completely characterize all (3, 1)-regular and (3, 2)-regular graphs, as well as sharpen existing bounds on the order of large (3, r)-regular graphs of a certain form for r ≥ 3. Finally, the appendix gives computational data resulting from Sage and C programs used to generate (3, 3)-regular graphs on fewer than 10 vertices.
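
The fixed block configuration property studied in Chapter 2 has a simple operational reading: every block must intersect one group in exactly s points and the other in exactly t points. The sketch below is a minimal checker for that property on a toy point set; the groups and blocks are invented for illustration and do not form an actual GDD from the thesis.

```python
# Check the fixed block configuration (s, t) for a design with two groups:
# each block must meet one group in s points and the other in t points.
def has_configuration(groups, blocks, s, t):
    g1, g2 = map(set, groups)
    for block in blocks:
        a, b = len(set(block) & g1), len(set(block) & g2)
        if {a, b} != {s, t}:
            return False
    return True

# Toy example: two groups of 4 points, blocks of size 6 with configuration (3, 3).
groups = [[1, 2, 3, 4], [5, 6, 7, 8]]
blocks = [[1, 2, 3, 5, 6, 7], [2, 3, 4, 6, 7, 8], [1, 2, 4, 5, 6, 8]]
print(has_configuration(groups, blocks, 3, 3))   # -> True
```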

Relevance: 30.00%

Abstract:

In 1969, Lovász asked whether every connected, vertex-transitive graph has a Hamilton path. This question has generated a considerable amount of interest, yet remains largely open. To date, no connected, vertex-transitive graph is known that does not possess a Hamilton path. For Cayley graphs, a subclass of vertex-transitive graphs, the following conjecture was made. Weak Lovász Conjecture: Every nontrivial, finite, connected Cayley graph is hamiltonian. The Chen-Quimpo Theorem proves that Cayley graphs on abelian groups flourish with Hamilton cycles, thus prompting Alspach to make the following conjecture. Alspach Conjecture: Every 2k-regular, connected Cayley graph on a finite abelian group has a Hamilton decomposition. Alspach’s conjecture is true for k = 1 and 2, but even the case k = 3 is still open. It is this case that this thesis addresses. Chapters 1–3 give introductory material and past work on the conjecture. Chapter 3 investigates the relationship between 6-regular Cayley graphs and associated quotient graphs. A proof of Alspach’s conjecture is given for the odd order case when k = 3. Chapter 4 provides a proof of the conjecture for even order graphs with 3-element connection sets that have an element generating a subgroup of index 2, and having a linear dependency among the other generators. Chapter 5 shows that if Γ = Cay(A, {s1, s2, s3}) is a connected, 6-regular, abelian Cayley graph of even order, and for some 1 ≤ i ≤ 3, Δi = Cay(A/(si), {sj1, sj2}) is 4-regular, and Δi ≄ Cay(ℤ3, {1, 1}), then Γ has a Hamilton decomposition. Alternatively stated, if Γ = Cay(A, S) is a connected, 6-regular, abelian Cayley graph of even order, then Γ has a Hamilton decomposition if S has no involutions, and for some s ∈ S, Cay(A/(s), S) is 4-regular and of order at least 4. Finally, the Appendices give computational data resulting from C and MAGMA programs used to generate Hamilton decompositions of certain non-isomorphic Cayley graphs on low order abelian groups.
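
To make the objects in Chapter 5 concrete, the sketch below builds a Cayley graph Cay(Z_n, S) on a cyclic group and verifies the two hypotheses that recur above, connectedness and 6-regularity. The choice of group and connection set is an arbitrary illustration, not an example taken from the thesis.

```python
# Build Cay(Z_n, S) for the cyclic group Z_n and check that it is connected and 6-regular.
from collections import deque

def cayley_graph(n, gens):
    conn = set(gens) | {(-s) % n for s in gens}            # close the connection set under inverses
    return {v: {(v + s) % n for s in conn} for v in range(n)}

def is_connected(adj):
    seen, queue = {0}, deque([0])
    while queue:
        v = queue.popleft()
        for w in adj[v] - seen:
            seen.add(w)
            queue.append(w)
    return len(seen) == len(adj)

G = cayley_graph(12, [1, 3, 5])                            # Cay(Z_12, {±1, ±3, ±5})
print(is_connected(G), all(len(nb) == 6 for nb in G.values()))   # -> True True
```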

Relevance: 30.00%

Abstract:

Planning in realistic domains typically involves reasoning under uncertainty, operating under time and resource constraints, and finding the optimal subset of goals to work on. Creating optimal plans that consider all of these features is a computationally complex, challenging problem. This dissertation develops an AO*-search-based planner named CPOAO* (Concurrent, Probabilistic, Over-subscription AO*), which incorporates durative actions, time and resource constraints, concurrent execution, over-subscribed goals, and probabilistic actions. To handle concurrent actions, action combinations rather than individual actions are taken as plan steps. Plan optimization is explored by adding two novel aspects to plans. First, parallel steps that serve the same goal are used to increase the plan’s probability of success; traditionally, only parallel steps that serve different goals are used to reduce plan execution time. Second, actions that are executing but are no longer useful can be terminated to save resources and time; conventional planners assume that all actions that were started will be carried out to completion. To reduce the size of the search space, several domain-independent heuristic functions and pruning techniques were developed. The key ideas are to exploit dominance relations for candidate action sets and to develop relaxed planning graphs to estimate the expected rewards of states. This thesis contributes (1) an AO*-based planner to generate parallel plans, (2) domain-independent heuristics to increase planner efficiency, and (3) the ability to execute redundant actions and to terminate useless actions to increase plan efficiency.
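
The idea of taking action combinations, rather than individual actions, as plan steps can be illustrated in a few lines: enumerate subsets of actions and keep only those whose members do not compete for the same resource. The actions, resources, and compatibility rule below are invented for the example; this is only a sketch of the concurrency idea, not the CPOAO* planner itself.

```python
# Enumerate the action combinations that could serve as a single concurrent plan step.
# Actions and their resource requirements are hypothetical.
from itertools import combinations

actions = {"drive": {"wheels"}, "sample": {"arm"}, "transmit": {"antenna"},
           "recharge": {"wheels", "antenna"}}

def compatible(combo):
    # concurrent actions must not compete for the same resource
    used = []
    for a in combo:
        used.extend(actions[a])
    return len(used) == len(set(used))

steps = [c for r in range(1, len(actions) + 1)
           for c in combinations(actions, r) if compatible(c)]
print(steps)   # e.g. ('drive', 'sample', 'transmit') is one admissible concurrent step
```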