Commit f9cc0267 authored by Leo Gao

Use hashed version stability test instead

parent 10d4b64a
d77f3f68aadd6cbf1290c2f6737b2ed5d5c2a60e4c81a65c280f207783caabe1
\ No newline at end of file
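The 64-character hex string committed above is consistent with a SHA-256 digest of the versioned test data. As a minimal sketch of how such a hashed stability test can work (the function names here are illustrative assumptions, not the harness's actual API), the test regenerates the data, hashes it, and compares the digest against the stored value, so any byte-level drift fails the check:

```python
import hashlib


def stability_digest(text: str) -> str:
    # SHA-256 hex digest: a compact, order-sensitive fingerprint of the text.
    return hashlib.sha256(text.encode("utf-8")).hexdigest()


def check_stability(text: str, expected_digest: str) -> bool:
    # Passes only if the regenerated data hashes to the stored digest.
    return stability_digest(text) == expected_digest


# Hypothetical example data (not the actual versioned test file):
sample = "example output"
stored = stability_digest(sample)
assert check_stability(sample, stored)
assert not check_stability(sample + "!", stored)
```

Storing only the digest keeps the repository small while still detecting any change, however minor, in the generated data.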
[["537 U.S. 900\nHAMMONDv.UNITED STATES.\nNo. 02-5192.\nSupreme Court of United States.\nOctober 7, 2002.\n\n1\nCERTIORARI TO THE UNITED STATES COURT OF APPEALS FOR THE FOURTH CIRCUIT.\n\n\n2\nC. A. 4th Cir. Certiorari denied. Reported below: 286 F. 3d 189.\n\n"], [" RECOMMENDED FOR FULL-TEXT PUBLICATION\n Pursuant to Sixth Circuit Rule 206\n File Name: 11a0288p.06\n\n UNITED STATES COURT OF APPEALS\n FOR THE SIXTH CIRCUIT\n _________________\n\n\n X\n -\n REGINALD BROOKS,\n -\n Petitioner-Appellant,\n -\n -\n No. 11-4142\n v.\n ,\n >\n -\n Respondent-Appellee. -\n DAVID BOBBY, Warden,\n -\n N\n Appeal from the United States District Court\n for the Northern District of Ohio at Cleveland.\n No. 01-02461\u2014Donald C. Nugent, District Judge.\n Decided and Filed: November 9, 2011\n Before: SUTTON, McKEAGUE and GRIFFIN, Circuit Judges.\n\n _________________\n\n COUNSEL\nON BRIEF: Michael J. Benza, Chagrin Falls, Ohio, Alan C. Rossman, OFFICE OF\nTHE FEDERAL PUBLIC DEFENDER, Cleveland, Ohio, for Petitioner. Thomas E.\nMadden, Stephen E. Maher, OFFICE OF THE OHIO ATTORNEY GENERAL,\nColumbus, Ohio, for Appellee. William S. Lazarow, Columbus, Ohio, for Amicus\nCuriae.\n _________________\n\n OPINION\n _________________\n\n PER CURIAM. Reginald Brooks murdered his three sons as they lay sleeping\nin their Cleveland home on the morning of March 6, 1982, two days after his wife served\nhim with divorce papers. An Ohio court sentenced Brooks to death for the crimes. After\nunsuccessfully challenging his conviction and sentence on direct appeal and collateral\nreview in state court, Brooks filed a petition for a writ of habeas corpus. See 28 U.S.C.\n\u00a7 2254. The district court denied Brooks\u2019 petition, and we affirmed. Brooks v. Bagley,\n\n\n 1\n\fNo. 11-4142 Brooks v. Bobby Page 2\n\n\n513 F.3d 618, 632 (6th Cir. 2008). 
On March 1, 2011, the State of Ohio scheduled\nBrooks\u2019 execution for November 15, 2011.\n\n On September 23, 2011, Brooks filed a motion in the district court to reopen his\nhabeas proceeding under Rule 60(b)(6) of the Federal Rules of Civil Procedure. Brooks\nalleged (1) that his two habeas attorneys were ineffective because they did not\nadequately investigate and present all possible claims, and (2) that one of his habeas\nattorneys, Kevin Spellacy, was particularly ineffective because he labored under a\nconflict of interest, namely that Spellacy\u2019s father, a state court judge, denied some of\nBrooks\u2019 claims on state collateral review. The district court denied Brooks\u2019 Rule 60(b)\nmotion and an accompanying motion to stay his execution on October 19, 2011. Brooks\nappealed the district court\u2019s order and filed a motion in this court to stay his execution.\n\n We apply a four-factor test in resolving such stay motions: \u201c(1) whether there\nis a likelihood he will succeed on the merits of the appeal; (2) whether there is a\nlikelihood he will suffer irreparable harm absent a stay; (3) whether the stay will cause\nsubstantial harm to others; and (4) whether the [stay] would serve the public interest.\u201d\nBedford v. Bobby, 645 F.3d 372, 375 (6th Cir. 2011). In addition to these four\nconsiderations, we also consider the timeliness of the petitioner\u2019s claims. \u201c[T]here is a\nstrong equitable presumption against the grant of a stay where a claim could have been\nbrought at such a time as to allow consideration of the merits without requiring entry of\na stay.\u201d Nelson v. Campbell, 541 U.S. 637, 650 (2004).\n\n There are five problems with this stay request. First, Brooks waited too long to\nfile his Rule 60(b) motion. 
He has known about the factual underpinnings of his\nconflict-of-interest argument since at least July 24, 2006, when he filed a motion in this\ncourt to remand his first federal habeas petition to the district court based on Mr.\nSpellacy\u2019s conflict of interest (which we denied). And he has known about his habeas\ncounsel\u2019s alleged failure to investigate and present his claims since at least June 14,\n2006, when he filed a motion in this court seeking a certificate of appealability on the\nissue (which we also denied). Brooks could have filed a Rule 60(b) motion in the\ndistrict court raising these claims at any time during the last five years, but instead he\n\fNo. 11-4142 Brooks v. Bobby Page 3\n\n\nwaited until September 23, 2011\u201453 days before his scheduled execution\u2014to do so.\nBrooks offers no justification for this delay. This unexplained, and seemingly\ninexplicable, delay in filing his motion by itself justifies denying the stay. See Nelson,\n541 U.S. at 650; Bedford, 645 F.3d at 375\u201377.\n\n Second, even if these claims were not late, the law-of-the-case doctrine bars\nthem. See United States v. Haynes, 468 F.3d 422, 426 (6th Cir. 2006). Brooks raised\nboth claims in his first federal habeas appeal to this court and in his certiorari petition\nto the United States Supreme Court. See Petition for Certificate of Appealability (June\n14, 2006) at 150\u201365; Motion to Remand (July 24, 2006); Brief on the Merits (July 18,\n2007) at 74\u201393; Petition for Rehearing and En Banc Review (Feb. 19, 2008) at 13\u201315;\nPetition for a Writ of Certiorari (Nov. 6, 2008). We rejected them. 
A litigant may not\nraise arguments during the first federal habeas proceeding, lose those arguments\n(because he could not show prejudice), then raise the same arguments based on the same\nevidence in a Rule 60(b) motion.\n\n Third, and relatedly, even if the claims were not late and even if they were not\npreviously litigated, Brooks cannot overcome the bar on second or successive habeas\npetitions. When a Rule 60(b) motion \u201cseeks to add a new ground for relief,\u201d whether\nakin to or different from the claims raised in the first petition, the courts generally treat\nit as a second or successive petition. Gonzalez v. Crosby, 545 U.S. 524, 532 (2005). As\nan example of just such a motion, Gonzalez cited Harris v. United States, 367 F.3d 74\n(2d Cir. 2004), in which a habeas petitioner sought to reopen his habeas proceeding\nbecause his attorney ineffectively failed to raise a claim. Id. at 80. Brooks wants to do\njust that\u2014to reopen his habeas proceeding so that he can litigate claims that the alleged\nineffectiveness of his attorneys prevented him from fully litigating in the first habeas go-\nround. Not all Rule 60(b) motions in habeas cases, to be sure, amount to successive\npetitions. Those that do not seek to add a new ground for relief but instead raise \u201csome\ndefect in the integrity of the federal habeas proceeding,\u201d such as \u201c[f]raud on the federal\nhabeas court,\u201d are not successive petitions. Gonzalez, 545 U.S. at 532 & n.5. But an\n\u201cattack based on . . . habeas counsel\u2019s omissions\u201d\u2014just what Brooks raises\n\fNo. 11-4142 Brooks v. Bobby Page 4\n\n\nhere\u2014\u201cordinarily does not go to the integrity of the [earlier federal] proceedings.\u201d Id.\nat 532 n.5. Neither of Brooks\u2019 ineffective-assistance claims amounts to fraud on the\ncourt. And the claims, as presented, do not undermine the \u201cintegrity\u201d of the first federal\nhabeas proceeding. 
The first theory\u2014general ineffective assistance of habeas\ncounsel\u2014is a plain-vanilla successive petition designed to do nothing more than attack\nhis earlier counsel\u2019s omissions. See id. at 532; Post v. Bradshaw, 422 F.3d 419, 424\u201325\n(6th Cir. 2005). If the successive-petition bar does not limit this theory, it limits nothing.\nIt is possible that the second theory\u2014that a conflict of interest led to the ineffective\nassistance of one of his habeas counsel\u2014could under sufficiently egregious conditions\nhaunt the integrity of a first federal habeas proceeding. But that is not so here. There\nwere two counsel, not one, and both counsel challenged the relevant state court rulings.\nPerhaps more importantly, the issue came to light during the appeal from the first\nproceeding, making it difficult to say that a second habeas proceeding is needed to\ncorrect the integrity of the first proceeding. Brooks nowhere attempts to argue that he\ncan satisfy the requirements for allowing a successive petition. See 28 U.S.C.\n\u00a7 2244(b)(2). Nor do we see any way he could do so.\n\n Fourth, even if the claims were not late, law-of-the-case barred and successive-\npetition barred, they fail to account for a pertinent congressional directive. \u201cThe\nineffectiveness or incompetence of counsel during Federal or State collateral post-\nconviction proceedings shall not be a ground for relief in a proceeding arising under\nsection 2254.\u201d 28 U.S.C. \u00a7 2254(i). Our court, not surprisingly, has construed this\nlanguage to mean what it says: to bar a Rule 60(b) motion based on the ineffectiveness\nof habeas counsel. Post, 422 F.3d at 423. 
Brooks has not challenged the\nconstitutionality of \u00a7 2254(i), and we see no way to escape the clutches of this statutory\nbar, at least when it comes to his conventional ineffective-assistance claim and possibly\n(we need not decide) with respect to his conflict-of-interest claim.\n\n Fifth, even if we overlooked all of these problems, Brooks has no chance of\nsuccess on the merits with respect to either theory, which alone suffices to reject the\nstay. See Workman v. Bredesen, 486 F.3d 896, 911 (6th Cir. 2007). Brooks claims that\n\fNo. 11-4142 Brooks v. Bobby Page 5\n\n\nhis two habeas attorneys did not diligently investigate and present his claims, but they\nfiled an exhaustive 73-page habeas petition raising 20 claims and a still-more-exhaustive\n136-page reply brief responding to the State\u2019s arguments. True, they did not raise every\nconceivable claim. But \u201cwinnowing out weaker arguments . . . and focusing on those\nmore likely to prevail, far from being evidence of incompetence, is the hallmark of\neffective . . . advocacy.\u201d Smith v. Murray, 477 U.S. 527, 536 (1986). Making matters\nworse, Brooks to this day does not tell us what unraised claim had a meaningful chance\nof prevailing, a deficiency that necessarily establishes a lack of prejudice.\n\n Brooks\u2019 conflict-of-interest claim concerning Kevin Spellacy\u2014itself a species\nof an ineffective-assistance claim, see Cuyler v. Sullivan, 446 U.S. 335, 345 (1980)\n(characterizing a conflict-of-interest claim as one alleging denial of \u201ceffective assistance\nof counsel\u201d)\u2014fares no better. To prevail on this claim, Brooks must demonstrate an\nactual conflict of interest that adversely affected his attorney\u2019s performance. Mickens\nv. Taylor, 535 U.S. 162, 172 n.5 (2002); Cuyler, 446 U.S. at 350. 
Even if we assume\nthat Spellacy had a conflict because his father was one of the many state court judges\nwho rejected Brooks\u2019 collateral claims, Brooks has not shown that the conflict adversely\naffected Spellacy\u2019s performance by \u201cinfluenc[ing] . . . his basic strategic decisions.\u201d\nMickens, 535 U.S. at 172. Brooks submits that family loyalty prevented Spellacy from\nchallenging the state-court decisions (1) rejecting Brooks\u2019 ineffective-assistance-of-\nappellate-counsel claim and (2) denying Brooks\u2019 request for a competency evaluation\nat state expense during his state collateral challenge. But the habeas petition that\nSpellacy (and his conflict-free co-counsel Patrick D\u2019Angelo) filed in the district court\nshowed no such restraint. They filed an ineffective-assistance-of-appellate counsel\nclaim, R. 21 at 39\u201349, which the district court rejected on the merits. R. 30 at 89\u201391.\nAlthough Brooks\u2019 two habeas counsel did not challenge the denial of a state-funded\ncompetency evaluation, they had a good reason for opting not to do so, quite apart from\nfamily loyalty. Any challenge would face a long and steep climb. As the Ohio Supreme\nCourt noted in rejecting the same request, Brooks\u2019 competency at that stage was \u201csimply\nnot relevant\u201d because the Federal and State Constitutions do not require that a defendant\nbe competent to pursue state post-conviction relief. State v. Brooks, 751 N.E.2d 1040,\n\fNo. 11-4142 Brooks v. Bobby Page 6\n\n\n1042 (Ohio 2001). Brooks offers no United States Supreme Court precedent to the\ncontrary or for that matter any federal authority to support the idea that the United States\nConstitution requires a State to pay for a competency evaluation during state collateral\nreview. Because Brooks cannot show that Spellacy\u2019s purported conflict adversely\naffected his performance, he is not entitled to a presumption of prejudice. See Strickland\nv. Washington, 466 U.S. 
668, 692 (1980). Nor has he pointed to any prejudice on his\nown. Indeed, during Brooks\u2019 first appeal we denied his motion to remand his habeas\npetition based on this alleged conflict of interest because he did not show any prejudice\nthen either. See Court Order (Oct. 31, 2006). Nothing has changed.\n\n None of Brooks\u2019 contrary arguments overcomes these problems. Brooks claims\nthat Post was wrong to conclude that 28 U.S.C. \u00a7 2254(i) bars this kind of challenge to\nthe performance of habeas counsel. 422 F.3d at 423. But Post merely applied the\nstatute\u2019s straightforward language, giving it the meaning it unambiguously has. Brooks\u2019\nhabeas petition is \u201ca proceeding arising under section 2254,\u201d and his Rule 60(b) motion\nseeks \u201crelief\u201d based on \u201c[t]he ineffectiveness or incompetence of counsel during Federal\n. . . collateral post-conviction proceedings.\u201d That is just what \u00a7 2254(i) bars.\n\n Brooks adds that we should stay his execution pending the Supreme Court\u2019s\ndecision in Martel v. Clair. But the issue in Clair\u2014the standard that a federal court must\napply when a habeas petitioner seeks substitute counsel under 18 U.S.C. \u00a7 3599\u2014has\nno bearing on Brooks\u2019 Rule 60(b) motion. See Clair v. Ayers, 403 F. App\u2019x 276, 277\u201378\n(9th Cir. 2010); Brief for Petitioner at i, 13, Martel v. Clair, No. 10-1265 (U.S. Sept. 9,\n2011), 2011 WL 369341. The Court\u2019s decision in Clair is unlikely to say anything about\nany of the problems dogging Brooks\u2019 motion, let alone all five of them. (In his reply\nbrief, Brooks also suggests that we should wait on the Supreme Court\u2019s decisions in\nMartinez v. Ryan and Maples v. Thomas, Reply Br. 
at 10\u201311, but those cases deal only\nwith state post-conviction proceedings and likely will say nothing about the duties of\nfederal habeas counsel).\n\n When all is said and done, Brooks seeks an unusual remedy\u2014a second first\nfederal habeas hearing. That he seeks to do so on ineffective-assistance grounds makes\n\fNo. 11-4142 Brooks v. Bobby Page 7\n\n\nthe request all the more rare. The Constitution does not guarantee counsel on collateral\nreview, see Pennsylvania v. Finley, 481 U.S. 551, 555 (1987), and 28 U.S.C. \u00a7 2254(i)\nprohibits habeas claims premised on the ineffective assistance of habeas counsel. Even\nif allegations of a conflict of interest (affecting one of two habeas lawyers) could alter\nor suspend many of these tenets of federal habeas law, a point we need not decide, we\nsee no cognizable path for granting a second first federal habeas proceeding in the\nabsence of a showing that the conflict of interest mattered\u2014that it adversely affected the\nrepresentation provided to the claimant. Indeed, even a presumption of prejudice, the\nsupposed holy grail of Brooks\u2019 claim, must account for what the conflicted and un-\nconflicted lawyer did. 
To this day, Brooks\u2019 new (highly effective) counsel have\nunearthed nothing that offers any hint that a second federal habeas proceeding would\ncome out differently from the first federal habeas proceeding.\n\n For these reasons, we deny Brooks\u2019 motion for a stay of execution.\n\f"], ["\r\n\r\nSuarez v Four Thirty Realty LLC (2019 NY Slip Op 01307)\r\n\r\n\r\n\r\n\n\nSuarez v Four Thirty Realty LLC\n\n\n2019 NY Slip Op 01307\n\n\nDecided on February 21, 2019\n\n\nAppellate Division, First Department\n\n\nPublished by New York State Law Reporting Bureau pursuant to Judiciary Law \u00a7 431.\n\n\nThis opinion is uncorrected and subject to revision before publication in the Official Reports.\n\n\n\r\nDecided on February 21, 2019\r\n\r\nRenwick, J.P., Tom, Singh, Moulton, JJ.\r\n\r\n\n8455 160035/15\r\n\r\n[*1]Mario Suarez, et al., Plaintiffs-Respondents-Appellants,\r\nvFour Thirty Realty LLC, et al., Defendants-Appellants-Respondents.\r\n\n\nHoring, Welikson & Rosen, P.C., Williston Park (Niles C. Welikson of counsel), for appellants-respondents.\nSokolski & Zekaria, P.C., New York (Robert E. Sokolski of counsel), for respondents-appellants.\n\nOrder, Supreme Court, New York County (Nancy M. 
Bannon, J.), entered January 23, 2018, which, to the extent appealed from, denied defendants' motion for summary judgment dismissing the complaint as against Four Thirty Realty LLC, and granted plaintiffs' cross motion for summary judgment dismissing the fourth and fifth affirmative defenses and the sixth except as it applies to the third cause of action, and denied the cross motion for summary judgment declaring that the apartment is rent stabilized and that plaintiff Suarez is a rent stabilized tenant thereof, unanimously modified, on the law, to grant plaintiffs' motion to the extent of declaring that apartment 9H in the building located at 430 East 86th Street in Manhattan is a rent-stabilized unit and that plaintiff Suarez is entitled to a rent-stabilized lease, and otherwise affirmed, without costs.\nDefendants argue that plaintiffs' claims, that the subject apartment was improperly removed from rent stabilization, and for a rent-overcharge and attorneys' fees, are barred by the doctrine of collateral estoppel (see Gersten v 56 7th Ave. LLC, 88 AD3d 189, 201 [1st Dept 2011], appeal withdrawn 18 NY3d 954 [2012]). Defendants are correct that plaintiffs had a full and fair opportunity to participate in the proceedings held 13 years earlier before New York State Division of Housing and Community Renewal (DHCR) that resulted in the deregulation of plaintiffs' apartment pursuant to high income deregulation law. 
However, plaintiffs' present claims raise an issue that was not raised or litigated in the prior DHCR deregulation proceedings, i.e., whether their apartment was subject to re-regulation when they entered into a new market rate lease at a time when the building was still receiving J-51 tax benefits (see Leight v W7879 LLC, 128 AD3d 417 [1st Dept 2015], affd 27 NY3d 929 [2016]; Extell Belnord LLC v Uppman, 113 AD3d 1, 11 [1st Dept 2013]).\nAccordingly, Supreme Court should have granted plaintiffs' motion for summary judgment declaring that apartment 9H in the building located at 430 East 86th Street in Manhattan is a rent-stabilized unit and that plaintiff Suarez is entitled to a rent-stabilized lease (see Roberts v Tishman Speyer Props, L.P. 13 NY3d 270 [2009] [apartments restored to rent stabilization because the owner deregulated the apartments pursuant to the luxury decontrol laws while it was receiving tax benefits under the City's J-51 program; Gersten v 56 7th Ave. LLC., 88 [*2]AD3d at 198 [applying Roberts retroactively]).\nWe have considered the parties' remaining arguments for affirmative relief and find them unavailing.\nTHIS CONSTITUTES THE DECISION AND ORDER\nOF THE SUPREME COURT, APPELLATE DIVISION, FIRST DEPARTMENT.\nENTERED: FEBRUARY 21, 2019\nCLERK\n\n\n"], ["\n728 S.W.2d 497 (1987)\n292 Ark. 116\nHaskell Wayne SNELGROVE, Appellant,\nv.\nSTATE of Arkansas, Appellee.\nNo. CR 86-224.\nSupreme Court of Arkansas.\nMay 4, 1987.\n*498 Pruitt & Hodnett by Roger T. Jeremiah, Fort Smith, for appellant.\nSteve Clark, Atty. Gen. by Clint Miller, Asst. Atty. Gen., Little Rock, for appellee.\nDUDLEY, Justice.\nAppellant, Haskell Wayne Snelgrove, was originally charged with capital murder. The information alleged that he caused the deaths of his mother and his wife in the course of the same criminal episode. As a result of plea bargaining, the capital charge was reduced to two counts of first degree murder. 
He entered pleas of nolo contendere and was sentenced to life imprisonment in each case. In a post-conviction proceeding, he now collaterally attacks both convictions pursuant to A.R.Cr.P. Rule 37. We affirm the trial court's denial of relief.\nAppellant contends the trial court erred in accepting his pleas of nolo contendere because there was no recitation of the allegations which formed the basis for the pleas and because the court did not require him to state personally that there was a factual basis for the pleas.\nA.R.Cr.P. Rule 24.6 provides:\nRULE 24.6. Determining Accuracy of Plea.\n\nThe court shall not enter a judgment upon a plea of guilty or nolo contendere without making such inquiry as will establish that there is a factual basis for the plea.\n(Emphasis added.)\nCompliance with Rule 24 is mandatory, but substantial compliance will suffice. McDaniel v. State, 288 Ark. 629, 708 S.W.2d 613 (1986). Further, if the factual basis is not sufficiently determined during the plea proceedings, it may be established at the Rule 37 post-conviction hearing. Davis v. State, 267 Ark. 507, 592 S.W.2d 118 (1980). Appellant is correct in stating that the following colloquy, which took place at the time of the pleas, did not establish a factual basis for his plea:\nTHE COURT: Is there a factual basis for the pleas, Mr. Marquette?\nMR. MARQUETTE [APPELLANT'S ATTORNEY]: Yes, Sir.\nTHE COURT: Mr. Fields?\nMR. 
FIELDS [PROSECUTING ATTORNEY]: Yes, Sir.\nHowever, the deficiencies in establishing a factual basis were supplied in other responses and at the post-conviction hearing.\n*499 At the post-conviction hearing, appellant admitted that before he entered his pleas of nolo contendere, he attended an evidence suppression hearing and there heard the trial court rule on the admissibility of the following evidence:\n(1) The testimony of a minister to whom appellant had confessed that he was a beast-man who ate raw flesh and drank blood, and that demons and devils had told him to kill his mother and his wife;\n(2) A letter written by his mother which expressed fear of appellant;\n(3) Evidence which showed the victims were stabbed to death;\n(4) A pocket knife, which was found in appellant's possession, had human blood on it, and the size of the blade was consistent with the width and depth of the stab wounds in the victims;\n(5) A medical report which established that semen, which could have been appellant's, was found in his mother; and\n(6) Evidence about prior convictions for rape which involved two women, one of whom was made to lie in a bathtub filled with gasoline, while appellant forced the other to have sexual intercourse by threatening to throw a lighted match into the gasoline.\nThe trial judge reserved ruling on the mother's letter and the prior rape convictions, but refused to suppress the other evidence. Obviously, in allowing the appellant to enter the pleas, the trial court was aware of these facts, as was appellant. 
The attorneys who represented appellant testified at the Rule 37 hearing that they explained the nature of the crimes to appellant, discussed all of the evidence with him, gave him a scenario of how they thought the trial would proceed, and discussed the possible sentences.\nAt the plea hearing the trial court determined that:\n(1) Appellant knew the nature of the charges;\n(2) Appellant knew the possible sentences;\n(3) Appellant understood his nolo contendere pleas;\n(4) Appellant's attorneys had explained the plea statement to him four times;\n(5) Appellant understood he was giving up his right to appeal, to be tried by a jury, to cross-examine witnesses and to testify;\n(6) Appellant understood he was to receive two life sentences, and;\n(7) Appellant was not coerced into making the pleas.\nStandard 14-1.6 of the American Bar Association's Standards for Criminal Justice defines the requirement of a factual basis as follows: \"The requirement of a factual basis refers to the presence of sufficient evidence, adduced at the taking of a guilty plea or plea of nolo contendere, upon which a judge may fairly conclude that a defendant could be convicted if the defendant elected to stand trial.\"\nThe record from the plea hearing and the post-conviction hearing establish that there was a factual basis for the pleas and there was sufficient evidence from which the trial court could conclude that appellant would be found guilty if he elected to proceed to trial.\nNext, appellant argues that he did not personally answer the court's inquiry about whether there was a factual basis for the pleas, but instead his attorney answered and, therefore, this case should be reversed. He cites McDaniel v. State, 288 Ark. 629, 708 S.W.2d 613 (1986), as authority for his argument. McDaniel holds that, in a plea of guilty, the trial court shall ask the accused personally if he committed the act with which he is charged and whether he is pleading guilty because he is guilty. 
We do not consider the issue in this case because it was not raised in the petition for post-conviction relief. Rule 37.2(b) and (e) provide that, in order to be considered, an issue must be raised in the original or amended petition, and an issue is waived if it is not raised in the petition. We have long upheld this provision. Wiser v. State, 256 Ark. 921, 511 S.W.2d 178 (1974). Even though we do not consider the issue as it applies to this appellant, it is a subject of first impression, and one on which we have never given the trial courts any direction. *500 We take this opportunity to notify the trial bench that, from this time forward, the McDaniel rationale will be applicable to pleas of nolo contendere. Contrary to the Federal Rules of Criminal Procedure, our Rules of Criminal Procedure require a factual basis for the nolo contendere plea as well as the plea of guilty. Since the sentencing power of the court is not reduced upon the entry of the nolo plea, it is equally important to make certain that the accused actually be guilty of the offense to which the nolo plea is offered. While the insistence upon a factual basis may make the nolo plea less attractive because a disclosure of the accused's misdeeds will be made public, this is a reasonable price to pay for the assurance that the accused is not innocent of the charge for some reason. Further, it will eliminate difficulty at post-conviction proceedings in determining the accuracy of the plea, and it will be useful to the trial court in determining the sentence. The form of the personal question can be concise. After the prosecutor makes a proffer of the facts which he would prove, the judge can ask the accused: \"Are these the facts which you do not contest?\"\nAppellant next argues that he did not understand the sentence he was to receive and, therefore, his plea was not voluntarily, knowingly, and intelligently entered. The argument is without merit.\nA.R.Cr.P. 
Rule 24.5 provides:\nThe court shall not accept a plea of guilty or nolo contendere without first determining that the plea is voluntary. The court shall determine whether the tendered plea is the result of a plea agreement. If it is, the court shall require that the agreement be stated. The court shall also address the defendant personally and determine whether any force or threats, or any promises apart from a plea agreement, were used to induce the plea.\nThe following colloquy took place at the plea proceedings:\nTHE COURT: Is this plea of nolo contendere based upon a plea agreement, do you know what you are to receive here, today?\nMARQUETTE [APPELLANT'S ATTORNEY]: The judge wants to know if there has been some type of agreement between the prosecuting attorney's office and us?\nAPPELLANT: Yes, Sir.\nTHE COURT: What is that agreement, what are you supposed to receive?\nAPPELLANT: A sentence of a life sentence.\nTHE COURT: Two of them?\nAPPELLANT: One that I know of.\nFIELDS [PROSECUTING ATTORNEY]: Two together, right? 
There will be two as one.\nMARQUETTE [APPELLANT'S ATTORNEY]: There will be two sentences, they will run concurrently or at the same time; they will not be consecutive, one after the other, they will run at the same time.\nTHE COURT: Now, do you understand what you're supposed to receive?\nAPPELLANT: Yes, Sir.\nTHE COURT: And what is it?\nAPPELLANT: Life sentence.\nTHE COURT: How many?\nAPPELLANT: It could go two.\nTHE COURT: Two?\nAPPELLANT: Two?\nTHE COURT: I need it to be a little more definite.\nAPPELLANT: Two life sentences.\nTHE COURT: Do you understand that the Court doesn't have to go along with that agreement if it doesn't want to?\nAPPELLANT: Yes, Sir, I understand.\nTHE COURT: Was any force, threats, promises, coercion of any kind used against you to get you to enter these pleas?\nAPPELLANT: No, Sir, there wasn't.\nAs is readily seen, the appellant apparently did not initially understand the intricacies of how the agreement would result in him serving the equivalent of one life sentence by allowing him to serve two sentences concurrently. It is clear, however, that the trial court took pains to make *501 certain that, before the plea proceedings went further, appellant not only understood the terms of the agreement, but that he also understood that the court was not bound by the agreement. Further, the court did order that the two life sentences run concurrently. Appellant admits that he has not suffered any hardship by serving two life sentences concurrently, as opposed to one. He argues, however, that his initial misunderstanding of the terms was so obvious that the plea became involuntary and should have been rejected. He cites no cases in support of his argument, and we have found none. The requirements of A.R.Cr.P. Rule 24.5 were met.\nAppellant's final contention is that he received ineffective assistance of counsel because he was given inaccurate parole information. 
Appellant testified at the Rule 37 hearing that his two attorneys left him with the impression that he would be paroled in two and one-half to three years. The attorneys, however, testified that they never told appellant that he would be paroled after a certain number of years. One of the attorneys explained: \"We never specifically told him when he will be able to get out on parole. I think we told him that if his sentence were reduced to a number of years, there's a possibility that he would be eligible for parole in approximately eight to nine years.\"\nThe trial court is in the best position to resolve any conflicts in testimony. As we stated in Huff v. State, 289 Ark. 404, 711 S.W.2d 801 (1986):\nHere, the trial court was basically presented with a swearing match: appellant claimed his attorney erroneously advised him that he would be paroled within 4 or 5 years and, based on that, he entered his guilty plea. The attorney claimed [otherwise]. The trial court evidently believed the attorney. Conflicts in testimony are for the trial judge to resolve, and he is not required to believe any witness's testimony, especially the testimony of the accused since he has the most interest in the outcome of the proceedings.... We cannot say his findings are against a preponderance of the evidence.\nAffirmed.\nHICKMAN, J., concurs.\n"], ["162 F.3d 1172\n98 CJ C.A.R. 5661\nNOTICE: Although citation of unpublished opinions remains unfavored, unpublished opinions may now be cited if the opinion has persuasive value on a material issue, and a copy is attached to the citing document or, if cited in oral argument, copies are furnished to the Court and all parties. See General Order of November 29, 1993, suspending 10th Cir. Rule 36.3 until December 31, 1995, or further order.\nDavid L. BURR, Plaintiff-Appellant,v.Bill D. ROBINSON, District Court Judge, Wyandote County,State of Kansas; Carol Bacon, District Court Judge, assignedto the Kansas Court of Appeals; P.J. 
Peirron, Judge of theKansas Court of Appeals; R.J. Lewis, Judge of the KansasCourt of Appeals; Richard Holmes, Chief Justice of theKansas Supreme Court; Kay McFarland, Justice of the KansasSupreme Court; Tyler Lockett, Justice of the Kansas SupremeCourt; Donald Allegrucci, Justice of the Kansas SupremeCourt; Fred Six, Justice of the Kansas Supreme Court; BobAbbott, Justice of the Kansas Supreme Court; Robert Davis,Justice of the Kansas Supreme Court, Defendants-Appellees.\nNo. 98-3028.\nUnited States Court of Appeals, Tenth Circuit.\nOct. 30, 1998.\n\n1\nBALDOCK, EBEL, and MURPHY, JJ.\n\n\n2\nORDER AND JUDGMENT*\n\n\n3\nAfter examining the briefs and appellate record, this panel has determined unanimously that oral argument would not materially assist the determination of this appeal. See Fed. R.App. P. 34(a); 10th Cir.R. 34.1.9. The case is therefore ordered submitted without oral argument.\n\n\n4\nPlaintiff David L. Burr appeals the district court's order dismissing his 42 U.S.C. \u00a7 1983 complaint for lack of subject matter jurisdiction. Following consideration of the parties' briefs and review of the record, we affirm.\n\n\n5\nOn February 7, 1994, Mr. Burr filed a medical malpractice lawsuit in Kansas state court. His case was dismissed as filed outside the applicable statute of limitations. The Kansas Court of Appeals affirmed the dismissal decision, the Kansas Supreme Court denied review, and the United States Supreme Court denied certiorari. Mr. Burr then filed this action in federal court under \u00a7 1983, alleging that all of the state court judges involved in his case violated his constitutional rights of due process and access to the courts. Defendants moved for dismissal based on lack of jurisdiction, or in the alternative, failure to state a cognizable claim. The district court granted defendants' motion based on lack of subject matter jurisdiction and dismissed Mr. 
Burr's complaint.\n\n\n6\nWe review de novo the district court's determination that it lacked subject matter jurisdiction. See Painter v. Shalala, 97 F.3d 1351, 1355 (10th Cir.1996). It is well established that federal district courts generally do not have jurisdiction to review, reverse, or invalidate a final state-court decision. See District of Columbia Ct. of App. v. Feldman, 460 U.S. 462, 482-86, 103 S.Ct. 1303, 75 L.Ed.2d 206 (1983); Rooker v. Fidelity Trust Co., 263 U.S. 413, 415-16, 44 S.Ct. 149, 68 L.Ed. 362 (1923). The Rooker-Feldman doctrine bars \"a party losing in state court ... from seeking what in substance would be appellate review of the state judgment in a United States District Court, based on the losing party's claim that the state judgment itself violates the loser's federal rights.\" Johnson v. De Grandy, 512 U.S. 997, 1005-06, 114 S.Ct. 2647, 129 L.Ed.2d 775 (1994). Under this rule, jurisdiction to review state-court decisions lies exclusively with superior state courts and, ultimately, the United States Supreme Court. See Facio v. Jones, 929 F.2d 541, 543 (10th Cir.1991). The Rooker-Feldman doctrine bars consideration not only of issues actually presented to and decided by a state court, but also bars consideration of constitutional claims that are \" 'inextricably intertwined' with\" issues ruled upon by a state court. See id. (quoting Feldman, 460 U.S. at 483-84 n. 16).\n\n\n7\nHere, Mr. Burr filed a medical malpractice lawsuit which the state court dismissed as filed outside the applicable statute of limitations. Mr. Burr exhausted his appellate processes in the Kansas courts and in the United States Supreme Court, which is vested with exclusive jurisdiction to review a decision of the highest state court. See Facio, 929 F.2d at 543. It now appears that Mr. Burr seeks to have this court reverse the state court's judgment and declare the Supreme Court law applicable in this case unconstitutional. This we decline to do. 
In Facio, we held that \"Feldman not only prohibited direct review of state judgments by lower federal courts, but it also prohibited those federal courts from issuing any declaratory relief that is 'inextricably intertwined' with the state court judgment.\" Id. Here, as in Facio, unless Mr. Burr's state court dismissal is reversed, his interest in the constitutionality of the state's limitations statutes is \"prospective and hypothetical in nature,\" and he lacks standing to assert his constitutional claims. Id. Therefore, Mr. Burr's request for declaratory relief is inextricably intertwined with his state court judgment, and, absent a showing of the probability of future injury, he cannot maintain his \u00a7 1983 action for declaratory relief. See id. at 544. The dismissal order against Mr. Burr is final, and any ruling that Kansas limitations statutes are unconstitutional would not reverse that judgment. See id. at 545.\n\n\n8\nMr. Burr has availed himself of every opportunity to argue his case to the state appellate courts and to the United States Supreme Court. Thus, he has had all the process due him in this matter. See id. (proper avenue for review of state court judgments is state court and certiorari review by the United States Supreme Court); Anderson v. Colorado, 793 F.2d 262, 263 (10th Cir.1986) (where the state appellate process is available, \"a litigant may not seek to reverse or modify the state court judgment by bringing a constitutional claim under 42 U.S.C. \u00a7 1983.\"). Therefore, we conclude that the district court was correct in finding that, under the Rooker-Feldman doctrine, it lacked subject matter jurisdiction to consider Mr. Burr's \u00a7 1983 claims. The judgment of the United States District Court for the District of Kansas is AFFIRMED.\n\n\n\n*\n This order and judgment is not binding precedent, except under the doctrines of law of the case, res judicata, and collateral estoppel. 
The court generally disfavors the citation of orders and judgments; nevertheless, an order and judgment may be cited under the terms and conditions of 10th Cir.R. 36.3\n\n\n"], ["\n621 F.Supp.2d 337 (2009)\nUNITED STATES of America\nv.\nRobert Franklin DOYLE, Jr., Defendant.\nNo. 2:07CR00004.\nUnited States District Court, W.D. Virginia, Big Stone Gap Division.\nJune 2, 2009.\n*339 Jennifer R. Bockhorst, Assistant United States Attorney, Abingdon, VA, and Samuel E. Fishel, IV, Special Assistant United States Attorney, Richmond, VA, for United States.\nJohn E. Jessee, Jessee & Read, P. C., Abingdon, VA, for Defendant.\n\nOPINION AND ORDER\nJAMES P. JONES, Chief Judge.\nIn this criminal case, the defendant, convicted by a jury of possessing child pornography, has filed post-trial motions seeking acquittal and a new trial. For the reasons that follow, I will deny the motions.\n\nI\nThe defendant Robert Doyle was convicted by a jury of knowingly receiving and knowingly possessing child pornography in violation of 18 U.S.C.A. \u00a7\u00a7 2252A(a)(2)(A), (a)(5)(B), and (b)(2) (West Supp.2008) (Counts One and Two), and knowingly transporting child pornography in violation of 18 U.S.C.A. \u00a7\u00a7 2252A(a)(1) and (b)(1) (West Supp.2008) (Counts Three, Four, and Five). At trial, the government contended that the defendant had used a desktop computer located in his bedroom to download images of child pornography from the internet. The defendant argued that since other people had had access to the computer, the government did not prove beyond a reasonable doubt that the defendant was the person who had downloaded the images. 
The defendant also asserted that the government did not meet its burden of proving that the images depicted real children under the age of eighteen.\nIn his post-trial Motion for a Judgment of Acquittal, the defendant raises these arguments once more, contending that the government presented insufficient evidence for the jury to find beyond a reasonable doubt that the defendant was the person who downloaded the images and that the images depicted real children under the age of eighteen. The defendant also argues that the government did not establish that the Western District of Virginia was the proper venue for the three counts of transporting child pornography.\nIn his Motion for New Trial, the defendant argues that the testimony of a deceased witness given previously at a bond hearing should not have been admitted at trial under Federal Rule of Evidence 804(b)(1) and the Sixth Amendment's Confrontation Clause because the defendant did not have a similar motive to cross-examine the witness during the bond hearing. The defendant's motions have been briefed and argued and are ripe for decision.\n\nII\nThe evidence adduced at trial was sufficient for a reasonable jury to convict the defendant beyond a reasonable doubt, and the government presented sufficient evidence that the Western District of Virginia was the proper venue for all counts. Therefore, I will deny the defendant's Motion for a Judgment of Acquittal.\nThe defendant argues that the government submitted insufficient evidence for a jury to convict him of the crimes charged. Specifically, the defendant claims that there was insufficient evidence that the defendant was the person who downloaded the images of child pornography and that the images depicted real children under the age of eighteen. A conviction must be sustained if, viewed in the light most favorable to the government, there is substantial evidence to support it. Glasser v. United States, 315 U.S. *340 60, 80, 62 S.Ct. 457, 86 L.Ed. 
680 (1942), superseded by statute on other grounds as recognized in Bourjaily v. United States, 483 U.S. 171, 177-78, 107 S.Ct. 2775, 97 L.Ed.2d 144 (1987). I must determine \"whether any rational trier of fact could have found the essential elements of the crime charged beyond a reasonable doubt.\" United States v. Capers, 61 F.3d 1100, 1107 (4th Cir.1995) (internal quotation marks and alterations omitted).\nThe government presented sufficient evidence that the defendant was the person who downloaded the images of child pornography. Other individuals did testify to having used the computer on which the images were found. However, the twenty-six offensive images admitted into evidence were all created and accessed exclusively between 6:21 p.m. and 1:53 a.m., and twenty-four of those images were only accessed after 9:18 p.m.[1] The three emails sent from \"bobby\" to [2] or with images of child pornography attached were time marked 9:02 p.m., 9:40 p.m., and 9:41 p.m.[3] The computer was located in the defendant's bedroom, evidence that he was the most likely person to have had access to the computer late at night. A jury verdict may be based in whole or in part on circumstantial evidence, Holland v. United States, 348 U.S. 121, 140, 75 S.Ct. 127, 99 L.Ed. 150 (1954), and this evidence was sufficient for the jury to conclude the defendant was the person accessing and transporting the images.\nThe government introduced the pornographic images as the only evidence that those images depicted real children under the age of eighteen, but such evidence was sufficient. Under Ashcroft v. Free Speech Coalition, 535 U.S. 234, 251-56, 122 S.Ct. 1389, 152 L.Ed.2d 403 (2002), pornographic images of \"virtual\" children are protected free speech. The government therefore had the burden to prove beyond a reasonable doubt that the images in this case depicted real children. 
Although the Fourth Circuit has yet to rule upon the issue, other circuits have concluded that images themselves may be sufficient evidence for a jury to conclude that real children are depicted. United States v. Salcido, 506 F.3d 729, 733-34 (9th Cir. 2007); United States v. Rodriguez-Pacheco, 475 F.3d 434, 437 (1st Cir.2007); United States v. Irving, 452 F.3d 110, 121-22 (2d Cir.2006); United States v. Farrelly, 389 F.3d 649, 654 & n. 4 (6th Cir.2004), abrogated on other grounds by United States v. Williams, 411 F.3d 675, 678 n. 1 (6th Cir.2005); United States v. Slanina, 359 F.3d 356, 357 (5th Cir.2004); United States v. Kimler, 335 F.3d 1132, 1142 (10th Cir.2003); United States v. Deaton, 328 F.3d 454, 455 (8th Cir.2003). The jury is capable of distinguishing for itself whether a child depicted in an image is real or virtual. Salcido, 506 F.3d at 733-34. Based on my review of the images in this case, I find that these images alone were *341 sufficient evidence for the jury to conclude that the images depicted real children under the age of eighteen.[4]\nThe government also presented sufficient evidence that all of the offenses, including the three counts of transporting child pornography, occurred in the Western District of Virginia. Evidence at trial showed that the defendant resided in a house in Rose Hill, Virginia, which is in this district, from August 2003 to January 2004. The defendant's niece testified that she had visited the defendant's residence at least twice per week from August to December 2003, and that there had been a black Dell computer located in the defendant's bedroom during that time period. This testimony was corroborated by Fred Rouse, an investigator for the Lee County Sheriff's Department. He testified that officers had seized a black Dell computer from the defendant's bedroom on January 9, 2004, during the execution of a search warrant at the defendant's residence. 
It was on this computer, identified by Investigator Rouse during his testimony at trial, that the offending images were found. Special Agent Chris Hoy, who forensically examined the black Dell computer, testified that the operating system had been registered to \"Bobby Doyle\" in May 2003.\nI find that when viewed in the light most favorable to the government, there was sufficient evidence from which the jury could conclude that images of child pornography were transported, received, and possessed by the defendant on the black Dell computer in his residence in this judicial district during the time period in question. Thus, the defendant's Motion for a Judgment of Acquittal will be denied.\n\nIII\nI will also deny the defendant's Motion for New Trial because the testimony of the unavailable witness was properly admitted at trial. The defendant argues that a new trial should be granted because the admission of prior testimony of a deceased witness violated Federal Rule of Evidence 804(b)(1) and the defendant's Sixth Amendment right to confront witnesses against him. This court has the discretion to grant a new trial where the interests of justice so require. Fed. R.Crim.P. 33(a); United States v. Mitchell, 602 F.2d 636, 639 (4th Cir.1979).\nThe Sixth Amendment's Confrontation Clause provides that, \"[i]n all criminal prosecutions, the accused shall enjoy the right . . . to be confronted with the witnesses against him.\" U.S. Const. amend. VI. Prior testimony may therefore only be admitted if the declarant is unavailable and the defendant had a prior opportunity for cross-examination. Crawford v. Washington, 541 U.S. 36, 68, 124 S.Ct. 1354, 158 L.Ed.2d 177 (2004). Federal Rule of Evidence 804(b)(1) permits the admission of prior testimony of an unavailable witness only where the \"party against whom the testimony is now offered. . . 
had an opportunity and similar motive to develop the testimony by direct, cross, or redirect examination.\" \"`[S]imilar motive' does not mean `identical motive.'\" United States v. Salerno, 505 U.S. 317, 326, 112 S.Ct. 2503, 120 L.Ed.2d 255 (1992) (Blackmun, J., concurring). \"[T]he similar-motive inquiry . . . is inherently a factual inquiry, depending in part on the similarity of the underlying issues and on the context of the . . . questioning.\" Id.\n*342 Silas Glass testified at the defendant's bond hearing before a magistrate judge of this court on March 9, 2007. The issues at the hearing included whether the defendant posed a danger to the community and whether conditions of release might reasonably assure his appearance at trial. One factor the court was required to consider was the weight of the evidence against the defendant. 18 U.S.C.A. \u00a7 3142(g)(2) (West 2000 & Supp.2005). Doyle's attorney called Glass to discuss the defendant's work history with a road construction and highway maintenance company, Glass Machinery Excavation. Glass testified that the defendant had been hired initially in 1991 as a truck driver and a part-time mechanic. Over the previous three or five years, however, the defendant had been employed primarily to look after Glass and his wife, who both had physical ailments. The defendant drove Glass to pick up parts for the business. He took Glass or his wife grocery shopping every Friday, and occasionally drove Glass to doctors' appointments. He sometimes came by Glass's home in the evening to see if Glass needed anything. The defendant also assisted with whatever needed to be done at the company, such as driving trucks, performing mechanical work, and running errands.\nDuring cross-examination by the government, Glass revealed that although the defendant had generally worked at the store from 8:00 a.m. to 5:00 p.m. five days per week, he would leave during the day to run errands as needed. 
Glass estimated that the defendant had spent ten to fifteen hours per week running personal errands for Glass and his wife.\nUnfortunately, Glass passed away prior to the commencement of the defendant's trial and was therefore unavailable to testify. During the trial, the defendant called Glass's son and daughter-in-law to testify in support of his alibi defense. Responding to some discrepancies between their testimony and Glass's prior testimony, the government cross-examined both witnesses with portions of Glass's testimony regarding the defendant's work history and duties. During its rebuttal case, I also permitted the government to read portions of Glass's testimony to the jury over the defendant's objection.\nThe defendant argues that the testimony of Glass was inadmissible because the defendant did not have a similar motive to examine Glass on direct and redirect during the bond hearing as he would have had during cross-examination at trial. I find that the defendant's motive for direct examination and redirect examination at the bond hearing, although not identical, was substantially similar to what his motivation would have been during cross-examination at trial. Thus, the testimony of Glass was properly admitted.\nThe main issue at the bond hearing was the safety of the community if the defendant were released pending trial. See United States v. Doyle, No. 2:07CR00004, 2007 WL 1097844, at *1 (W.D.Va. Apr. 11, 2007) (upholding the magistrate judge's detention order because no condition of release would ensure the safety of the community). On direct examination, Glass's testimony showed that the defendant was working and was accounted for during most weekdays. The government's cross-examination of Glass revealed that the defendant was, in fact, not accounted for during large portions of the day. The clear implication was that during those hours, the defendant could engage in criminal behavior, such as receiving and transporting child pornography. 
The defendant had a fair opportunity to examine Glass on direct and redirect, if desired. If the defendant could have been accounted for during all hours of the day and night while *343 employed by Glass Machinery, it would have been to the defendant's advantage to bring forth that testimony from Glass during the bond hearing. If Glass's testimony about the defendant's general working hours and duties was false or the result of misrecollection or confusion, the defendant had a full and fair opportunity to examine Glass to bring the matter to the court's attention.\nIf Glass had been available to testify at trial, the defendant presumably would have liked to cross-examine him regarding the number of hours the defendant worked each day in Glass's presence, the amount of time the defendant spent running errands each week, and to what extent the defendant worked in the evenings. However, the defendant had a motive and opportunity to question Glass on these issues during direct and redirect examination at the bond hearing. At the bond hearing, the defendant's daily whereabouts made it more or less probable that he would be a danger to the community during pre-trial release and was probative as to whether he was able to commit the crimes charged and whether he would have the opportunity to commit additional crimes upon release. At trial, the defendant's daily whereabouts made it more or less probable that he had a consistent alibi. Although the underlying issues at the bond hearing and at trial were not identical, they were sufficiently similar so that the defendant had an adequate opportunity to confront and examine the witness.\nFederal courts have admitted prior testimony from preliminary hearings, even though such hearings are intended merely to show probable cause, not proof of guilt beyond a reasonable doubt. Glenn v. Dallman, 635 F.2d 1183, 1187 (6th Cir. 
1980) (\"The fact remains that while petitioner's counsel did not exercise her opportunity to fully cross examine the witness, she still had that opportunity.\"); United States ex rel. Haywood v. Wolff, 658 F.2d 455, 461 (7th Cir. 1981) (noting that the opportunity for cross-examination afforded at the preliminary hearing need not be identical to that required at trial); contra People v. Fry, 92 P.3d 970, 980 (Colo.2004) (concluding that admission of prior testimony elicited during a state court preliminary hearing violated the defendant's Sixth Amendment right to confront witnesses against him).\nCourts have also admitted prior testimony elicited during other types of hearings, such as a bond hearing, State v. Douglas, 310 Or. 438, 800 P.2d 288, 293 (1990) (finding that the defendant had a similar motive to cross-examine the unavailable witnesses during a prior security release hearing), a sanity hearing, McMurrey v. State, 145 Tex.Crim. 439, 168 S.W.2d 858, 861 (1943), a committal hearing, Barnes v. State, 256 Ga. 370, 349 S.E.2d 387, 388 (1986), a suppression hearing, United States v. Poland, 659 F.2d 884, 896 (9th Cir.1981); Williams v. State, 214 Ga.App. 280, 447 S.E.2d 676, 678 (1994), an adult certification hearing, Coffin v. State, 850 S.W.2d 608, 610-11 (Tex.App.1993), an extradition hearing, Prater v. State, 148 Ga. App. 831, 253 S.E.2d 223, 229 (1979), and a grand jury proceeding where the testimony was later sought to be admitted by the defendant, United States v. McFall, 558 F.3d 951, 963 (9th Cir.2009) (noting that Rule 804(b)(1) \"does not require an identical quantum of motivation\").\nThe defendant cites People v. Brown, 374 Ill.App.3d 726, 312 Ill.Dec. 589, 870 N.E.2d 1033 (2007), in support of his contention that a defendant's motive to cross-examine a witness at a bond hearing is not sufficiently similar to the motive during trial. But Brown is clearly distinguishable. 
In that case, the sole issue before the court during the prior hearing was *344 whether the defendant had violated a condition of release. The court inappropriately expanded the scope of the bond violation hearing by permitting the government to question the witness about the underlying offense, and the defendant had no motive to cross-examine the witness regarding those facts since they were not at all relevant to the issue at hand. The Illinois court therefore held that the testimony of the witness from the bond violation hearing was impermissibly admitted at the defendant's subsequent trial. Id. at 1039.\nOther cases cited in Brown are also distinguishable on the facts. See People v. Vera, 153 Mich.App. 411, 395 N.W.2d 339, 341 (1986) (affirming trial court's exclusion of testimony from a preliminary hearing where the government did not have a motive to cross-examine the witness regarding a statement she made while answering questions relevant only to the proper amount of bond); Dickson v. State, 281 Ga.App. 539, 636 S.E.2d 721, 724 (2006) (concluding that the trial court erred in admitting an audiotape of the deceased witness's interview with an investigator, finding that it was not adequate that the defendant had an opportunity to cross-examine that witness during a bond hearing).\n\"[T]he similar-motive inquiry . . . is inherently a factual inquiry, depending in part on the similarity of the underlying issues and on the context of the . . . questioning.\" Salerno, 505 U.S. at 326, 112 S.Ct. 2503. The facts in this case show that the defendant had a similar motive to question Glass about the defendant's working hours at both the bond hearing and the subsequent trial. 
Therefore, Glass's prior testimony was properly admitted under Federal Rule of Evidence 804(b)(1), and its admission did not violate the Sixth Amendment's Confrontation Clause.\nEven if the testimony had been admitted in error, it would not be in the interests of justice to grant the motion for a new trial.[5] First, the jury had already heard about the discrepancies between Glass's prior testimony and the testimony of his son and daughter-in-law at trial through the government's proper cross-examination of those witnesses. The admission of Glass's testimony from the bond hearing was therefore cumulative of other admissible evidence on that issue.\nSecond, any dispute about the defendant's duties at Glass Machinery was minor and could not have substantially swayed the jury's verdict. Glass's son testified that Glass Machinery performed snow removal for the Virginia Department of Transportation, and that the defendant had assisted with snow removal at night. The defendant's work records, which were admitted into evidence, specified the dates on which he performed snow removal and the total number of hours he worked on each date, but did not specify the time of day work was completed. Glass's son indicated that he could have determined the *345 time of day the work was completed by consulting \"[t]he daily field sheet,\" but that he did not bring that information to court and could not say what time the defendant's work was conducted. (Trial Tr. vol. 3, 6, Dec. 10, 2008.) 
Glass's son also testified that the defendant had occasionally assisted Glass with personal errands during evening hours, and that Glass and the defendant had taken several overnight fishing trips together on weekends, but he did not specify which weekends.[6]\nAlthough there were some differences between Glass's testimony at the bond hearing and his son's testimony at trial, the most important fact relevant to the defendant's alibi defense was not in dispute: the defendant occasionally worked during evening hours.[7] The jury heard a portion of Glass's testimony where he affirmatively stated that this was so. (Trial Tr. vol. 1, 9, Dec. 10, 2008) (\"A lot of times he sees I get home all right, he comes home in the evening after I get home to see if I need anything.\").\nOn the other hand, there was no direct evidence that the defendant was working in the evening or was out of town on a fishing trip on the specific dates that the offending images were accessed. Considered in its totality, along with all of the other evidence in the case, Glass's testimony could not have substantially swayed the jury's conclusion that the defendant was available to access and email the images of child pornography during evening and late-night hours on the relevant dates.\nThird, as described above, the government presented substantial evidence that the defendant was the person who emailed, downloaded, and accessed the images of child pornography.\n\nIV\nFor the foregoing reasons, it is hereby ORDERED that the defendant's Motion for a Judgment of Acquittal and Motion for New Trial are DENIED.\nNOTES\n[1] The images admitted into evidence included time stamps for when they were created, modified, last written, and last accessed. Testimony at trial showed that the computer on which the images were found was set to Central Standard Time, which was reflected in the time stamps. 
The times listed here have been converted into Eastern Standard Time.\n[2] Evidence at trial showed that the email address bobbydva@yahoo.com was registered to Mr. Bobby Doyle. (Gov't Ex. 15.) The Yahoo! account for bobbydva@yahoo.com listed rfdj 1@hotmail.com as an alternate email address. (Id.) The government argued that the defendant emailed the images to himself for safekeeping.\n[3] The time stamps on the email messages reflected the correct hour in Eastern Standard Time.\n[4] The jury in this case was instructed, \"It is not necessary that the Government introduce direct evidence of the age of the persons depicted and the jury may consider all of the evidence in the case, including the visual depictions themselves, in determining whether the persons depicted were minors.\" (Jury Instruction No. 11.)\n[5] \"Any error . . . that does not affect substantial rights must be disregarded.\" Fed. R.Crim.P. 52(a). A Confrontation Clause violation may be found harmless. United States v. Banks, 482 F.3d 733, 741 (4th Cir.2007) (citing United States v. Khan, 461 F.3d 477, 496 (4th Cir.2006)). The Fourth Circuit will find an error harmless if it is \"able to say with fair assurance, after pondering all that has happened without stripping the erroneous action from the whole, that the judgment was not substantially swayed by the error.\" Id. at 741-42 (quoting United States v. Brooks, 111 F.3d 365, 371 (4th Cir.1997)). See id. at 742 (finding Crawford error harmless after considering importance of erroneously admitted testimonial evidence and strength of government's case); Khan, 461 F.3d at 496 (same); United States v. Bryant, No. 06-4977, 2008 WL 5070972, at *5 (4th Cir. Nov. 
25, 2008) (unpublished) (same).\n[6] Glass's daughter-in-law also testified that Glass and the defendant had taken weekend fishing trips, but she could not point to which weekends.\n[7] This was relevant because, as described above, all of the twenty-six offending images were accessed between 6:21 p.m. and 1:53 a.m., and all but two were emailed or accessed after 9:02 p.m.\n"], [" MEMORANDUM OPINION\n No. 04-12-00200-CV\n\n Kandy E. SWAIM,\n Appellant\n\n v.\n\n Robert M. SWAIM, Jr.,\n Appellee\n\n From the 406th Judicial District Court, Webb County, Texas\n Trial Court No. 2009-CVG-000185-D4\n Honorable Oscar J. Hale, Jr., Judge Presiding\n\nPER CURIAM\n\nSitting: Karen Angelini, Justice\n Sandee Bryan Marion, Justice\n Phylis J. Speedlin, Justice\n\nDelivered and Filed: October 31, 2012\n\nDISMISSED\n\n Appellant has filed a motion to dismiss this appeal, stating that both appellant and\n\nappellee are informing the court that they have settled their dispute. Therefore, we grant the\n\nmotion and dismiss the appeal. See TEX. R. APP. P. 42.1(a). Costs of appeal are taxed against\n\nappellant. See id. 42.1(d).\n\n\n PER CURIAM\n\f"], ["424 F.2d 633\nHESS SHIPPING CORPORATION, Plaintiff-Appellee,v.The SS CHARLES LYKES, in rem and Lykes Bros. Steamship Co.,Inc.,Defendant-Appellant.LYKES BROS. STEAMSHIP CO., Inc., Plaintiff-Appellant,v.HESS SHIPPING CORPORATION and the ST. HESS VOYAGER, in rem,Defendant-Appellee.\nNo. 26703.\nUnited States Court of Appeals, Fifth Circuit.\nApril 28, 1970, Rehearing En Banc Denied June 17, 1970.\n\nGeorge F. Wood, Mobile, Ala., Benjamin Yancey, Edward S. Bagley, New Orleans, La., for appellant.\nTheodore K. Jackson, Jr., Rae M. Crowe, Mobile, Ala., for appellee.\nBefore JOHN R. 
BROWN, Chief Judge, and WISDOM, GEWIN, BELL, THORNBERRY, COLEMAN, GOLDBERG, AINSWORTH, GODBOLD, DYER, SIMPSON, MORGAN, CLARK, and INGRAHAM, Circuit Judges.\nPER CURIAM:\n\n\n1\nOn rehearing of this case en banc the court stands evenly divided, therefore the judgment of affirmance is adhered to by operation of law.\n\n\n2\nON PETITION FOR REHEARING EN BANC OF THE EN BANC DECISION\n\n\n3\nNo member of this panel nor Judge in regular active service on the Court having requested that the Court be polled on the Petition for Rehearing of the En Banc decision (Rule 35 Federal Rules of Appellate Procedure; Local Fifth Circuit Rule 12) the Petition for such Rehearing En Banc with oral argument is denied.\n\n"], ["\n170 Ga. App. 471 (1984)\n317 S.E.2d 295\nTHE STATE\nv.\nLESTER.\n67531.\nCourt of Appeals of Georgia.\nDecided March 9, 1984.\nRehearing Denied March 26, 1984.\nW. Bryant Huff, District Attorney, Johnny R. Moore, Assistant District Attorney, for appellant.\nWalter M. Britt, for appellee.\nBIRDSONG, Judge.\nMotion to Quash. John F. Lester, Judge of Recorder's Court of Gwinnett County, has been charged with malpractice in office. The State obtained an indictment containing five counts, only two of which are of concern in this appeal. In Count I, Judge Lester is accused of improperly entering nolle prosequi action on 100 traffic violations, but at the same time accepting, on behalf of the county, costs of court and payment of fines for serious traffic offenses such as driving under the influence of an intoxicant and similar charges that possibly could result in the suspension and/or revocation of licenses to operate vehicles in this state. As a result of the nolle prosequi action, notification to the Department of Public Safety was not accomplished for its consideration of the traffic violation and the purpose of OCGA \u00a7 40-13-3 ostensibly was thwarted. 
As to Count IV, Judge Lester is accused of accepting pleas of guilty to lesser offenses (pedestrian drunk or public drunk in lieu of driving under influence) by defendants who were not present in court, in 243 traffic cases, and of accepting costs and payment of fines based upon the lesser offenses. Once again the State alleges that the result of these guilty pleas was to thwart the purpose of OCGA \u00a7 40-13-3.\nJudge Lester entered a motion to quash 24 of the 100 cases in which nolle prosequi had been entered and 91 of the 243 cases in which conviction of a lesser offense had been entered on the ground that the action taken by the judge in each of these 115 cases had occurred over two years before the date of the indictment. Discovery of these actions was based upon an audit by the GBI which reflected the course of conduct followed by the recorder in his court. It is contended by appellant and fully conceded by the State that malpractice in office is a misdemeanor and that the appropriate statute of limitations applicable thereto is two years. The trial court quashed those parts of Counts I and IV which reflected actions taken by the recorder which predated the indictment (and the August, 1983, audit by the GBI) by more than two years. The State appealed, complaining of the trial court's quashing of these portions of the apparently otherwise *472 validly constituted indictment. Held:\nIt is not contended by the State that any of the four exclusions provided in OCGA \u00a7 17-3-2 are applicable so as to remove the two year statute of limitations imposed by OCGA \u00a7 17-3-1 (d). Rather, the State argues that as to the nolle prosequi entered by the recorder in the 100 cases that are the subject of Count I, a nolle prosequi is not necessarily a final termination of the case (depending upon the attachment of jeopardy), and the recorder by his improper actions entered a void judgment in each case. 
As such, the cases are still viable and thus the offense is a continuing one. As to the reduction of the driving under the influence to the lesser offenses of public drunk or pedestrian drunk in Count IV, each of these convictions was based upon an allegedly improper plea of guilty in that the defendant was never advised of the implications of his plea and the recorder never satisfied himself of the providency of the pleas of guilty. Again the State argues that each of these guilty pleas constituted a void judgment. As such, each of these cases remains pending and therefore the malpractice is a continuing offense.\nThe trial court rejected this argument, concluding that there was a final action in each of the 343 cases and that 115 of the misdemeanors were subject to the two-year statute of limitations. Inasmuch as the State had pled no exception to the operation of the statute, the court accordingly quashed those subject to the statute. State v. Shepherd Constr. Co., 248 Ga. 1, 4 (II (a)) (281 SE2d 151).\nThere is no basis on which to overturn the judgment of the trial court. In Holloman v. State, 133 Ga. App. 275, 277 (211 SE2d 312), the State made an argument similar to the one advanced in this case. In that case the County Commissioners of Jones County entered into a contract for the renovation of the county courthouse. It was shown that the commissioners entered into a cost plus a fixed percentage fee contract which violated the pertinent statute requiring such a contract to be by bid and awarded to the lowest bidder. In substance, the State argued in Holloman that the contract took many months to complete and thus the malpractice of the county commissioners was a continuing offense until either a legal contract was executed or until the illegal contract actually was completed. This court was not persuaded by that argument. 
The court found that the violation of the statute occurred when the contract was awarded in March, 1969, and did not continue until the contract was concluded perhaps as late as early 1972. This court did not contemplate the argument that a contract executed in violation of law was a void contract and thus subject to proper execution at any time prior to the final completion of the contract. To the contrary, this court concluded that the gravamen of the offense was the making of a cost plus contract in March, 1969, without compliance with the statutory requirements, and not what *473 was or was not done thereafter. Holloman, supra, p. 277.\nIn like manner, we conclude, as did the trial court herein, that the recorder's acts were committed at the time he entered a nolle prosequi or when he accepted a plea of guilty and entered the disposition on the traffic citation. Whether the action was lawful or unlawful, it was a final disposition of each case. If the action was authorized, there was no malpractice. If unlawful, there may have been malpractice, but as to the recipient of the action (the defendant in each case), the action became final. See State v. Hanson, 249 Ga. 739, 747 (295 SE2d 297).\nThe burden unquestionably is upon the State to prove that a crime occurred within the statute of limitation or to prove that the case properly falls within an exception. Williams v. State, 13 Ga. App. 338 (1) (79 SE 207). The fact that the issue is determined pretrial does not relieve the State of this burden.\nUnder the facts of this case, the trial court correctly concluded that all of the traffic cases in question became final on the date that the recorder's court entered its judgment of nolle prosequi or accepted a plea of guilty to a lesser offense and that result formally was entered by the court as its disposition of the case. 
Thus, as to the 115 cases where such action occurred more than two years before the audit occurred and an indictment was returned, the statute had run and those incidents are barred. State v. Tuzman, 145 Ga. App. 481, 484 (243 SE2d 675).\nJudgment affirmed. Quillian, P. J., concurs. Carley, J., concurs in the judgment only.\n"], [" NOT RECOMMENDED FOR FULL-TEXT PUBLICATION\n File Name: 15a0443n.06\n\n No. 13-5414\n FILED\n UNITED STATES COURT OF APPEALS Jun 12, 2015\n FOR THE SIXTH CIRCUIT DEBORAH S. HUNT, Clerk\n\nHOWARD WALTER THOMAS, )\n )\n Petitioner-Appellant, )\n )\nv. ) ON APPEAL FROM THE\n ) UNITED STATES DISTRICT\nSTANTON HEIDLE, ) COURT FOR THE EASTERN\n ) DISTRICT OF TENNESSEE\n Respondent-Appellee. )\n )\n )\n\n\n BEFORE: GIBBONS and COOK, Circuit Judges; GWIN, District Judge.*\n\n GIBBONS, J., delivered the opinion of the court in which COOK, J., joined in full, and\nGWIN, D.J., joined except for Part IV.C.\n\n JULIA SMITH GIBBONS, Circuit Judge. Howard Walter Thomas appeals the district\n\ncourt\u2019s denial of his petition for a writ of habeas corpus seeking relief from his convictions in\n\nTennessee for murder, attempted murder, robbery, and kidnapping. This court granted a\n\ncertificate of appealability as to a single issue: whether the state court erred when it excluded the\n\ntestimony of a defense expert who planned to testify to some of the potential flaws\u2014and the\n\nscientific bases of the flaws\u2014in an eyewitness\u2019s identification of Thomas as the perpetrator. In\n\nexcluding the expert evidence, the trial court relied on then-existing state precedent\u2014since\n\noverruled\u2014that effectively made expert testimony on eyewitness identification per se\n\n\n *\n The Honorable James S. Gwin, United States District Judge, United States District Court for the Northern\nDistrict of Ohio, sitting by designation.\n\fNo. 13-5414\nThomas v. Heidle\n\ninadmissible. 
Thomas challenges the exclusion as a violation of his rights under the Due Process\n\nClause of the Fourteenth Amendment, and/or the Compulsory Process or Confrontation clauses\n\nof the Sixth Amendment, pursuant to Chambers v. Mississippi, 410 U.S. 284, 302 (1973), and its\n\nprogeny. We hold that the exclusion did not amount to a violation of clearly established federal\n\nlaw. Even if it had, any error would have been harmless. We therefore affirm.\n\n I.\n\n John and Yvonne Cook were driving their van from Wisconsin to vacation in North\n\nCarolina on the morning of March 23, 1991.1 Yvonne was asleep on a bedroll in the back of the\n\nvan. She woke up when John pulled off the interstate in Knoxville, Tennessee. It was morning\n\nand was raining heavily. Once the van was stationary, John turned on the map light and reached\n\nfor the atlas on the dashboard. Yvonne heard him say, \u201cOh my God,\u201d followed by an explosion\n\nand a \u201cpop.\u201d He slumped toward her in the seat and she noticed that he was bleeding heavily,\n\nmostly from his ear.\n\n The assailant broke the window and opened the driver-side door, causing the van\u2019s dome\n\nlights to switch on. He began pushing John out of the driver\u2019s seat. Yvonne, sitting on the floor\n\nalmost between the two front seats, unlatched John\u2019s seatbelt. The assailant pushed John into\n\nYvonne\u2019s arms and took his place in the driver\u2019s seat. Yvonne had a \u201cfull facial view\u201d of the\n\nman as he got into the driver\u2019s seat. Now sitting mere inches from the front seat, she noticed that\n\nthe man had laid a gun across his lap with the barrel pointing at her head.\n\n\n\n\n 1\n Except where otherwise indicated, the facts presented here are those set forth by the Court of Criminal\nAppeals of Tennessee in Thomas v. State, 298 S.W.3d 610 (Tenn. Crim. App. 2009), which cross-refers to that\ncourt\u2019s previous decision in State v. Thomas, No. 
E2003-02090-CCA-R3-CD, 2005 WL 735040 (Tenn. Crim. App.\nMar. 30, 2005). This court \u201caccept[s] the state court\u2019s determination of a factual issue unless the petitioner upsets\nthe presumption by clear and convincing evidence.\u201d Jackson v. Bradshaw, 681 F.3d 753, 759 (6th Cir. 2012).\nThomas does not challenge the state court\u2019s factual findings in this appeal.\n\n -2-\n\fNo. 13-5414\nThomas v. Heidle\n\n The man drove onto the interstate and drove speedily for fifteen to twenty minutes. He\n\nrefused Yvonne\u2019s request to release them so that she could find medical help for John. By\n\nkeeping pressure on John\u2019s neck, she hoped to keep him alive. Despite her efforts, she felt his\n\nheart stop beating after around ten minutes. When she again begged the assailant to stop, he\n\nsaid, \u201c[D]on\u2019t look at me. I\u2019m going to kill you, too.\u201d He exited the interstate onto a two-lane\n\nrural road before pulling over on the side of the road and demanding cash from Yvonne. She\n\nreached into her purse and retrieved an assortment of ten- and twenty-dollar bills from John\u2019s\n\nwallet, totaling somewhere between five hundred and a thousand dollars. She handed the money\n\nover to the assailant, her hands covered with her husband\u2019s blood.\n\n The driver ordered Yvonne to exit the car. He got out of the driver\u2019s seat, walked around\n\nto the back of the van, and dumped John\u2019s then-lifeless body onto the ground. \u201cIt was beginning\n\nto get light\u201d outside by this time. Yvonne stood up right next to the assailant. He ordered her\n\nnot to look at him and to get down on her hands and knees. He tried to load his gun while\n\npointing it at her head. He struggled to do so because he was wearing gloves. 
When Yvonne\n\nbegged for her life, he told her, \u201cI\u2019m not going to rot in some fucking prison because you can\n\nidentify me.\u201d At that time, another car came down the road, its lights shining on Yvonne. This\n\ncaused the assailant to flee in the van.\n\n She described these events to Knoxville Police Department (KPD) officers who soon\n\narrived on the scene. Yvonne also tried to retrace the scene while riding with them. Later, at the\n\npolice station, she gave a detailed description of the attacker. She said that he was very young,\n\nno older than twenty-two and perhaps younger than seventeen. She believed he was around five\n\nfeet, five inches tall. He had a \u201cbandanna-type cloth\u201d over his head, down to his nose, and had\n\n\n\n\n -3-\n\fNo. 13-5414\nThomas v. Heidle\n\nmedium brown hair with around three inches hanging down at the back. Police found the van the\n\nsame day but did not recover any fingerprints and did not locate a suspect.\n\n After Yvonne returned home to Wisconsin, a psychiatrist suggested that she undergo\n\nhypnosis. She did so in order to \u201chelp do something to solve the case.\u201d She explained that the\n\nhypnosis put her in a state of \u201cdeep concentration and relaxation,\u201d but that it did not change her\n\nmental image of the assailant. After the hypnosis, a police sketch artist drew a composite of the\n\nsuspect based on Yvonne\u2019s description. She concluded that the sketch did not \u201clook exactly like\n\nthe person,\u201d rating it seven out of ten for its likeness to the assailant, but \u201cbelieved it was the best\n\nthat we could do.\u201d During the year after John\u2019s death, she also viewed several photographic and\n\nvideo lineups, none of which contained Thomas\u2019s picture. She noted some similarities in\n\nappearance, but did not positively identify anyone.\n\n In 2000, KPD officers interviewed a resident of Atlanta, Georgia, by the name of Mary\n\nStorm. 
She provided the names of two suspects in Knoxville: Howard Walter Thomas and\n\nErnest Salyer. Officers spoke to Salyer and, based on his statement, arrested Thomas and\n\ncharged him with felony-murder. In May 2000, the KPD told Yvonne they had arrested a\n\nsuspect. A friend then sent her a report from a Knoxville newspaper of the arrest. Directly\n\nbeneath a picture of John was a picture of Thomas as he appeared in 2000.\n\n Thomas was indicted in Tennessee state court for first-degree premeditated murder, first-\n\ndegree felony murder, especially aggravated robbery, especially aggravated kidnapping, and\n\nattempted first-degree murder. The state filed notice of intent to seek the death penalty.\n\n Before trial, Thomas moved to suppress Yvonne\u2019s identification testimony on the bases\n\nthat it was influenced by hypnosis and that it was the product of an overly suggestive\n\nidentification. The court denied the motion after an evidentiary hearing but permitted an\n\n\n -4-\n\fNo. 13-5414\nThomas v. Heidle\n\ninterlocutory appeal. But the Tennessee Court of Criminal Appeals denied the application for\n\ninterlocutory appeal and the Tennessee Supreme Court declined to review the case.\n\n Thomas also gave pre-trial notice of his intent to introduce testimony from Dr. Elizabeth\n\nLoftus, an expert in eyewitness identification. After Loftus furnished a report and testified at a\n\nhearing, the court excluded her testimony, holding that it was inadmissible under Tennessee law.\n\nSpecifically, then-existing state precedent\u2014subsequently overruled\u2014held that expert testimony\n\non the subject of eyewitness identification was per se inadmissible under the state\u2019s rules of\n\nevidence. The court denied Thomas\u2019s motion for reconsideration and refused permission to take\n\nan interlocutory appeal.\n\n Thomas was tried in late April and early May 2003. 
When asked to identify the assailant\n\nin the courtroom, Yvonne walked toward Thomas, without objection, said, \u201cYou\u2019re the person\n\nthat murdered my husband,\u201d and pointed at him. When asked if she had any reason to identify\n\nthe wrong person, she replied, \u201cOf course not.\u201d She maintained that Thomas\u2019s picture in the\n\nnewspaper report had no impact on her identification, which was based solely \u201con the incident.\u201d\n\nShe noted that Thomas\u2019s hair \u201cseem[ed] a little darker\u201d at the time of trial than the assailant\u2019s\n\nwas at the time of the crimes and that Thomas had facial hair at trial. But she explained that\n\nthese differences had no impact because \u201c[t]he profile and the facial features are all the same.\u201d\n\nShe also identified a picture of Thomas as he looked in 1990, around eight months before the\n\nincident, in which he had brown hair and no facial hair. On cross examination, she\n\nacknowledged that another photograph\u2014taken three weeks before the killing\u2014also depicted\n\nThomas, even though he then had black hair and facial hair.\n\n Yvonne testified that, while in the van with the assailant, she focused largely on her\n\nhusband and on the gun pointed in her direction. Nonetheless, she maintained that she looked at\n\n\n -5-\n\fNo. 13-5414\nThomas v. Heidle\n\nthe person in the driver\u2019s seat because she had significant interaction with him while trying to\n\nconvince him to stop the van. She mostly saw him in profile because of her position relative to\n\nhis while he was driving. Although she had not seen any profile shots in the photo arrays, she\n\nhad seen the profile of potential suspects in a video lineup. On that occasion, she identified\n\n\u201cnumber four\u201d\u2014who was not Thomas\u2014as having the same profile and body language as the\n\nassailant. 
Thinking he was a very strong possibility, she requested a sample of his voice but\n\nnever received it.\n\n Ernest Salyer testified that he and Thomas were \u201c[b]est friends, like brothers,\u201d and that he\n\nhad known Thomas since Ernest was fourteen or fifteen years old. Early in the morning of\n\nMarch 23, 1991, Thomas called him and asked for a ride from a car wash. When Ernest arrived,\n\nThomas was wearing a bandana around his head. He had blood on his jacket and jeans and was\n\ncarrying a rifle wrapped in a blanket or a sheet. According to Ernest, Thomas told him he had\n\n\u201ckilled somebody,\u201d and later elaborated, saying that \u201che [ran] across the interstate and shot in a\n\nvehicle, jumped in it and drove them to Norris Freeway.\u201d Once there, Thomas apparently said\n\nthat he \u201cgot the woman out and he was going to shoot her and started reloading his gun and it got\n\njammed and a car come and scared him and he took off.\u201d Ernest testified that they then returned\n\nto his mother\u2019s house, where his mother washed Thomas\u2019s bloodstained clothes, believing he\n\nhad been in a fight. They next went to Ernest\u2019s grandmother\u2019s house, where they hid the rifle, a\n\n\u201c.22 semi-automatic.\u201d On their return to his mother\u2019s house, Ernest testified that Thomas gave\n\nErnest\u2019s sister, Sammie, a blood-stained twenty-dollar bill and told her to buy something for her\n\nson.\n\n Joyce Salyer, Ernest\u2019s mother, also testified. She said that Thomas called her house early\n\nin the morning on March 23, 1991, asking for Ernest. When Ernest returned with Thomas, she\n\n\n -6-\n\fNo. 13-5414\nThomas v. Heidle\n\ntestified that she washed Thomas\u2019s clothes because they were stained with blood and that he said\n\nhe had been in a fight. 
She also corroborated that Thomas gave Sammie a twenty-dollar bill that\n\n\u201chad red on it\u201d and \u201clooked like blood.\u201d Joyce later became suspicious about the story she had\n\nbeen told. After she watched the news report on the murder and asked Ernest, he told her what\n\nThomas had done. When she learned about the gun, she asked Ernest to retrieve it from her\n\nmother\u2019s house because her mother \u201cdidn\u2019t need to be involved in this.\u201d Joyce testified that she\n\nput Thomas\u2019s clothes in a plastic bag and gave them to her mother to burn.\n\n             Joyce\u2019s mother, Goldie Storm, testified that Ernest and Thomas went to her house with a\n\n.22 rifle at around 11:00 a.m. on March 23. She agreed to keep the gun, and it remained in a\n\ncloset at her house until two days later, when the two men came back and removed it. She also\n\ntestified that Joyce gave her Thomas\u2019s clothes and that she burned them with the garbage.\n\nSammie testified that Thomas gave her a twenty-dollar bill with a red mark on it but said that she\n\ndid not know what the mark was and did not see Thomas with an entire pile of bills.\n\n             Sandy Buckner, the defendant\u2019s friend, testified that she was working at a twenty-four-\n\nhour convenience store on the morning of March 23. Thomas came into the store early in the\n\nmorning and, at around 5:00 a.m., she gave him a ride home. She remembered that he was\n\nwearing boots, blue jeans, and a leather jacket, and that he had a bandana in his pocket or on his\n\nhead. When he returned to the store a few days later, Buckner noticed that he was dressed\n\ndifferently than usual and that his hair was now black. He told her that he had changed his hair\n\ncolor because he was \u201cafraid that he would . . . look like the guy in the newspaper.\u201d\n\n             Loretta Strange also testified. Strange was dating Thomas in March 1991. 
Between that\n\ntime and the time of trial, however, she married and divorced Ernest Salyer. Strange said that\n\nshe and Thomas were watching television around two weeks after the crimes and that Thomas\n\n\n -7-\n\fNo. 13-5414\nThomas v. Heidle\n\nsaid, \u201cI\u2019m glad there\u2019s not cops like [those on the television] here [or] I\u2019d already be arrested.\u201d\n\nAsked to explain, he apparently said, \u201cThe guy on the interstate, I did that, and if there were cops\n\nlike that here, I\u2019d already be arrested.\u201d Later in 1991, she and Thomas were walking in the\n\nwoods and\u2014when they reached a freeway exit close to Thomas\u2019s house\u2014he said, \u201cThis is where\n\nI killed that guy . . . . [T]his is where I did it. I sat up here. I shot at cars, and I hit somebody.\u201d\n\nOn another occasion, he said that he \u201caccidentally\u201d hit somebody while shooting at cars. Strange\n\nasked what happened after he shot the person. According to her testimony, Thomas then\n\nexplained:\n\n He got in the van. He made the woman hold her husband as he drove her around.\n I don\u2019t know exactly where he drove her to. He\u2019d had a conversation with her\n saying that he was going to have to kill her because he wasn\u2019t going to prison and\n that she was the only witness. He was actually going to kill her. He had her\n outside of a van fixing to shoot her. A car came along; scared him. He got in the\n van and left.\n\n He threatened to kill Strange if she ever told anyone. When Thomas first told her about\n\nthe crimes, she thought he was \u201cjust talking to be talking,\u201d given his tendency to exaggerate and\n\n\u201c[t]alk tough.\u201d She also noticed that Thomas would only talk about the incident when he was\n\ndrunk. 
She later came to believe him, however, \u201cafter he showed me the trail and it was\n\nsomething that he had said on more than one occasion.\u201d Strange\u2019s testimony about how she\n\nlearned of the killing conflicted with her earlier account: when she initially spoke to the police,\n\nshe claimed that she was at Joyce\u2019s house when Thomas called and asked Salyer to pick him up\n\nat the car wash.\n\n Strange also testified that, shortly after March 23, Thomas said that his \u201cblack denim\n\njacket\u201d with \u201cbeads and leather patches on it\u201d had \u201cgone.\u201d He also dyed his hair black, \u201cso that\n\nit looked blue in the light.\u201d Before March, she explained, he had \u201ccut his hair and became kind\n\nof clean cut looking for a little while.\u201d\n\n -8-\n\fNo. 13-5414\nThomas v. Heidle\n\n By stipulation, the court admitted the testimony of Dr. Francis Jones, a retired pathologist\n\nwho performed John Cook\u2019s autopsy. He concluded that the cause of death was \u201ctwo small\n\ncaliber-.22 gunshot wounds to the head.\u201d The state also elicited testimony from four witnesses\n\nwho had been involved in the investigation as officers with the KPD and the Knox County\n\nSheriff\u2019s Department.\n\n After the state rested, Thomas elected not to put on any evidence. The jury acquitted\n\nThomas on the felony murder count and returned guilty verdicts as to all other charges. Based\n\non Yvonne\u2019s wishes, the state withdrew its notice of intent to seek the death penalty. At a\n\nseparate sentencing hearing, the court sentenced Thomas to life in prison for premeditated\n\nmurder, twenty-two years for especially aggravated robbery, and twenty-two years for especially\n\naggravated kidnapping, all to be served concurrently with the life sentence. 
The court also\n\nsentenced Thomas to twenty-five years for attempted first-degree murder, to be served\n\nconsecutively to the other sentences.\n\n             On direct appeal, the Tennessee Court of Criminal Appeals affirmed Thomas\u2019s\n\nconvictions but reduced his sentences for attempted first-degree murder, especially aggravated\n\nrobbery, and especially aggravated kidnapping. State v. Thomas, No. E2003-02090-CCA-R3-\n\nCD, 2005 WL 735040, at *1 (Tenn. Crim. App. Mar. 30, 2005). The Tennessee Supreme Court\n\ndenied permission to appeal. The United States Supreme Court denied Thomas\u2019s petition for\n\ncertiorari. Thomas v. Tennessee, 126 S. Ct. 1475 (2006). Thomas next sought state post-\n\nconviction relief. The trial court denied the petition, the Tennessee Court of Criminal Appeals\n\naffirmed the denial, and the Tennessee Supreme Court denied permission to appeal.\n\n             Thomas\u2019s writ of habeas corpus followed, seeking relief on multiple grounds. The\n\ndistrict court denied the petition and also denied a certificate of appealability. Thomas v.\n\n\n                                                -9-\n\fNo. 13-5414\nThomas v. Heidle\n\nCarlton, No. 3:10-cv-22, 2013 WL 1249601, at *20 (E.D. Tenn. Mar. 26, 2013). Thomas filed a\n\nnotice of appeal and requested a certificate of appealability from this court. This court granted\n\nthe application \u201cas to [Thomas\u2019s] claim that his rights were violated by a per se rule barring\n\nexpert proof on the unreliability of eyewitness identification.\u201d\n\n                                                II.\n\n             The single question at issue in this case is whether Thomas is entitled to habeas relief\n\nbased on the state court\u2019s per se rule barring expert testimony on the unreliability of eyewitness\n\nidentification. 
The answer must begin with a detailed review of the rule at issue and its\n\napplication in Thomas\u2019s case.\n\n Rule 702 of the Tennessee Rules of Evidence provides: \u201cIf scientific, technical, or other\n\nspecialized knowledge will substantially assist the trier of fact to understand the evidence or to\n\ndetermine a fact in issue, a witness qualified as an expert by knowledge, skill, experience,\n\ntraining, or education may testify in the form of an opinion or otherwise.\u201d In State v. Coley, 32\n\nS.W.3d 831 (Tenn. 2000), a 3-2 majority of the Tennessee Supreme Court read Rule 702 as\n\nessentially precluding expert testimony on the reliability of eyewitness identification. The\n\nmajority concluded that \u201c[e]yewitness testimony has no scientific or technical underpinnings\n\nwhich would be outside the common understanding of the jury; therefore, expert testimony is not\n\nnecessary to help jurors \u2018understand\u2019 the eyewitness\u2019s testimony.\u201d Id. at 833\u201334. The court also\n\nnoted that a defendant can adequately test an eyewitness\u2019s memory with cross-examination, that\n\njury instructions can guide the jury in its consideration of credibility, and that the proposed\n\nexpert testimony could confuse or mislead the jury or cause the jury to abandon its role as the\n\nfinder of fact. Id. at 836\u201337. Thus, the court held that \u201cgeneral and unparticularized expert\n\ntestimony concerning the reliability of eyewitness testimony, which is not specific to the witness\n\n\n -10-\n\fNo. 13-5414\nThomas v. Heidle\n\nwhose testimony is in question, does not substantially assist the trier of fact.\u201d Id. at 838. The\n\ntype of \u201cgeneral and unparticularized testimony\u201d at issue in Coley was similar to the type of\n\nevidence that Thomas sought to admit through his expert, Loftus. The expert in Coley planned to\n\ndiscuss:\n\n 1. the process of eyewitness identification; 2. 
the relationship between stress and\n memory of an event; 3. cross-racial identification; 4. the confidence the witnesses\n have in the accuracy of their identifications and the actual accuracy of their\n identifications; 5. the effect of time on the accuracy of memory; and 6. the\n suggestibility of the photographic line-up used in this case.\n\nId. at 833. In the present case, Loftus explained that her scientific expertise did not enable her to\n\nopine on \u201cwhether [Yvonne Cook\u2019s identification] is accurate or not.\u201d Nonetheless, she planned\n\nto apply the general principles from the field to the specific circumstances of the case. She\n\nreviewed police reports, Yvonne\u2019s interview from the day of the killing, the composite sketch of\n\nthe assailant, and the newspaper article that Yvonne saw soon after Thomas\u2019s arrest. She\n\nidentified \u201cfactors present in the current case that are known to create problems for accurate\n\neyewitness testimony.\u201d These included extreme stress and fright, \u201cweapon focus,\u201d an\n\n\u201cextraordinarily long retention interval,\u201d and identification based on a \u201chighly suggestive\n\nnewspaper article.\u201d Loftus stated that hypnosis can increase a witness\u2019s confidence. She further\n\nexplained that confidence is malleable\u2014meaning it is influenced by suggestive information\u2014\n\nand is \u201conly weakly related to accuracy.\u201d She explained, moreover, that \u201c[m]any of the factors\u201d\n\nshe planned to discuss \u201care not clearly understood by jurors, and some are even in conflict with\n\nmisconceptions that jurors have about the workings of memory.\u201d Although she could not\n\ncomment specifically on the accuracy of the identification in the case, she could \u201ccorrect some of\n\nthe misperceptions\u201d some jurors have about memory, giving the jury a scientific basis to \u201cmake\n\nits own decision.\u201d\n\n\n -11-\n\fNo. 13-5414\nThomas v. 
Heidle\n\n After the evidentiary hearing, the trial court found as follows:\n\n The proffered testimony of Dr. Loftus is remarkably similar to the testimony\n considered in the Coley case. Specifically, Dr. Loftus\u2019 opinion included\n information regarding the process of identification, the relationship of stress to\n memory, the difference between confidence in identification and actual accuracy\n of identification, the effect of passage of time on the identification and the\n suggestibility of photograph[s] presented to the witness in this case. The only\n significant difference between the testimony proffered in this case and that of\n Coley is that there is no cross-racial consideration presented in this case. The\n court did not find anything in Dr. Loftus\u2019 testimony to be of a more specific\n nature than that addressed in Coley. Therefore, the court finds that the evidence is\n per se inadmissible.\n\n The Tennessee Court of Criminal Appeals rejected Thomas\u2019s claim that the exclusion of\n\nLoftus\u2019s testimony violated his due-process rights. In particular, the court held that the proposed\n\ntestimony was not critical to his defense. Thomas, 2005 WL 735040, at *25. 
We consider below\n\nwhether that state-court decision entitles Thomas to habeas relief.\n\n The language of both the Coley court and the trial court in Thomas\u2019s case suggests,\n\nperhaps, that there would be room for expert testimony on eyewitness identification in some\n\ncircumstances, provided it is not \u201cgeneral and unparticularized.\u201d See Coley, 32 S.W.3d at 838.\n\nIn practice, however, the rule has functioned as a flat bar on expert testimony in this area.\n\nIndeed, in a subsequent case, the Tennessee Supreme Court characterized the holding of Coley as\n\nestablishing a rule that \u201cno one, regardless of credentials or experience and no matter how\n\nquestionable the evidence, can provide testimony on the issue of eyewitness identification.\u201d\n\nState v. Copeland, 226 S.W.3d 287, 300\u201301 (Tenn. 2007).\n\n But four years after Thomas\u2019s trial, the Tennessee Supreme Court reversed course,\n\nexplicitly overruling Coley in its unanimous opinion in Copeland, 226 S.W.3d 287. It\n\nacknowledged \u201cthe educational training of the experts and the empirical science behind the\n\nreliability of eyewitness testimony.\u201d Id. at 299. The court also recognized \u201cthat neither cross-\n\n\n -12-\n\fNo. 13-5414\nThomas v. Heidle\n\nexamination nor jury instructions on the issue are sufficient to educate the jury on the problems\n\nwith eyewitness identification.\u201d Id. at 300. Rather than maintaining the per se exclusion, the\n\ncourt gave trial courts discretion to assess the reliability of eyewitness-identification testimony as\n\nthey would assess any other expert evidence under Rule 702. Id. at 300 (citing McDaniel v. CSX\n\nTransp., Inc., 955 S.W.2d 257 (Tenn. 1997)). Tennessee courts now admit expert testimony on\n\neyewitness identification if it appears that the expert\u2019s opinion is \u201cbased on relevant scientific\n\nmethods, processes, and data, and not upon an expert\u2019s mere speculation.\u201d Id. 
at 301 (quoting\n\nMcDaniel, 955 S.W.2d at 265).\n\n III.\n\n The Antiterrorism and Effective Death Penalty Act of 1996 (AEDPA) provides that a\n\nfederal court may not grant a writ of habeas corpus after a state court has adjudicated the case on\n\nthe merits unless the state court\u2019s decision:\n\n (1) resulted in a decision that was contrary to, or involved an unreasonable\n application of, clearly established Federal law, as determined by the Supreme\n Court of the United States; or (2) resulted in a decision that was based on an\n unreasonable determination of the facts in light of the evidence presented in the\n State court proceeding.\n\n28 U.S.C. \u00a7 2254(d).\n\n A decision is \u201ccontrary to\u201d clearly established law if \u201cthe state court applies a rule that\n\ncontradicts the governing law set forth in [the Supreme Court\u2019s] cases,\u201d or if \u201cthe state court\n\nconfronts a set of facts that are materially indistinguishable\u201d from a Supreme Court decision and\n\nnevertheless arrives at a different result. Williams v. Taylor, 529 U.S. 362, 405\u201306 (2000). An\n\n\u201cunreasonable application\u201d of clearly established federal law occurs \u201cif the state court identifies\n\nthe correct governing legal principle from [the Supreme] Court\u2019s decisions but unreasonably\n\napplies that principle to the facts of the [petitioner\u2019s] case.\u201d Id. at 413.\n\n\n -13-\n\fNo. 13-5414\nThomas v. Heidle\n\n \u201cEven a general standard may be applied in an unreasonable manner.\u201d Panetti v.\n\nQuarterman, 551 U.S. 930, 953 (2007). A federal court need not therefore \u201c\u2018wait for some\n\nnearly identical factual pattern before a legal rule must be applied.\u2019\u201d Id. (quoting Carey v.\n\nMusladin, 549 U.S. 70, 81 (2006) (Kennedy, J., concurring in judgment)). 
Rather, a federal\n\ncourt may hold that a general principle was applied unreasonably on facts \u201c\u2018different from those\n\nof the case in which the principle was announced.\u2019\u201d Id. (quoting Lockyer v. Andrade, 538 U.S. 63, 76\n\n(2003)). But habeas relief may only be granted \u201cin cases where there is no possibility fairminded\n\njurists could disagree that the state court\u2019s decision conflicts with [the Supreme] Court\u2019s\n\nprecedents.\u201d Harrington v. Richter, 131 S. Ct. 770, 786 (2011). \u201cPut another way, \u2018a state\n\nprisoner must show that the state court\u2019s ruling on the claim being presented in federal court was\n\nso lacking in justification that there was an error well understood and comprehended in existing\n\nlaw beyond any possibility for fairminded disagreement.\u2019\u201d Abby v. Howe, 742 F.3d 221, 226\n\n(6th Cir. 2014) (quoting Harrington, 131 S. Ct. at 786\u201387).\n\n When reviewing the district court\u2019s denial of a habeas petition, this court reviews the\n\ndistrict court\u2019s legal conclusions de novo and its factual findings for clear error. Jackson v.\n\nBradshaw, 681 F.3d 753, 759 (6th Cir. 2012).\n\n IV.\n\n \u201cWhether rooted directly in the Due Process Clause of the Fourteenth Amendment, or in\n\nthe Compulsory Process or Confrontation clauses of the Sixth Amendment, the Constitution\n\nguarantees criminal defendants \u2018a meaningful opportunity to present a complete defense.\u2019\u201d\n\nCrane v. Kentucky, 476 U.S. 683, 690 (1986) (citations omitted) (quoting California v.\n\nTrombetta, 467 U.S. 479, 485 (1984)). The Supreme Court recognizes that \u201c[f]ew rights are\n\n\n\n\n -14-\n\fNo. 13-5414\nThomas v. Heidle\n\nmore fundamental than that of an accused to present witnesses in his own defense.\u201d Taylor v.\n\nIllinois, 484 U.S. 400, 408 (1988) (citing Chambers v. Mississippi, 410 U.S. 
284, 302 (1973)).\n\n But even this fundamental interest may \u201c\u2018bow to accommodate other legitimate interests\n\nin the criminal trial process.\u2019\u201d Rock v. Arkansas, 483 U.S. 44, 55 (1987) (quoting Chambers,\n\n410 U.S. at 295). The Constitution allows states \u201cbroad latitude\u201d to implement rules excluding\n\nevidence from criminal trials. United States v. Scheffer, 523 U.S. 303, 308 (1998). These rules\n\nonly violate a defendant\u2019s right to present a defense if they are \u201carbitrary\u201d or \u201cdisproportionate to\n\nthe purposes they are designed to serve.\u201d Rock, 483 U.S. at 56. The Supreme Court has held\n\nthat the exclusion of evidence is arbitrary or disproportionate \u201conly where it has infringed upon a\n\nweighty interest of the accused.\u201d Scheffer, 523 U.S. at 308 (citing Rock, 483 U.S. at 58;\n\nChambers, 410 U.S. at 302; Washington v. Texas, 388 U.S. 14, 22\u201323 (1967)). Even if the\n\nexclusion of evidence has infringed upon a weighty interest of the accused in an arbitrary or\n\ndisproportionate manner, the consequent constitutional error must be reviewed for harmlessness\n\npursuant to Brecht v. Abrahamson, 507 U.S. 619 (1993). See Ferensic v. Birkett, 501 F.3d 469,\n\n472 (6th Cir. 2007).\n\n This case therefore presents three sub-issues. First, did the exclusion of the expert\n\ntestimony on eyewitness identification infringe upon Thomas\u2019s weighty interest based on clearly\n\nestablished Supreme Court law? Second, if so, did it do so in a way that was\u2014again according\n\nto clearly established law\u2014arbitrary or disproportionate to the purpose the exclusion was\n\ndesigned to serve? Finally, if it did, was the error harmless?\n\n A.\n\n We have previously held, applying clearly established law announced by the Supreme\n\nCourt, that a criminal defendant has a \u201cweighty\u201d interest in having an expert on eyewitness\n\n\n -15-\n\fNo. 13-5414\nThomas v. 
Heidle\n\nidentification testify in a case in which the prosecution relied on eyewitness evidence. See\n\nFerensic, 501 F.3d at 478 (citing Scheffer, 523 U.S. at 308). The clearly established law to be\n\nconsidered under AEDPA only extends to law that the Supreme Court\u2014not this court\u2014has\n\nannounced. 28 U.S.C. \u00a7 2254(d); see also Renico v. Lett, 559 U.S. 766, 778\u201379 (2010); Carey v.\n\nMusladin, 549 U.S. 70, 72\u201378 (2006); Coles v. Smith, 577 F. App\u2019x 502, 507\u201308 (6th Cir. 2014)\n\n(discussing the impact of Renico). But we are bound by our prior precedents when they have\n\ndetermined\u2014based on Supreme Court case law\u2014that a particular law is clearly established.\n\nSmith v. Stegall, 385 F.3d 993, 998 (6th Cir. 2004) (\u201cWe are . . . bound by any prior Sixth Circuit\n\ndecisions concluding that federal law on a particular issue has been \u2018clearly established\u2019 by\n\ncertain holdings of the Supreme Court.\u201d). Here, we are bound by the Ferensic court\u2019s\n\ndetermination that the relevant interest was weighty because, as we will explain, that case and\n\nthe present case involve the same material circumstances and the same clearly established\n\nSupreme Court law.\n\n In Ferensic, the state trial court ruled that the defense expert on eyewitness identification\n\ncould testify provided the defendant furnished a copy of the expert\u2019s report to the state at least\n\ntwo months before trial. Id. at 471. But the defendant turned over the report only eleven days\n\nbefore trial. Id. The trial court excluded the expert evidence because it was too late for the\n\nprosecution to retain its own expert in the field without delaying trial. Id. This court affirmed\n\nthe district court\u2019s grant of habeas relief. Id. at 484. 
It reasoned that the exclusion of the expert\n\ntestimony\u2014an exclusion that infringed the defendant\u2019s weighty interest\u2014was both arbitrary and\n\ndisproportionate because lesser sanctions were available and the government was not prejudiced\n\nby the late production of the expert report. See id. at 478. Finally, the court held that the state\n\ncourt\u2019s error\u2014in light of all the circumstances of the case\u2014was not harmless. See id. at 480\u201384.\n\n\n -16-\n\fNo. 13-5414\nThomas v. Heidle\n\n Two aspects of the Ferensic court\u2019s analysis can be distinguished from the present case.\n\nFirst, the fact that the exclusion in Ferensic was arbitrary and disproportionate has little bearing\n\non whether the exclusion of Loftus\u2019s testimony was either arbitrary or disproportionate. The trial\n\ncourt in Ferensic excluded the evidence based on a mere discovery violation, while Loftus\u2019s\n\ntestimony was excluded for evidentiary reasons. Second, the harmless-error review in Ferensic\n\nnecessarily surveyed all of the other evidence presented in that case. The present case requires a\n\nsimilarly fact-intensive inquiry.\n\n Despite these differences, Ferensic binds this court as to the importance of the accused\u2019s\n\ninterest because Thomas\u2019s interest in having Loftus testify at trial was materially\n\nindistinguishable from the defendant\u2019s interest in having his expert testify in Ferensic.2 The\n\nidentity of the perpetrator was the central issue in both trials, see id. at 471, the prosecution in\n\nboth trials relied upon an eyewitness, see id. at 470, and both defendants sought the admission of\n\nexpert testimony on the impact of relevant, case-specific factors on an eyewitness\u2019s recollection,\n\nsee id. 
at 471\u201372.\n\n The Ferensic court held that the petitioner had a weighty interest in introducing expert\n\ntestimony on eyewitness identification within the meaning of clearly established law as\n\ndetermined by the Supreme Court. Id. at 478. The Ferensic panel explained: \u201c[E]yewitness\n\nmisidentification accounts for more false convictions in the United States than any other factor.\u201d\n\nId. The court also reasoned that \u201cthe current near-universal acceptance of the reliability of expert\n\ntestimony regarding eyewitness identification\u201d distinguishes the per se exclusion of this\n\ntestimony from the per se exclusion of polygraph evidence that the Supreme Court upheld in\n\n 2\n The Ferensic court explicitly limited its holding \u201cto the situation here where the record reflects the doubts\nof the jury itself as to the identification of the perpetrator.\u201d 501 F.3d at 484. But this appears to limit the holding\nonly as it applies to the harmless-error analysis because the jury\u2019s doubts are relevant to prejudice but not to the\nweight of the defendant\u2019s interest. Also, the court explained that it was limiting its holding in order to preserve\nconsistency with other holdings that were germane only as to harmlessness. See id. at 483\u201384.\n\n -17-\n\fNo. 13-5414\nThomas v. Heidle\n\nScheffer. Id. (citing Scheffer, 523 U.S. at 309). These factors made the admission of the expert\n\ntestimony a matter of fundamental importance to the defendant. 
Id.\n\n Given Thomas\u2019s weighty interest in the presentation of Loftus\u2019s testimony, we must next\n\nconsider whether the exclusion of that testimony was arbitrary and disproportionate in a\n\nconstitutional sense.\n\n\n\n B.\n\n The Supreme Court has made clear that, even when a state\u2019s evidentiary rules impinge on\n\na defendant\u2019s weighty interest, those rules are unconstitutional only if\u2014in the specific\n\ncircumstances of the case\u2014they are \u201carbitrary\u201d or \u201cdisproportionate to the purposes they are\n\ndesigned to serve.\u201d Rock, 483 U.S. at 56; see also Michigan v. Lucas, 500 U.S. 145, 151 (1991).\n\nIn considering whether the exclusion was arbitrary or disproportionate, the Court has considered\n\nthe potential prejudice that the State and the adversary system would suffer if the evidence were\n\nadmitted. See Taylor v. Illinois, 484 U.S. 400, 410\u201315 (1988).\n\n Here, the exclusion was not arbitrary or disproportionate in a constitutional sense, let\n\nalone a violation or unreasonable application of clearly established constitutional law. It was not\n\narbitrary or disproportionate for the court to reach the conclusion that testimony like Loftus\u2019s\n\nwould not \u201csubstantially assist the trier of fact.\u201d See Tenn. R. Evid. 702. Though the Tennessee\n\ncourts later reversed course on admitting expert evidence on eyewitness identification, this does\n\nnot mean that the trial court\u2019s exclusion of Loftus\u2019s testimony was arbitrary or disproportionate\n\nto the interests it was designed to serve. It was within the province of the trial court to determine\n\nthat admitting the evidence was likely to cause juror confusion, encroach upon the jury\u2019s role of\n\n\n\n\n -18-\n\fNo. 13-5414\nThomas v. Heidle\n\ndetermining witness credibility, and lend no significant help to the jury. 
We therefore hold that\n\nthere was no constitutional violation.3\n\n Even if Thomas could show a potential violation\u2014the kind of constitutional error that we\n\ncould correct on direct appeal\u2014this would not be sufficient to obtain relief under AEDPA\n\nbecause there is no violation of, or unreasonable application of, clearly established law as\n\ndetermined by the Supreme Court. See 28 U.S.C. \u00a7 2254(d). The Supreme Court has not\n\ndirectly spoken on the law applicable to the circumstances of this case. And we can grant relief\n\nonly if we conclude that the exclusion of Loftus\u2019s testimony in this particular case was \u201cso\n\nlacking in justification that there was an error well understood and comprehended in existing law\n\nbeyond any possibility for fairminded disagreement.\u201d Harrington, 131 S. Ct. at 786\u201387.\n\nPresent-day case law demonstrates that fair-minded jurists still disagree on the exclusion of\n\nexpert testimony on eyewitness identification, even when it is effectively excluded on a blanket\n\nbasis. The Eleventh Circuit has \u201c\u2018consistently looked unfavorably on\u2019 expert testimony about\n\neyewitness reliability and held that \u2018a district court does not abuse its discretion when it excludes\n\nexpert testimony on eyewitness identification.\u2019\u201d United States v. Owens, 445 F. App\u2019x 209, 216\n\n(11th Cir. 2011) (per curiam) (quoting United States v. Smith, 122 F.3d 1355, 1357, 1359 (11th\n\nCir. 1997)). Though that decision is based on the Federal Rules of Evidence rather than the\n\nConstitution, it would be inconsistent for the Eleventh Circuit to sustain its position on direct\n\nappeal while also viewing a per se exclusion rule as unconstitutional. 
Similarly, many states\u2014\n\nincluding several that maintain per se exclusions\u2014would resolve the question the same way\n\ntoday, suggesting that the Supreme Court\u2019s precedents have not led all fairminded jurists to\n\nadopt Thomas\u2019s preferred view. See, e.g., State v. Young, 35 So. 3d 1042, 1046\u201350 (La. 2010);\n\n 3\n We reached the same conclusion in Buell v. Mitchell, 274 F.3d 337, 359 (6th Cir. 2001), and Moore v.\nTate, 882 F.2d 1107, 1110 (6th Cir. 1989) (per curiam).\n\n -19-\n\fNo. 13-5414\nThomas v. Heidle\n\nState v. George, 645 N.W.2d 777, 790 (Neb. 2002); State v. Goldsby, 650 P.2d 952, 954 (Or. Ct.\n\nApp. 1982). See generally George Vallas, A Survey of Federal and State Standards for the\n\nAdmission of Expert Testimony on the Reliability of Eyewitnesses, 39 Am. J. Crim. L. 97, app. B\n\n(2011) (summarizing the various ways in which state courts address the admission of expert\n\ntestimony on eyewitness identification). The retention of this position by fairminded jurists is\n\ninstructive. And while we are not bound to follow this position, the Supreme Court has not\n\nannounced any clearly established law that would license us to grant habeas relief here even if\n\nwe detected the kind of error that would merit reversal on direct appeal.\n\n There was no violation of, or unreasonable application of, clearly established law as\n\ndetermined by the Supreme Court. AEDPA relief is unwarranted.\n\n C.\n\n Finally, even if we were convinced that Thomas\u2019s clearly established rights had been\n\nviolated, Thomas would not be entitled to relief because the claimed error was harmless.4 A\n\nconstitutional error is not harmless for the purposes of habeas review if the error \u201chad substantial\n\nand injurious effect or influence in determining the jury\u2019s verdict.\u201d Brecht, 507 U.S. at 623\n\n(quoting Kotteakos v. United States, 328 U.S. 750, 776 (1946)). 
To resolve this issue, the court\n\nshould not use a mere sufficiency inquiry. See Ferensic, 501 F.3d at 483 (citing Kotteakos, 328\n\nU.S. at 765). Rather, the relevant question is \u201c\u2018Do I, the judge, think that the error substantially\n\ninfluenced the jury\u2019s decision?\u2019\u201d Id. at 481 (quoting O\u2019Neal v. McAninch, 513 U.S. 432, 436\u201337\n\n(1995)). If a judge has \u201cgrave doubt\u201d about the potentially substantial and injurious effect of the\n\n 4\n Thomas argues that the warden waived her harmless-error argument by failing to raise it before the\ndistrict court. We disagree. The warden argued in the district court that \u201ca wealth of other evidence independent of\nthe [eyewitness] victim was presented to establish the petitioner\u2019s identity as the perpetrator of this offense.\u201d Even\nif we believed this was insufficient to preserve the argument, we would still exercise our discretion to reach the issue\nbecause it is plainly apparent from the record that the error almost certainly had no substantial influence on the\noutcome. See Gover v. Perry, 698 F.3d 295, 300 (6th Cir. 2012) (discussing the factors that we consider in deciding\nwhether to exercise our discretion and reach a harmless-error issue that has been waived).\n\n -20-\n\fNo. 13-5414\nThomas v. Heidle\n\nerror, the error is not harmless \u201c[a]nd the petitioner must win.\u201d O\u2019Neal, 513 U.S. at 436. In other\n\nwords, \u201c[u]ncertainty in answering this question . . . militates in favor of the habeas petitioner.\u201d\n\nFerensic, 501 F.3d at 481.\n\n Even if we believed that the state trial court\u2019s exclusion of Loftus\u2019s testimony was\n\nerroneous, we would not be left with grave doubt. 
Assuming that the trial court had permitted\n\nLoftus to testify\u2014and even assuming further that her testimony had led the jury to believe that\n\ndeficiencies existed in Yvonne\u2019s testimony\u2014a substantial volume of additional evidence still\n\nindicated Thomas\u2019s guilt. Ernest Salyer admitted helping Thomas hide the gun. Joyce Salyer\n\nand Goldie Storm corroborated his account. Joyce further testified that she washed blood from\n\nThomas\u2019s clothes on the morning of March 23. There was evidence that Thomas gave Sammie\n\nSalyer cash with red marks on it. According to Ernest, Thomas admitted that he \u201ckilled\n\nsomebody,\u201d and gave further information about the crime. Loretta Strange also recounted first-\n\nhand details that Thomas told her about John Cook\u2019s murder. In its closing argument, the\n\nprosecution relied heavily on this entire web of incriminating evidence. It pointed to the\n\neyewitness identification as just one of several pieces of strong, materially consistent evidence\n\nindicating Thomas\u2019s guilt.\n\n Thomas argues that we should have grave doubt in this case, just as the Ferensic panel\u2019s\n\ngrave doubt led it to grant the petitioner relief. But Ferensic is plainly distinguishable. There,\n\n\u201c[t]he entirety of the evidence against [the defendant] was based upon eyewitness identifications\n\nmade by the victimized couple.\u201d Id. at 470; see also id. at 481\u201384 (applying the harmless-error\n\ntest). The court also placed great weight on the fact that a note from the jury to the trial judge\n\nduring deliberations indicated that the jury had doubts about the strength of the case against the\n\ndefendant and, in particular, questioned evidence related to the eyewitness identification. Id. at\n\n\n -21-\n\fNo. 13-5414\nThomas v. Heidle\n\n483\u201384. 
Indeed, the panel explicitly limited its holding \u201cto the situation here where the record\n\nreflects the doubts of the jury itself as to the identification of the perpetrator.\u201d Id. at 484.\n\n Here, viewing all of the evidence together, we do not believe that the exclusion of the\n\nexpert testimony had a substantial and injurious effect on the jury\u2019s verdict. We therefore\n\nbelieve that the exclusion was harmless and would not merit relief even if we believed it was\n\nerroneous.\n\n V.\n\n For the foregoing reasons, we affirm the district court\u2019s denial of habeas relief.\n\n\n\n\n -22-\n\f"]]
\ No newline at end of file
{"results": {"pile_freelaw": {"word_perplexity": 1.0002155059410578, "byte_perplexity": 1.0000330805725068, "bits_per_byte": 3.3080025356816895e-05}}, "versions": {"pile_freelaw": 0}} {"results": {"pile_freelaw": {"bits_per_byte": 3.16238943008513e-05, "byte_perplexity": 1.0000316243943415, "word_perplexity": 1.000203169094218}}, "versions": {"pile_freelaw": 0}}
\ No newline at end of file
df384c3df3d8f53273e97127c5bb84c17e638acad7d6bc9c91f6dee96d43b639
\ No newline at end of file
[["<!DOCTYPE html>\n<html>\n<head>\n\t<meta charset=\"utf-8\">\n\t<link rel=\"shortcut icon\" type=\"image/ico\" href=\"http://www.datatables.net/favicon.ico\">\n\t<meta name=\"viewport\" content=\"initial-scale=1.0, maximum-scale=2.0\">\n\n\t<title>ColVis example - Custom button text</title>\n\t<link rel=\"stylesheet\" type=\"text/css\" href=\"../../../media/css/jquery.dataTables.css\">\n\t<link rel=\"stylesheet\" type=\"text/css\" href=\"../css/dataTables.colVis.css\">\n\t<link rel=\"stylesheet\" type=\"text/css\" href=\"../../../examples/resources/syntax/shCore.css\">\n\t<link rel=\"stylesheet\" type=\"text/css\" href=\"../../../examples/resources/demo.css\">\n\t<style type=\"text/css\" class=\"init\">\n\n\t</style>\n\t<script type=\"text/javascript\" language=\"javascript\" src=\"../../../media/js/jquery.js\"></script>\n\t<script type=\"text/javascript\" language=\"javascript\" src=\"../../../media/js/jquery.dataTables.js\"></script>\n\t<script type=\"text/javascript\" language=\"javascript\" src=\"../js/dataTables.colVis.js\"></script>\n\t<script type=\"text/javascript\" language=\"javascript\" src=\"../../../examples/resources/syntax/shCore.js\"></script>\n\t<script type=\"text/javascript\" language=\"javascript\" src=\"../../../examples/resources/demo.js\"></script>\n\t<script type=\"text/javascript\" language=\"javascript\" class=\"init\">\n\n\n$(document).ready(function() {\n\t$('#example').DataTable( {\n\t\t\"dom\": 'C<\"clear\">lfrtip',\n\t\t\"colVis\": {\n\t\t\t\"buttonText\": \"Change columns\"\n\t\t}\n\t} );\n} );\n\n\n\t</script>\n</head>\n\n<body class=\"dt-example\">\n\t<div class=\"container\">\n\t\t<section>\n\t\t\t<h1>ColVis example <span>Custom button text</span></h1>\n\n\t\t\t<div class=\"info\">\n\t\t\t\t<p>You may wish to use your own text in the ColVis button - this is done by making use of the <code>buttonText</code> initialisation option, as shown in this\n\t\t\t\texample.</p>\n\n\t\t\t\t<p>For full information about the ColVis options, 
please refer to the <a href=\"//datatables.net/extensions/colvis/options\">ColVis options documentation</a>.</p>\n\t\t\t</div>\n\n\t\t\t<table id=\"example\" class=\"display\" cellspacing=\"0\" width=\"100%\">\n\t\t\t\t<thead>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<th>Name</th>\n\t\t\t\t\t\t<th>Position</th>\n\t\t\t\t\t\t<th>Office</th>\n\t\t\t\t\t\t<th>Age</th>\n\t\t\t\t\t\t<th>Start date</th>\n\t\t\t\t\t\t<th>Salary</th>\n\t\t\t\t\t</tr>\n\t\t\t\t</thead>\n\n\t\t\t\t<tfoot>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<th>Name</th>\n\t\t\t\t\t\t<th>Position</th>\n\t\t\t\t\t\t<th>Office</th>\n\t\t\t\t\t\t<th>Age</th>\n\t\t\t\t\t\t<th>Start date</th>\n\t\t\t\t\t\t<th>Salary</th>\n\t\t\t\t\t</tr>\n\t\t\t\t</tfoot>\n\n\t\t\t\t<tbody>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Tiger Nixon</td>\n\t\t\t\t\t\t<td>System Architect</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>61</td>\n\t\t\t\t\t\t<td>2011/04/25</td>\n\t\t\t\t\t\t<td>$320,800</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Garrett Winters</td>\n\t\t\t\t\t\t<td>Accountant</td>\n\t\t\t\t\t\t<td>Tokyo</td>\n\t\t\t\t\t\t<td>63</td>\n\t\t\t\t\t\t<td>2011/07/25</td>\n\t\t\t\t\t\t<td>$170,750</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Ashton Cox</td>\n\t\t\t\t\t\t<td>Junior Technical Author</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>66</td>\n\t\t\t\t\t\t<td>2009/01/12</td>\n\t\t\t\t\t\t<td>$86,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Cedric Kelly</td>\n\t\t\t\t\t\t<td>Senior Javascript Developer</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>22</td>\n\t\t\t\t\t\t<td>2012/03/29</td>\n\t\t\t\t\t\t<td>$433,060</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Airi Satou</td>\n\t\t\t\t\t\t<td>Accountant</td>\n\t\t\t\t\t\t<td>Tokyo</td>\n\t\t\t\t\t\t<td>33</td>\n\t\t\t\t\t\t<td>2008/11/28</td>\n\t\t\t\t\t\t<td>$162,700</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Brielle Williamson</td>\n\t\t\t\t\t\t<td>Integration Specialist</td>\n\t\t\t\t\t\t<td>New 
York</td>\n\t\t\t\t\t\t<td>61</td>\n\t\t\t\t\t\t<td>2012/12/02</td>\n\t\t\t\t\t\t<td>$372,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Herrod Chandler</td>\n\t\t\t\t\t\t<td>Sales Assistant</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>59</td>\n\t\t\t\t\t\t<td>2012/08/06</td>\n\t\t\t\t\t\t<td>$137,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Rhona Davidson</td>\n\t\t\t\t\t\t<td>Integration Specialist</td>\n\t\t\t\t\t\t<td>Tokyo</td>\n\t\t\t\t\t\t<td>55</td>\n\t\t\t\t\t\t<td>2010/10/14</td>\n\t\t\t\t\t\t<td>$327,900</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Colleen Hurst</td>\n\t\t\t\t\t\t<td>Javascript Developer</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>39</td>\n\t\t\t\t\t\t<td>2009/09/15</td>\n\t\t\t\t\t\t<td>$205,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Sonya Frost</td>\n\t\t\t\t\t\t<td>Software Engineer</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>23</td>\n\t\t\t\t\t\t<td>2008/12/13</td>\n\t\t\t\t\t\t<td>$103,600</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Jena Gaines</td>\n\t\t\t\t\t\t<td>Office Manager</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>30</td>\n\t\t\t\t\t\t<td>2008/12/19</td>\n\t\t\t\t\t\t<td>$90,560</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Quinn Flynn</td>\n\t\t\t\t\t\t<td>Support Lead</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>22</td>\n\t\t\t\t\t\t<td>2013/03/03</td>\n\t\t\t\t\t\t<td>$342,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Charde Marshall</td>\n\t\t\t\t\t\t<td>Regional Director</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>36</td>\n\t\t\t\t\t\t<td>2008/10/16</td>\n\t\t\t\t\t\t<td>$470,600</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Haley Kennedy</td>\n\t\t\t\t\t\t<td>Senior Marketing 
Designer</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>43</td>\n\t\t\t\t\t\t<td>2012/12/18</td>\n\t\t\t\t\t\t<td>$313,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Tatyana Fitzpatrick</td>\n\t\t\t\t\t\t<td>Regional Director</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>19</td>\n\t\t\t\t\t\t<td>2010/03/17</td>\n\t\t\t\t\t\t<td>$385,750</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Michael Silva</td>\n\t\t\t\t\t\t<td>Marketing Designer</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>66</td>\n\t\t\t\t\t\t<td>2012/11/27</td>\n\t\t\t\t\t\t<td>$198,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Paul Byrd</td>\n\t\t\t\t\t\t<td>Chief Financial Officer (CFO)</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>64</td>\n\t\t\t\t\t\t<td>2010/06/09</td>\n\t\t\t\t\t\t<td>$725,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Gloria Little</td>\n\t\t\t\t\t\t<td>Systems Administrator</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>59</td>\n\t\t\t\t\t\t<td>2009/04/10</td>\n\t\t\t\t\t\t<td>$237,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Bradley Greer</td>\n\t\t\t\t\t\t<td>Software Engineer</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>41</td>\n\t\t\t\t\t\t<td>2012/10/13</td>\n\t\t\t\t\t\t<td>$132,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Dai Rios</td>\n\t\t\t\t\t\t<td>Personnel Lead</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>35</td>\n\t\t\t\t\t\t<td>2012/09/26</td>\n\t\t\t\t\t\t<td>$217,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Jenette Caldwell</td>\n\t\t\t\t\t\t<td>Development Lead</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>30</td>\n\t\t\t\t\t\t<td>2011/09/03</td>\n\t\t\t\t\t\t<td>$345,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Yuri Berry</td>\n\t\t\t\t\t\t<td>Chief Marketing Officer (CMO)</td>\n\t\t\t\t\t\t<td>New 
York</td>\n\t\t\t\t\t\t<td>40</td>\n\t\t\t\t\t\t<td>2009/06/25</td>\n\t\t\t\t\t\t<td>$675,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Caesar Vance</td>\n\t\t\t\t\t\t<td>Pre-Sales Support</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>21</td>\n\t\t\t\t\t\t<td>2011/12/12</td>\n\t\t\t\t\t\t<td>$106,450</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Doris Wilder</td>\n\t\t\t\t\t\t<td>Sales Assistant</td>\n\t\t\t\t\t\t<td>Sidney</td>\n\t\t\t\t\t\t<td>23</td>\n\t\t\t\t\t\t<td>2010/09/20</td>\n\t\t\t\t\t\t<td>$85,600</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Angelica Ramos</td>\n\t\t\t\t\t\t<td>Chief Executive Officer (CEO)</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>47</td>\n\t\t\t\t\t\t<td>2009/10/09</td>\n\t\t\t\t\t\t<td>$1,200,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Gavin Joyce</td>\n\t\t\t\t\t\t<td>Developer</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>42</td>\n\t\t\t\t\t\t<td>2010/12/22</td>\n\t\t\t\t\t\t<td>$92,575</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Jennifer Chang</td>\n\t\t\t\t\t\t<td>Regional Director</td>\n\t\t\t\t\t\t<td>Singapore</td>\n\t\t\t\t\t\t<td>28</td>\n\t\t\t\t\t\t<td>2010/11/14</td>\n\t\t\t\t\t\t<td>$357,650</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Brenden Wagner</td>\n\t\t\t\t\t\t<td>Software Engineer</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>28</td>\n\t\t\t\t\t\t<td>2011/06/07</td>\n\t\t\t\t\t\t<td>$206,850</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Fiona Green</td>\n\t\t\t\t\t\t<td>Chief Operating Officer (COO)</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>48</td>\n\t\t\t\t\t\t<td>2010/03/11</td>\n\t\t\t\t\t\t<td>$850,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Shou Itou</td>\n\t\t\t\t\t\t<td>Regional 
Marketing</td>\n\t\t\t\t\t\t<td>Tokyo</td>\n\t\t\t\t\t\t<td>20</td>\n\t\t\t\t\t\t<td>2011/08/14</td>\n\t\t\t\t\t\t<td>$163,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Michelle House</td>\n\t\t\t\t\t\t<td>Integration Specialist</td>\n\t\t\t\t\t\t<td>Sidney</td>\n\t\t\t\t\t\t<td>37</td>\n\t\t\t\t\t\t<td>2011/06/02</td>\n\t\t\t\t\t\t<td>$95,400</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Suki Burks</td>\n\t\t\t\t\t\t<td>Developer</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>53</td>\n\t\t\t\t\t\t<td>2009/10/22</td>\n\t\t\t\t\t\t<td>$114,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Prescott Bartlett</td>\n\t\t\t\t\t\t<td>Technical Author</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>27</td>\n\t\t\t\t\t\t<td>2011/05/07</td>\n\t\t\t\t\t\t<td>$145,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Gavin Cortez</td>\n\t\t\t\t\t\t<td>Team Leader</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>22</td>\n\t\t\t\t\t\t<td>2008/10/26</td>\n\t\t\t\t\t\t<td>$235,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Martena Mccray</td>\n\t\t\t\t\t\t<td>Post-Sales support</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>46</td>\n\t\t\t\t\t\t<td>2011/03/09</td>\n\t\t\t\t\t\t<td>$324,050</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Unity Butler</td>\n\t\t\t\t\t\t<td>Marketing Designer</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>47</td>\n\t\t\t\t\t\t<td>2009/12/09</td>\n\t\t\t\t\t\t<td>$85,675</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Howard Hatfield</td>\n\t\t\t\t\t\t<td>Office Manager</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>51</td>\n\t\t\t\t\t\t<td>2008/12/16</td>\n\t\t\t\t\t\t<td>$164,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Hope Fuentes</td>\n\t\t\t\t\t\t<td>Secretary</td>\n\t\t\t\t\t\t<td>San 
Francisco</td>\n\t\t\t\t\t\t<td>41</td>\n\t\t\t\t\t\t<td>2010/02/12</td>\n\t\t\t\t\t\t<td>$109,850</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Vivian Harrell</td>\n\t\t\t\t\t\t<td>Financial Controller</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>62</td>\n\t\t\t\t\t\t<td>2009/02/14</td>\n\t\t\t\t\t\t<td>$452,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Timothy Mooney</td>\n\t\t\t\t\t\t<td>Office Manager</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>37</td>\n\t\t\t\t\t\t<td>2008/12/11</td>\n\t\t\t\t\t\t<td>$136,200</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Jackson Bradshaw</td>\n\t\t\t\t\t\t<td>Director</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>65</td>\n\t\t\t\t\t\t<td>2008/09/26</td>\n\t\t\t\t\t\t<td>$645,750</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Olivia Liang</td>\n\t\t\t\t\t\t<td>Support Engineer</td>\n\t\t\t\t\t\t<td>Singapore</td>\n\t\t\t\t\t\t<td>64</td>\n\t\t\t\t\t\t<td>2011/02/03</td>\n\t\t\t\t\t\t<td>$234,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Bruno Nash</td>\n\t\t\t\t\t\t<td>Software Engineer</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>38</td>\n\t\t\t\t\t\t<td>2011/05/03</td>\n\t\t\t\t\t\t<td>$163,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Sakura Yamamoto</td>\n\t\t\t\t\t\t<td>Support Engineer</td>\n\t\t\t\t\t\t<td>Tokyo</td>\n\t\t\t\t\t\t<td>37</td>\n\t\t\t\t\t\t<td>2009/08/19</td>\n\t\t\t\t\t\t<td>$139,575</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Thor Walton</td>\n\t\t\t\t\t\t<td>Developer</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>61</td>\n\t\t\t\t\t\t<td>2013/08/11</td>\n\t\t\t\t\t\t<td>$98,540</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Finn Camacho</td>\n\t\t\t\t\t\t<td>Support Engineer</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>47</td>\n\t\t\t\t\t\t<td>2009/07/07</td>\n\t\t\t\t\t\t<td>$87,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Serge 
Baldwin</td>\n\t\t\t\t\t\t<td>Data Coordinator</td>\n\t\t\t\t\t\t<td>Singapore</td>\n\t\t\t\t\t\t<td>64</td>\n\t\t\t\t\t\t<td>2012/04/09</td>\n\t\t\t\t\t\t<td>$138,575</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Zenaida Frank</td>\n\t\t\t\t\t\t<td>Software Engineer</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>63</td>\n\t\t\t\t\t\t<td>2010/01/04</td>\n\t\t\t\t\t\t<td>$125,250</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Zorita Serrano</td>\n\t\t\t\t\t\t<td>Software Engineer</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>56</td>\n\t\t\t\t\t\t<td>2012/06/01</td>\n\t\t\t\t\t\t<td>$115,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Jennifer Acosta</td>\n\t\t\t\t\t\t<td>Junior Javascript Developer</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>43</td>\n\t\t\t\t\t\t<td>2013/02/01</td>\n\t\t\t\t\t\t<td>$75,650</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Cara Stevens</td>\n\t\t\t\t\t\t<td>Sales Assistant</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>46</td>\n\t\t\t\t\t\t<td>2011/12/06</td>\n\t\t\t\t\t\t<td>$145,600</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Hermione Butler</td>\n\t\t\t\t\t\t<td>Regional Director</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>47</td>\n\t\t\t\t\t\t<td>2011/03/21</td>\n\t\t\t\t\t\t<td>$356,250</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Lael Greer</td>\n\t\t\t\t\t\t<td>Systems Administrator</td>\n\t\t\t\t\t\t<td>London</td>\n\t\t\t\t\t\t<td>21</td>\n\t\t\t\t\t\t<td>2009/02/27</td>\n\t\t\t\t\t\t<td>$103,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Jonas Alexander</td>\n\t\t\t\t\t\t<td>Developer</td>\n\t\t\t\t\t\t<td>San Francisco</td>\n\t\t\t\t\t\t<td>30</td>\n\t\t\t\t\t\t<td>2010/07/14</td>\n\t\t\t\t\t\t<td>$86,500</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Shad Decker</td>\n\t\t\t\t\t\t<td>Regional 
Director</td>\n\t\t\t\t\t\t<td>Edinburgh</td>\n\t\t\t\t\t\t<td>51</td>\n\t\t\t\t\t\t<td>2008/11/13</td>\n\t\t\t\t\t\t<td>$183,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Michael Bruce</td>\n\t\t\t\t\t\t<td>Javascript Developer</td>\n\t\t\t\t\t\t<td>Singapore</td>\n\t\t\t\t\t\t<td>29</td>\n\t\t\t\t\t\t<td>2011/06/27</td>\n\t\t\t\t\t\t<td>$183,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t\t<tr>\n\t\t\t\t\t\t<td>Donna Snider</td>\n\t\t\t\t\t\t<td>Customer Support</td>\n\t\t\t\t\t\t<td>New York</td>\n\t\t\t\t\t\t<td>27</td>\n\t\t\t\t\t\t<td>2011/01/25</td>\n\t\t\t\t\t\t<td>$112,000</td>\n\t\t\t\t\t</tr>\n\t\t\t\t</tbody>\n\t\t\t</table>\n\n\t\t\t<ul class=\"tabs\">\n\t\t\t\t<li class=\"active\">Javascript</li>\n\t\t\t\t<li>HTML</li>\n\t\t\t\t<li>CSS</li>\n\t\t\t\t<li>Ajax</li>\n\t\t\t\t<li>Server-side script</li>\n\t\t\t</ul>\n\n\t\t\t<div class=\"tabs\">\n\t\t\t\t<div class=\"js\">\n\t\t\t\t\t<p>The Javascript shown below is used to initialise the table shown in this example:</p><code class=\"multiline language-js\">$(document).ready(function() {\n\t$('#example').DataTable( {\n\t\t&quot;dom&quot;: 'C&lt;&quot;clear&quot;&gt;lfrtip',\n\t\t&quot;colVis&quot;: {\n\t\t\t&quot;buttonText&quot;: &quot;Change columns&quot;\n\t\t}\n\t} );\n} );</code>\n\n\t\t\t\t\t<p>In addition to the above code, the following Javascript library files are loaded for use in this example:</p>\n\n\t\t\t\t\t<ul>\n\t\t\t\t\t\t<li><a href=\"../../../media/js/jquery.js\">../../../media/js/jquery.js</a></li>\n\t\t\t\t\t\t<li><a href=\"../../../media/js/jquery.dataTables.js\">../../../media/js/jquery.dataTables.js</a></li>\n\t\t\t\t\t\t<li><a href=\"../js/dataTables.colVis.js\">../js/dataTables.colVis.js</a></li>\n\t\t\t\t\t</ul>\n\t\t\t\t</div>\n\n\t\t\t\t<div class=\"table\">\n\t\t\t\t\t<p>The HTML shown below is the raw HTML table element, before it has been enhanced by DataTables:</p>\n\t\t\t\t</div>\n\n\t\t\t\t<div class=\"css\">\n\t\t\t\t\t<div>\n\t\t\t\t\t\t<p>This example uses a little 
bit of additional CSS beyond what is loaded from the library files (below), in order to correctly display the table. The\n\t\t\t\t\t\tadditional CSS used is shown below:</p><code class=\"multiline language-css\"></code>\n\t\t\t\t\t</div>\n\n\t\t\t\t\t<p>The following CSS library files are loaded for use in this example to provide the styling of the table:</p>\n\n\t\t\t\t\t<ul>\n\t\t\t\t\t\t<li><a href=\"../../../media/css/jquery.dataTables.css\">../../../media/css/jquery.dataTables.css</a></li>\n\t\t\t\t\t\t<li><a href=\"../css/dataTables.colVis.css\">../css/dataTables.colVis.css</a></li>\n\t\t\t\t\t</ul>\n\t\t\t\t</div>\n\n\t\t\t\t<div class=\"ajax\">\n\t\t\t\t\t<p>This table loads data by Ajax. The latest data that has been loaded is shown below. This data will update automatically as any additional data is\n\t\t\t\t\tloaded.</p>\n\t\t\t\t</div>\n\n\t\t\t\t<div class=\"php\">\n\t\t\t\t\t<p>The script used to perform the server-side processing for this table is shown below. Please note that this is just an example script using PHP. 
Server-side\n\t\t\t\t\tprocessing scripts can be written in any language, using <a href=\"//datatables.net/manual/server-side\">the protocol described in the DataTables\n\t\t\t\t\tdocumentation</a>.</p>\n\t\t\t\t</div>\n\t\t\t</div>\n\t\t</section>\n\t</div>\n\n\t<section>\n\t\t<div class=\"footer\">\n\t\t\t<div class=\"gradient\"></div>\n\n\t\t\t<div class=\"liner\">\n\t\t\t\t<h2>Other examples</h2>\n\n\t\t\t\t<div class=\"toc\">\n\t\t\t\t\t<div class=\"toc-group\">\n\t\t\t\t\t\t<h3><a href=\"./index.html\">Examples</a></h3>\n\t\t\t\t\t\t<ul class=\"toc active\">\n\t\t\t\t\t\t\t<li><a href=\"./simple.html\">Basic initialisation</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./new_init.html\">`new` initialisation</a></li>\n\t\t\t\t\t\t\t<li class=\"active\"><a href=\"./text.html\">Custom button text</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./exclude_columns.html\">Exclude columns from list</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./title_callback.html\">Column button callback</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./button_order.html\">Button ordering</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./mouseover.html\">Mouseover activation</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./group_columns.html\">Group columns</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./two_tables.html\">Two tables with individual controls</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./two_tables_identical.html\">Two tables with shared controls</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./restore.html\">Restore / show all</a></li>\n\t\t\t\t\t\t\t<li><a href=\"./jqueryui.html\">jQuery UI styling</a></li>\n\t\t\t\t\t\t</ul>\n\t\t\t\t\t</div>\n\t\t\t\t</div>\n\n\t\t\t\t<div class=\"epilogue\">\n\t\t\t\t\t<p>Please refer to the <a href=\"http://www.datatables.net\">DataTables documentation</a> for full information about its API properties and methods.<br>\n\t\t\t\t\tAdditionally, there are a wide range of <a href=\"http://www.datatables.net/extras\">extras</a> and <a href=\"http://www.datatables.net/plug-ins\">plug-ins</a>\n\t\t\t\t\twhich 
extend the capabilities of DataTables.</p>\n\n\t\t\t\t\t<p class=\"copyright\">DataTables designed and created by <a href=\"http://www.sprymedia.co.uk\">SpryMedia Ltd</a> &#169; 2007-2015<br>\n\t\t\t\t\tDataTables is licensed under the <a href=\"http://www.datatables.net/mit\">MIT license</a>.</p>\n\t\t\t\t</div>\n\t\t\t</div>\n\t\t</div>\n\t</section>\n</body>\n</html>"], ["framework 'Cocoa'\n\nclass CustomView < NSView\n\n def drawRect(rect)\n NSColor.whiteColor.set\n NSBezierPath.fillRect(rect)\n img_url = NSURL.URLWithString('http://bit.ly/apple_logo_png')\n img = NSImage.alloc.initWithContentsOfURL(img_url)\n img.drawAtPoint([0,0], fromRect: NSZeroRect, operation: NSCompositeSourceOver, fraction: 1)\n end\n \nend\n\napplication = NSApplication.sharedApplication\n\n# create the window\nframe = [100, 100, 152, 186]\nmask = NSTitledWindowMask | NSClosableWindowMask\nwindow = NSWindow.alloc.initWithContentRect(frame,\n styleMask:mask,\n backing:NSBackingStoreBuffered,\n defer:false)\n\n# assign a content view instance\ncontent_view = CustomView.alloc.initWithFrame(frame)\nwindow.contentView = content_view\n\n# show the window\nwindow.display\nwindow.makeKeyAndOrderFront(nil)\nwindow.orderFrontRegardless\n\napplication.run"], [".TH trick-killsim 1 \"August 1, 2016\" \"Trick\" \"Trick User's Manual\"\n.SH NAME\ntrick-killsim \\- Kill Trick simulations\n.SH SYNOPSIS\n\\fBtrick-killsim\\fP\n.SH DESCRIPTION\n\\fBtrick-killsim\\fP is a Bourne shell script which kills all Trick simulation processes for\na user.\n.SH \"SEE ALSO\"\nAll Trick model developers and users should go through the tutorial found\nin the \\fITrick Simulation Environment User Training Materials\\fP.\nThe canonical reference for all Trick commands, files and utilities is the\n\\fITrick Simulation Environment User's Guide\\fP. 
Information specific to a\ngiven release of Trick tools is contained in the \\fITrick Simulation\nEnvironment Version Description\\fP for that release.\n.SH HISTORY\n1997-present : \\fBtrick-killsim\\fP was written by Greg Alexander\n\n"], ["/*\n * Copyright (C) 2015, Bin Meng <bmeng.cn@gmail.com>\n *\n * SPDX-License-Identifier:\tGPL-2.0+\n */\n\n#include <common.h>\n#include <dm.h>\n#include <errno.h>\n#include <fdtdec.h>\n#include <malloc.h>\n#include <asm/io.h>\n#include <asm/irq.h>\n#include <asm/pci.h>\n#include <asm/pirq_routing.h>\n#include <asm/tables.h>\n\nDECLARE_GLOBAL_DATA_PTR;\n\nbool pirq_check_irq_routed(struct udevice *dev, int link, u8 irq)\n{\n\tstruct irq_router *priv = dev_get_priv(dev);\n\tu8 pirq;\n\tint base = priv->link_base;\n\n\tif (priv->config == PIRQ_VIA_PCI)\n\t\tdm_pci_read_config8(dev->parent, LINK_N2V(link, base), &pirq);\n\telse\n\t\tpirq = readb((uintptr_t)priv->ibase + LINK_N2V(link, base));\n\n\tpirq &= 0xf;\n\n\t/* IRQ# 0/1/2/8/13 are reserved */\n\tif (pirq < 3 || pirq == 8 || pirq == 13)\n\t\treturn false;\n\n\treturn pirq == irq ? 
true : false;\n}\n\nint pirq_translate_link(struct udevice *dev, int link)\n{\n\tstruct irq_router *priv = dev_get_priv(dev);\n\n\treturn LINK_V2N(link, priv->link_base);\n}\n\nvoid pirq_assign_irq(struct udevice *dev, int link, u8 irq)\n{\n\tstruct irq_router *priv = dev_get_priv(dev);\n\tint base = priv->link_base;\n\n\t/* IRQ# 0/1/2/8/13 are reserved */\n\tif (irq < 3 || irq == 8 || irq == 13)\n\t\treturn;\n\n\tif (priv->config == PIRQ_VIA_PCI)\n\t\tdm_pci_write_config8(dev->parent, LINK_N2V(link, base), irq);\n\telse\n\t\twriteb(irq, (uintptr_t)priv->ibase + LINK_N2V(link, base));\n}\n\nstatic struct irq_info *check_dup_entry(struct irq_info *slot_base,\n\t\t\t\t\tint entry_num, int bus, int device)\n{\n\tstruct irq_info *slot = slot_base;\n\tint i;\n\n\tfor (i = 0; i < entry_num; i++) {\n\t\tif (slot->bus == bus && slot->devfn == (device << 3))\n\t\t\tbreak;\n\t\tslot++;\n\t}\n\n\treturn (i == entry_num) ? NULL : slot;\n}\n\nstatic inline void fill_irq_info(struct irq_router *priv, struct irq_info *slot,\n\t\t\t\t int bus, int device, int pin, int pirq)\n{\n\tslot->bus = bus;\n\tslot->devfn = (device << 3) | 0;\n\tslot->irq[pin - 1].link = LINK_N2V(pirq, priv->link_base);\n\tslot->irq[pin - 1].bitmap = priv->irq_mask;\n}\n\nstatic int create_pirq_routing_table(struct udevice *dev)\n{\n\tstruct irq_router *priv = dev_get_priv(dev);\n\tconst void *blob = gd->fdt_blob;\n\tint node;\n\tint len, count;\n\tconst u32 *cell;\n\tstruct irq_routing_table *rt;\n\tstruct irq_info *slot, *slot_base;\n\tint irq_entries = 0;\n\tint i;\n\tint ret;\n\n\tnode = dev_of_offset(dev);\n\n\t/* extract the bdf from fdt_pci_addr */\n\tpriv->bdf = dm_pci_get_bdf(dev->parent);\n\n\tret = fdt_stringlist_search(blob, node, \"intel,pirq-config\", \"pci\");\n\tif (!ret) {\n\t\tpriv->config = PIRQ_VIA_PCI;\n\t} else {\n\t\tret = fdt_stringlist_search(blob, node, \"intel,pirq-config\",\n\t\t\t\t\t \"ibase\");\n\t\tif (!ret)\n\t\t\tpriv->config = PIRQ_VIA_IBASE;\n\t\telse\n\t\t\treturn 
-EINVAL;\n\t}\n\n\tret = fdtdec_get_int(blob, node, \"intel,pirq-link\", -1);\n\tif (ret == -1)\n\t\treturn ret;\n\tpriv->link_base = ret;\n\n\tpriv->irq_mask = fdtdec_get_int(blob, node,\n\t\t\t\t\t\"intel,pirq-mask\", PIRQ_BITMAP);\n\n\tif (IS_ENABLED(CONFIG_GENERATE_ACPI_TABLE)) {\n\t\t/* Reserve IRQ9 for SCI */\n\t\tpriv->irq_mask &= ~(1 << 9);\n\t}\n\n\tif (priv->config == PIRQ_VIA_IBASE) {\n\t\tint ibase_off;\n\n\t\tibase_off = fdtdec_get_int(blob, node, \"intel,ibase-offset\", 0);\n\t\tif (!ibase_off)\n\t\t\treturn -EINVAL;\n\n\t\t/*\n\t\t * Here we assume that the IBASE register has already been\n\t\t * properly configured by U-Boot before.\n\t\t *\n\t\t * By 'valid' we mean:\n\t\t * 1) a valid memory space carved within system memory space\n\t\t * assigned to IBASE register block.\n\t\t * 2) memory range decoding is enabled.\n\t\t * Hence we don't do any sanity test here.\n\t\t */\n\t\tdm_pci_read_config32(dev->parent, ibase_off, &priv->ibase);\n\t\tpriv->ibase &= ~0xf;\n\t}\n\n\tpriv->actl_8bit = fdtdec_get_bool(blob, node, \"intel,actl-8bit\");\n\tpriv->actl_addr = fdtdec_get_int(blob, node, \"intel,actl-addr\", 0);\n\n\tcell = fdt_getprop(blob, node, \"intel,pirq-routing\", &len);\n\tif (!cell || len % sizeof(struct pirq_routing))\n\t\treturn -EINVAL;\n\tcount = len / sizeof(struct pirq_routing);\n\n\trt = calloc(1, sizeof(struct irq_routing_table));\n\tif (!rt)\n\t\treturn -ENOMEM;\n\n\t/* Populate the PIRQ table fields */\n\trt->signature = PIRQ_SIGNATURE;\n\trt->version = PIRQ_VERSION;\n\trt->rtr_bus = PCI_BUS(priv->bdf);\n\trt->rtr_devfn = (PCI_DEV(priv->bdf) << 3) | PCI_FUNC(priv->bdf);\n\trt->rtr_vendor = PCI_VENDOR_ID_INTEL;\n\trt->rtr_device = PCI_DEVICE_ID_INTEL_ICH7_31;\n\n\tslot_base = rt->slots;\n\n\t/* Now fill in the irq_info entries in the PIRQ table */\n\tfor (i = 0; i < count;\n\t i++, cell += sizeof(struct pirq_routing) / sizeof(u32)) {\n\t\tstruct pirq_routing pr;\n\n\t\tpr.bdf = fdt_addr_to_cpu(cell[0]);\n\t\tpr.pin = 
fdt_addr_to_cpu(cell[1]);\n\t\tpr.pirq = fdt_addr_to_cpu(cell[2]);\n\n\t\tdebug(\"irq_info %d: b.d.f %x.%x.%x INT%c PIRQ%c\\n\",\n\t\t i, PCI_BUS(pr.bdf), PCI_DEV(pr.bdf),\n\t\t PCI_FUNC(pr.bdf), 'A' + pr.pin - 1,\n\t\t 'A' + pr.pirq);\n\n\t\tslot = check_dup_entry(slot_base, irq_entries,\n\t\t\t\t PCI_BUS(pr.bdf), PCI_DEV(pr.bdf));\n\t\tif (slot) {\n\t\t\tdebug(\"found entry for bus %d device %d, \",\n\t\t\t PCI_BUS(pr.bdf), PCI_DEV(pr.bdf));\n\n\t\t\tif (slot->irq[pr.pin - 1].link) {\n\t\t\t\tdebug(\"skipping\\n\");\n\n\t\t\t\t/*\n\t\t\t\t * Sanity test on the routed PIRQ pin\n\t\t\t\t *\n\t\t\t\t * If they don't match, show a warning to tell\n\t\t\t\t * there might be something wrong with the PIRQ\n\t\t\t\t * routing information in the device tree.\n\t\t\t\t */\n\t\t\t\tif (slot->irq[pr.pin - 1].link !=\n\t\t\t\t\tLINK_N2V(pr.pirq, priv->link_base))\n\t\t\t\t\tdebug(\"WARNING: Inconsistent PIRQ routing information\\n\");\n\t\t\t\tcontinue;\n\t\t\t}\n\t\t} else {\n\t\t\tslot = slot_base + irq_entries++;\n\t\t}\n\t\tdebug(\"writing INT%c\\n\", 'A' + pr.pin - 1);\n\t\tfill_irq_info(priv, slot, PCI_BUS(pr.bdf), PCI_DEV(pr.bdf),\n\t\t\t pr.pin, pr.pirq);\n\t}\n\n\trt->size = irq_entries * sizeof(struct irq_info) + 32;\n\n\t/* Fix up the table checksum */\n\trt->checksum = table_compute_checksum(rt, rt->size);\n\n\tgd->arch.pirq_routing_table = rt;\n\n\treturn 0;\n}\n\nstatic void irq_enable_sci(struct udevice *dev)\n{\n\tstruct irq_router *priv = dev_get_priv(dev);\n\n\tif (priv->actl_8bit) {\n\t\t/* Bit7 must be turned on to enable ACPI */\n\t\tdm_pci_write_config8(dev->parent, priv->actl_addr, 0x80);\n\t} else {\n\t\t/* Write 0 to enable SCI on IRQ9 */\n\t\tif (priv->config == PIRQ_VIA_PCI)\n\t\t\tdm_pci_write_config32(dev->parent, priv->actl_addr, 0);\n\t\telse\n\t\t\twritel(0, (uintptr_t)priv->ibase + priv->actl_addr);\n\t}\n}\n\nint irq_router_common_init(struct udevice *dev)\n{\n\tint ret;\n\n\tret = create_pirq_routing_table(dev);\n\tif (ret) 
{\n\t\tdebug(\"Failed to create pirq routing table\\n\");\n\t\treturn ret;\n\t}\n\t/* Route PIRQ */\n\tpirq_route_irqs(dev, gd->arch.pirq_routing_table->slots,\n\t\t\tget_irq_slot_count(gd->arch.pirq_routing_table));\n\n\tif (IS_ENABLED(CONFIG_GENERATE_ACPI_TABLE))\n\t\tirq_enable_sci(dev);\n\n\treturn 0;\n}\n\nint irq_router_probe(struct udevice *dev)\n{\n\treturn irq_router_common_init(dev);\n}\n\nulong write_pirq_routing_table(ulong addr)\n{\n\tif (!gd->arch.pirq_routing_table)\n\t\treturn addr;\n\n\treturn copy_pirq_routing_table(addr, gd->arch.pirq_routing_table);\n}\n\nstatic const struct udevice_id irq_router_ids[] = {\n\t{ .compatible = \"intel,irq-router\" },\n\t{ }\n};\n\nU_BOOT_DRIVER(irq_router_drv) = {\n\t.name\t\t= \"intel_irq\",\n\t.id\t\t= UCLASS_IRQ,\n\t.of_match\t= irq_router_ids,\n\t.probe\t\t= irq_router_probe,\n\t.priv_auto_alloc_size = sizeof(struct irq_router),\n};\n\nUCLASS_DRIVER(irq) = {\n\t.id\t\t= UCLASS_IRQ,\n\t.name\t\t= \"irq\",\n};\n"], ["{\n \"cells\": [\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {\n \"colab_type\": \"text\",\n \"id\": \"view-in-github\"\n },\n \"source\": [\n \"<a href=\\\"https://colab.research.google.com/github/Tessellate-Imaging/monk_v1/blob/master/study_roadmaps/1_getting_started_roadmap/6_hyperparameter_tuning/1)%20Analyse%20Learning%20Rates.ipynb\\\" target=\\\"_parent\\\"><img src=\\\"https://colab.research.google.com/assets/colab-badge.svg\\\" alt=\\\"Open In Colab\\\"/></a>\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"# Goals\\n\",\n \"\\n\",\n \"\\n\",\n \"### Learn how to use hyper parameter analyser for learning rates\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"# Table of 
Contents\\n\",\n \"\\n\",\n \"\\n\",\n \"## [Install](#0)\\n\",\n \"\\n\",\n \"\\n\",\n \"## [Load experiment in default mode](#1)\\n\",\n \"\\n\",\n \"\\n\",\n \"## [Run Analyser](#2)\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"<a id='0'></a>\\n\",\n \"# Install Monk\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## Using pip (Recommended)\\n\",\n \"\\n\",\n \" - colab (gpu) \\n\",\n \" - All bakcends: `pip install -U monk-colab`\\n\",\n \" \\n\",\n \"\\n\",\n \" - kaggle (gpu) \\n\",\n \" - All backends: `pip install -U monk-kaggle`\\n\",\n \" \\n\",\n \"\\n\",\n \" - cuda 10.2\\t\\n\",\n \" - All backends: `pip install -U monk-cuda102`\\n\",\n \" - Gluon bakcned: `pip install -U monk-gluon-cuda102`\\n\",\n \"\\t - Pytorch backend: `pip install -U monk-pytorch-cuda102`\\n\",\n \" - Keras backend: `pip install -U monk-keras-cuda102`\\n\",\n \" \\n\",\n \"\\n\",\n \" - cuda 10.1\\t\\n\",\n \" - All backend: `pip install -U monk-cuda101`\\n\",\n \"\\t - Gluon bakcned: `pip install -U monk-gluon-cuda101`\\n\",\n \"\\t - Pytorch backend: `pip install -U monk-pytorch-cuda101`\\n\",\n \"\\t - Keras backend: `pip install -U monk-keras-cuda101`\\n\",\n \" \\n\",\n \"\\n\",\n \" - cuda 10.0\\t\\n\",\n \" - All backend: `pip install -U monk-cuda100`\\n\",\n \"\\t - Gluon bakcned: `pip install -U monk-gluon-cuda100`\\n\",\n \"\\t - Pytorch backend: `pip install -U monk-pytorch-cuda100`\\n\",\n \"\\t - Keras backend: `pip install -U monk-keras-cuda100`\\n\",\n \" \\n\",\n \"\\n\",\n \" - cuda 9.2\\t\\n\",\n \" - All backend: `pip install -U monk-cuda92`\\n\",\n \"\\t - Gluon bakcned: `pip install -U monk-gluon-cuda92`\\n\",\n \"\\t - Pytorch backend: `pip install -U monk-pytorch-cuda92`\\n\",\n \"\\t - Keras backend: `pip install -U monk-keras-cuda92`\\n\",\n \" \\n\",\n 
\"\\n\",\n \" - cuda 9.0\\t\\n\",\n \" - All backend: `pip install -U monk-cuda90`\\n\",\n \"\\t - Gluon bakcned: `pip install -U monk-gluon-cuda90`\\n\",\n \"\\t - Pytorch backend: `pip install -U monk-pytorch-cuda90`\\n\",\n \"\\t - Keras backend: `pip install -U monk-keras-cuda90`\\n\",\n \" \\n\",\n \"\\n\",\n \" - cpu \\t\\t\\n\",\n \" - All backend: `pip install -U monk-cpu`\\n\",\n \"\\t - Gluon bakcned: `pip install -U monk-gluon-cpu`\\n\",\n \"\\t - Pytorch backend: `pip install -U monk-pytorch-cpu`\\n\",\n \"\\t - Keras backend: `pip install -U monk-keras-cpu`\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## Install Monk Manually (Not recommended)\\n\",\n \" \\n\",\n \"### Step 1: Clone the library\\n\",\n \" - git clone https://github.com/Tessellate-Imaging/monk_v1.git\\n\",\n \" \\n\",\n \" \\n\",\n \" \\n\",\n \" \\n\",\n \"### Step 2: Install requirements \\n\",\n \" - Linux\\n\",\n \" - Cuda 9.0\\n\",\n \" - `cd monk_v1/installation/Linux && pip install -r requirements_cu90.txt`\\n\",\n \" - Cuda 9.2\\n\",\n \" - `cd monk_v1/installation/Linux && pip install -r requirements_cu92.txt`\\n\",\n \" - Cuda 10.0\\n\",\n \" - `cd monk_v1/installation/Linux && pip install -r requirements_cu100.txt`\\n\",\n \" - Cuda 10.1\\n\",\n \" - `cd monk_v1/installation/Linux && pip install -r requirements_cu101.txt`\\n\",\n \" - Cuda 10.2\\n\",\n \" - `cd monk_v1/installation/Linux && pip install -r requirements_cu102.txt`\\n\",\n \" - CPU (Non gpu system)\\n\",\n \" - `cd monk_v1/installation/Linux && pip install -r requirements_cpu.txt`\\n\",\n \" \\n\",\n \" \\n\",\n \" - Windows\\n\",\n \" - Cuda 9.0 (Experimental support)\\n\",\n \" - `cd monk_v1/installation/Windows && pip install -r requirements_cu90.txt`\\n\",\n \" - Cuda 9.2 (Experimental support)\\n\",\n \" - `cd monk_v1/installation/Windows && pip 
install -r requirements_cu92.txt`\\n\",\n \" - Cuda 10.0 (Experimental support)\\n\",\n \" - `cd monk_v1/installation/Windows && pip install -r requirements_cu100.txt`\\n\",\n \" - Cuda 10.1 (Experimental support)\\n\",\n \" - `cd monk_v1/installation/Windows && pip install -r requirements_cu101.txt`\\n\",\n \" - Cuda 10.2 (Experimental support)\\n\",\n \" - `cd monk_v1/installation/Windows && pip install -r requirements_cu102.txt`\\n\",\n \" - CPU (Non gpu system)\\n\",\n \" - `cd monk_v1/installation/Windows && pip install -r requirements_cpu.txt`\\n\",\n \" \\n\",\n \" \\n\",\n \" - Mac\\n\",\n \" - CPU (Non gpu system)\\n\",\n \" - `cd monk_v1/installation/Mac && pip install -r requirements_cpu.txt`\\n\",\n \" \\n\",\n \" \\n\",\n \" - Misc\\n\",\n \" - Colab (GPU)\\n\",\n \" - `cd monk_v1/installation/Misc && pip install -r requirements_colab.txt`\\n\",\n \" - Kaggle (GPU)\\n\",\n \" - `cd monk_v1/installation/Misc && pip install -r requirements_kaggle.txt`\\n\",\n \" \\n\",\n \" \\n\",\n \" \\n\",\n \"### Step 3: Add to system path (Required for every terminal or kernel run)\\n\",\n \" - `import sys`\\n\",\n \" - `sys.path.append(\\\"monk_v1/\\\");`\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## Dataset - Caltech-256\\n\",\n \" - https://www.kaggle.com/jessicali9530/caltech256\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"! 
wget --load-cookies /tmp/cookies.txt \\\"https://docs.google.com/uc?export=download&confirm=$(wget --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=1Lltrl2U4L8WJkyBjMBFHSaoK8dLhoItl' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\\\\1\\\\n/p')&id=1Lltrl2U4L8WJkyBjMBFHSaoK8dLhoItl\\\" -O caltech256.zip && rm -rf /tmp/cookies.txt\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 3,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"! unzip -qq caltech256.zip\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"# Imports\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 1,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"#Using gluon backend \\n\",\n \"\\n\",\n \"# When installed using pip\\n\",\n \"from monk.gluon_prototype import prototype\\n\",\n \"\\n\",\n \"\\n\",\n \"# When installed manually (Uncomment the following)\\n\",\n \"#import os\\n\",\n \"#import sys\\n\",\n \"#sys.path.append(\\\"monk_v1/\\\");\\n\",\n \"#sys.path.append(\\\"monk_v1/monk/\\\");\\n\",\n \"#from monk.gluon_prototype import prototype\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"<a id='1'></a>\\n\",\n \"# Load experiment in default mode\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 3,\n \"metadata\": {},\n \"outputs\": [\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Mxnet Version: 1.5.0\\n\",\n \"\\n\",\n \"Experiment Details\\n\",\n \" Project: Project\\n\",\n \" Experiment: analyser_lr\\n\",\n \" Dir: 
/home/abhi/Desktop/Work/tess_tool/gui/v0.3/finetune_models/Organization/development/v5.3_roadmaps/1_getting_started_roadmap/6_hyperparameter_tuning/workspace/Project/analyser_lr/\\n\",\n \"\\n\"\n ]\n }\n ],\n \"source\": [\n \"gtf = prototype(verbose=1);\\n\",\n \"gtf.Prototype(\\\"Project\\\", \\\"analyser_lr\\\");\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 4,\n \"metadata\": {},\n \"outputs\": [\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Dataset Details\\n\",\n \" Train path: caltech256/train\\n\",\n \" Val path: None\\n\",\n \" CSV train path: None\\n\",\n \" CSV val path: None\\n\",\n \"\\n\",\n \"Dataset Params\\n\",\n \" Input Size: 224\\n\",\n \" Batch Size: 4\\n\",\n \" Data Shuffle: True\\n\",\n \" Processors: 4\\n\",\n \" Train-val split: 0.7\\n\",\n \"\\n\"\n ]\n },\n {\n \"name\": \"stderr\",\n \"output_type\": \"stream\",\n \"text\": [\n \"/home/abhi/.virtualenvs/finetune_py36/lib/python3.6/site-packages/mxnet/gluon/data/vision/datasets.py:312: UserWarning: Ignoring caltech256/train/198.spider/RENAME2 of type . 
Only support .jpg, .jpeg, .png\\n\",\n \" filename, ext, ', '.join(self._exts)))\\n\"\n ]\n },\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Pre-Composed Train Transforms\\n\",\n \"[{'RandomHorizontalFlip': {'p': 0.8}}, {'Normalize': {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]}}]\\n\",\n \"\\n\",\n \"Pre-Composed Val Transforms\\n\",\n \"[{'RandomHorizontalFlip': {'p': 0.8}}, {'Normalize': {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]}}]\\n\",\n \"\\n\",\n \"Dataset Numbers\\n\",\n \" Num train images: 21424\\n\",\n \" Num val images: 9183\\n\",\n \" Num classes: 257\\n\",\n \"\\n\",\n \"Model Params\\n\",\n \" Model name: resnet18_v1\\n\",\n \" Use Gpu: True\\n\",\n \" Use pretrained: True\\n\",\n \" Freeze base network: False\\n\",\n \"\\n\",\n \"Model Details\\n\",\n \" Loading pretrained model\\n\",\n \" Model Loaded on device\\n\",\n \" Model name: resnet18_v1\\n\",\n \" Num of potentially trainable layers: 41\\n\",\n \" Num of actual trainable layers: 41\\n\",\n \"\\n\",\n \"Optimizer\\n\",\n \" Name: sgd\\n\",\n \" Learning rate: 0.01\\n\",\n \" Params: {'lr': 0.01, 'momentum': 0, 'weight_decay': 0, 'momentum_dampening_rate': 0, 'clipnorm': 0.0, 'clipvalue': 0.0}\\n\",\n \"\\n\",\n \"\\n\",\n \"\\n\",\n \"Learning rate scheduler\\n\",\n \" Name: steplr\\n\",\n \" Params: {'step_size': 1, 'gamma': 0.98, 'last_epoch': -1}\\n\",\n \"\\n\",\n \"Loss\\n\",\n \" Name: softmaxcrossentropy\\n\",\n \" Params: {'weight': None, 'batch_axis': 0, 'axis_to_sum_over': -1, 'label_as_categories': True, 'label_smoothing': False}\\n\",\n \"\\n\",\n \"Training params\\n\",\n \" Num Epochs: 5\\n\",\n \"\\n\",\n \"Display params\\n\",\n \" Display progress: True\\n\",\n \" Display progress realtime: True\\n\",\n \" Save Training logs: True\\n\",\n \" Save Intermediate models: True\\n\",\n \" Intermediate model prefix: intermediate_model_\\n\",\n \"\\n\"\n ]\n },\n {\n \"name\": \"stderr\",\n \"output_type\": 
\"stream\",\n \"text\": [\n \"monk_v1/monk/system/imports.py:160: UserWarning: ArgumentWarning: clipnorm and clipvalue are active only for keras in current version of Monk\\n\",\n \" warnings.warn(msg)\\n\",\n \"monk_v1/monk/system/imports.py:160: UserWarning: ArgumentWarning: momentum_dampening_rate is active only for pytorch in current version of Monk\\n\",\n \" warnings.warn(msg)\\n\"\n ]\n }\n ],\n \"source\": [\n \"gtf.Default(dataset_path=\\\"caltech256/train\\\", \\n\",\n \" model_name=\\\"resnet18_v1\\\", \\n\",\n \" freeze_base_network=False,\\n\",\n \" num_epochs=5);\\n\",\n \"\\n\",\n \"#Read the summary generated once you run this cell. \"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"<a id='2'></a>\\n\",\n \"# Analyse Learning Rates\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 5,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"# Analysis Project Name\\n\",\n \"analysis_name = \\\"analyse_learning_rates\\\"\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 6,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"# Learning rates to explore\\n\",\n \"lrs = [0.1, 0.05, 0.01, 0.005, 0.0001];\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 7,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"# Num epochs for each sub-experiment to run\\n\",\n \"epochs=10\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 8,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"# Percentage of original dataset to take in for experimentation\\n\",\n \"percent_data=10\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 9,\n \"metadata\": {},\n \"outputs\": [\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"\\n\",\n \"Running Learning rate analysis\\n\",\n \"Analysis Name : 
analyse_learning_rates\\n\",\n \"\\n\",\n \"Running experiment : 1/5\\n\",\n \"Experiment name : Learning_Rate_0.1\\n\"\n ]\n },\n {\n \"name\": \"stderr\",\n \"output_type\": \"stream\",\n \"text\": [\n \"/home/abhi/.virtualenvs/finetune_py36/lib/python3.6/site-packages/mxnet/gluon/data/vision/datasets.py:312: UserWarning: Ignoring caltech256/train/198.spider/RENAME2 of type . Only support .jpg, .jpeg, .png\\n\",\n \" filename, ext, ', '.join(self._exts)))\\n\"\n ]\n },\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Estimated time : 3 min\\n\",\n \"Experiment Complete\\n\",\n \"\\n\",\n \"\\n\",\n \"Running experiment : 2/5\\n\",\n \"Experiment name : Learning_Rate_0.05\\n\"\n ]\n },\n {\n \"name\": \"stderr\",\n \"output_type\": \"stream\",\n \"text\": [\n \"/home/abhi/.virtualenvs/finetune_py36/lib/python3.6/site-packages/mxnet/gluon/data/vision/datasets.py:312: UserWarning: Ignoring caltech256/train/198.spider/RENAME2 of type . Only support .jpg, .jpeg, .png\\n\",\n \" filename, ext, ', '.join(self._exts)))\\n\"\n ]\n },\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Estimated time : 3 min\\n\",\n \"Experiment Complete\\n\",\n \"\\n\",\n \"\\n\",\n \"Running experiment : 3/5\\n\",\n \"Experiment name : Learning_Rate_0.01\\n\"\n ]\n },\n {\n \"name\": \"stderr\",\n \"output_type\": \"stream\",\n \"text\": [\n \"/home/abhi/.virtualenvs/finetune_py36/lib/python3.6/site-packages/mxnet/gluon/data/vision/datasets.py:312: UserWarning: Ignoring caltech256/train/198.spider/RENAME2 of type . 
Only support .jpg, .jpeg, .png\\n\",\n \" filename, ext, ', '.join(self._exts)))\\n\"\n ]\n },\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Estimated time : 3 min\\n\",\n \"Experiment Complete\\n\",\n \"\\n\",\n \"\\n\",\n \"Running experiment : 4/5\\n\",\n \"Experiment name : Learning_Rate_0.005\\n\"\n ]\n },\n {\n \"name\": \"stderr\",\n \"output_type\": \"stream\",\n \"text\": [\n \"/home/abhi/.virtualenvs/finetune_py36/lib/python3.6/site-packages/mxnet/gluon/data/vision/datasets.py:312: UserWarning: Ignoring caltech256/train/198.spider/RENAME2 of type . Only support .jpg, .jpeg, .png\\n\",\n \" filename, ext, ', '.join(self._exts)))\\n\"\n ]\n },\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Estimated time : 3 min\\n\",\n \"Experiment Complete\\n\",\n \"\\n\",\n \"\\n\",\n \"Running experiment : 5/5\\n\",\n \"Experiment name : Learning_Rate_0.0001\\n\"\n ]\n },\n {\n \"name\": \"stderr\",\n \"output_type\": \"stream\",\n \"text\": [\n \"/home/abhi/.virtualenvs/finetune_py36/lib/python3.6/site-packages/mxnet/gluon/data/vision/datasets.py:312: UserWarning: Ignoring caltech256/train/198.spider/RENAME2 of type . 
Only support .jpg, .jpeg, .png\\n\",\n \" filename, ext, ', '.join(self._exts)))\\n\"\n ]\n },\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Estimated time : 3 min\\n\",\n \"Experiment Complete\\n\",\n \"\\n\",\n \"\\n\",\n \"Comparing Experiments\\n\",\n \"Comparison ID: Comparison_analyse_learning_rates\\n\",\n \"Generated statistics post all epochs\\n\",\n \"| Experiment Name | Train Acc | Val Acc | Train Loss | Val Loss |\\n\",\n \"|----------------------+-------------+-----------+--------------+------------|\\n\",\n \"| Learning_Rate_0.1 | 0.0746437 | 0.0639731 | 5.10316 | 5.27679 |\\n\",\n \"| Learning_Rate_0.05 | 0.358965 | 0.23569 | 2.78242 | 3.77081 |\\n\",\n \"| Learning_Rate_0.01 | 0.978995 | 0.47138 | 0.2577 | 2.3881 |\\n\",\n \"| Learning_Rate_0.005 | 0.945611 | 0.491582 | 0.599715 | 2.53947 |\\n\",\n \"| Learning_Rate_0.0001 | 0.0982746 | 0.0808081 | 5.119 | 5.32924 |\\n\",\n \"\\n\"\n ]\n },\n {\n \"data\": {\n \"text/plain\": [\n \"<Figure size 432x288 with 0 Axes>\"\n ]\n },\n \"metadata\": {},\n \"output_type\": \"display_data\"\n },\n {\n \"data\": {\n \"text/plain\": [\n \"<Figure size 1440x720 with 0 Axes>\"\n ]\n },\n \"metadata\": {},\n \"output_type\": \"display_data\"\n },\n {\n \"data\": {\n \"text/plain\": [\n \"<Figure size 1440x720 with 0 Axes>\"\n ]\n },\n \"metadata\": {},\n \"output_type\": \"display_data\"\n },\n {\n \"data\": {\n \"text/plain\": [\n \"<Figure size 1440x720 with 0 Axes>\"\n ]\n },\n \"metadata\": {},\n \"output_type\": \"display_data\"\n },\n {\n \"data\": {\n \"text/plain\": [\n \"<Figure size 1440x720 with 0 Axes>\"\n ]\n },\n \"metadata\": {},\n \"output_type\": \"display_data\"\n }\n ],\n \"source\": [\n \"# \\\"keep_all\\\" - Keep all the sub experiments created\\n\",\n \"# \\\"keep_non\\\" - Delete all sub experiments created\\n\",\n \"analysis = gtf.Analyse_Learning_Rates(analysis_name, lrs, percent_data, \\n\",\n \" num_epochs=epochs, state=\\\"keep_none\\\"); \"\n ]\n },\n {\n 
\"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## Analysis\\n\",\n \"\\n\",\n \" - LR as 0.1 doesnt work\\n\",\n \" - Same is the case with 0.0001\\n\",\n \" \\n\",\n \" - Of the other's lr as 0.01 produces least validation loss\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## Update learning rate\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 10,\n \"metadata\": {},\n \"outputs\": [\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Update: Learning Rate - 0.01\\n\",\n \"\\n\"\n ]\n },\n {\n \"name\": \"stderr\",\n \"output_type\": \"stream\",\n \"text\": [\n \"/home/abhi/.virtualenvs/finetune_py36/lib/python3.6/site-packages/mxnet/gluon/data/vision/datasets.py:312: UserWarning: Ignoring caltech256/train/198.spider/RENAME2 of type . Only support .jpg, .jpeg, .png\\n\",\n \" filename, ext, ', '.join(self._exts)))\\n\"\n ]\n },\n {\n \"name\": \"stdout\",\n \"output_type\": \"stream\",\n \"text\": [\n \"Pre-Composed Train Transforms\\n\",\n \"[{'RandomHorizontalFlip': {'p': 0.8}}, {'Normalize': {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]}}]\\n\",\n \"\\n\",\n \"Pre-Composed Val Transforms\\n\",\n \"[{'RandomHorizontalFlip': {'p': 0.8}}, {'Normalize': {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]}}]\\n\",\n \"\\n\",\n \"Dataset Numbers\\n\",\n \" Num train images: 21424\\n\",\n \" Num val images: 9183\\n\",\n \" Num classes: 257\\n\",\n \"\\n\",\n \"Model Details\\n\",\n \" Loading pretrained model\\n\",\n \" Model Loaded on device\\n\",\n \" Model name: resnet18_v1\\n\",\n \" Num of potentially trainable layers: 41\\n\",\n \" Num of actual trainable layers: 41\\n\",\n \"\\n\"\n ]\n }\n ],\n \"source\": [\n \"gtf.update_learning_rate(0.01);\\n\",\n \"\\n\",\n \"# Very important to reload post updates\\n\",\n \"gtf.Reload();\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": 
[],\n \"source\": [\n \"#Start Training\\n\",\n \"gtf.Train();\\n\",\n \"\\n\",\n \"#Read the training summary generated once you run the cell and training is completed\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"# Goals Completed\\n\",\n \"\\n\",\n \"### Learn how to use hyper parameter analyser for learning rates\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": []\n }\n ],\n \"metadata\": {\n \"kernelspec\": {\n \"display_name\": \"Python 3\",\n \"language\": \"python\",\n \"name\": \"python3\"\n },\n \"language_info\": {\n \"codemirror_mode\": {\n \"name\": \"ipython\",\n \"version\": 3\n },\n \"file_extension\": \".py\",\n \"mimetype\": \"text/x-python\",\n \"name\": \"python\",\n \"nbconvert_exporter\": \"python\",\n \"pygments_lexer\": \"ipython3\",\n \"version\": \"3.6.9\"\n }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"], ["// Copyright 2013 the V8 project authors. 
All rights reserved.\n// Use of this source code is governed by a BSD-style license that can be\n// found in the LICENSE file.\n\n#ifndef V8_CRANKSHAFT_HYDROGEN_OSR_H_\n#define V8_CRANKSHAFT_HYDROGEN_OSR_H_\n\n#include \"src/ast.h\"\n#include \"src/crankshaft/hydrogen.h\"\n#include \"src/zone.h\"\n\nnamespace v8 {\nnamespace internal {\n\n// Responsible for building graph parts related to OSR and otherwise\n// setting up the graph to do an OSR compile.\nclass HOsrBuilder : public ZoneObject {\n public:\n explicit HOsrBuilder(HOptimizedGraphBuilder* builder)\n : unoptimized_frame_slots_(0),\n builder_(builder),\n osr_entry_(NULL),\n osr_loop_entry_(NULL),\n osr_values_(NULL) { }\n\n // Creates the loop entry block for the given statement, setting up OSR\n // entries as necessary, and sets the current block to the new block.\n HBasicBlock* BuildOsrLoopEntry(IterationStatement* statement);\n\n // Process the hydrogen graph after it has been completed, performing\n // any OSR-specific cleanups or changes.\n void FinishGraph();\n\n // Process the OSR values and phis after initial graph optimization.\n void FinishOsrValues();\n\n // Return the number of slots in the unoptimized frame at the entry to OSR.\n int UnoptimizedFrameSlots() const {\n return unoptimized_frame_slots_;\n }\n\n bool HasOsrEntryAt(IterationStatement* statement);\n\n private:\n int unoptimized_frame_slots_;\n HOptimizedGraphBuilder* builder_;\n HBasicBlock* osr_entry_;\n HBasicBlock* osr_loop_entry_;\n ZoneList<HUnknownOSRValue*>* osr_values_;\n};\n\n} // namespace internal\n} // namespace v8\n\n#endif // V8_CRANKSHAFT_HYDROGEN_OSR_H_\n"], ["<?xml version=\"1.0\" encoding=\"UTF-8\" ?>\n<!DOCTYPE ldml SYSTEM \"../../common/dtd/ldml.dtd\">\n<!-- Copyright \u00a9 1991-2020 Unicode, Inc.\nFor terms of use, see http://www.unicode.org/copyright.html\nUnicode and the Unicode Logo are registered trademarks of Unicode, Inc. in the U.S. 
and other countries.\nCLDR data files are interpreted according to the LDML specification (http://unicode.org/reports/tr35/)\n-->\n<ldml>\n\t<identity>\n\t\t<version number=\"$Revision$\"/>\n\t\t<language type=\"yi\"/>\n\t\t<territory type=\"001\"/>\n\t</identity>\n</ldml>\n"], ["# -*- Mode: Java; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*-\n# ***** BEGIN LICENSE BLOCK *****\n# Version: MPL 1.1/GPL 2.0/LGPL 2.1\n#\n# The contents of this file are subject to the Mozilla Public License Version\n# 1.1 (the \"License\"); you may not use this file except in compliance with\n# the License. You may obtain a copy of the License at\n# http://www.mozilla.org/MPL/\n#\n# Software distributed under the License is distributed on an \"AS IS\" basis,\n# WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License\n# for the specific language governing rights and limitations under the\n# License.\n#\n# The Original Code is mozilla.org code.\n#\n# The Initial Developer of the Original Code is\n# Netscape Communications Corporation.\n# Portions created by the Initial Developer are Copyright (C) 1998\n# the Initial Developer. All Rights Reserved.\n#\n# Contributor(s):\n# Alec Flett <alecf@netscape.com> (original author of history.js)\n# Seth Spitzer <sspizer@mozilla.org> (port to Places)\n# Asaf Romano <mano@mozilla.com>\n#\n# Alternatively, the contents of this file may be used under the terms of\n# either the GNU General Public License Version 2 or later (the \"GPL\"), or\n# the GNU Lesser General Public License Version 2.1 or later (the \"LGPL\"),\n# in which case the provisions of the GPL or the LGPL are applicable instead\n# of those above. 
If you wish to allow use of your version of this file only\n# under the terms of either the GPL or the LGPL, and not to allow others to\n# use your version of this file under the terms of the MPL, indicate your\n# decision by deleting the provisions above and replace them with the notice\n# and other provisions required by the GPL or the LGPL. If you do not delete\n# the provisions above, a recipient may use your version of this file under\n# the terms of any one of the MPL, the GPL or the LGPL.\n#\n# ***** END LICENSE BLOCK *****\n\nvar gHistoryTree;\nvar gSearchBox;\nvar gHistoryGrouping = \"\";\nvar gSearching = false;\n\nfunction HistorySidebarInit()\n{\n gHistoryTree = document.getElementById(\"historyTree\");\n gSearchBox = document.getElementById(\"search-box\");\n\n gHistoryGrouping = document.getElementById(\"viewButton\").\n getAttribute(\"selectedsort\");\n\n if (gHistoryGrouping == \"site\")\n document.getElementById(\"bysite\").setAttribute(\"checked\", \"true\");\n else if (gHistoryGrouping == \"visited\") \n document.getElementById(\"byvisited\").setAttribute(\"checked\", \"true\");\n else if (gHistoryGrouping == \"lastvisited\")\n document.getElementById(\"bylastvisited\").setAttribute(\"checked\", \"true\");\n else if (gHistoryGrouping == \"dayandsite\")\n document.getElementById(\"bydayandsite\").setAttribute(\"checked\", \"true\");\n else\n document.getElementById(\"byday\").setAttribute(\"checked\", \"true\");\n \n searchHistory(\"\");\n}\n\nfunction GroupBy(groupingType)\n{\n gHistoryGrouping = groupingType;\n searchHistory(gSearchBox.value);\n}\n\nfunction searchHistory(aInput)\n{\n var query = PlacesUtils.history.getNewQuery();\n var options = PlacesUtils.history.getNewQueryOptions();\n\n const NHQO = Ci.nsINavHistoryQueryOptions;\n var sortingMode;\n var resultType;\n\n switch (gHistoryGrouping) {\n case \"visited\":\n resultType = NHQO.RESULTS_AS_URI;\n sortingMode = NHQO.SORT_BY_VISITCOUNT_DESCENDING;\n break; \n case \"lastvisited\":\n 
resultType = NHQO.RESULTS_AS_URI;\n sortingMode = NHQO.SORT_BY_DATE_DESCENDING;\n break; \n case \"dayandsite\":\n resultType = NHQO.RESULTS_AS_DATE_SITE_QUERY;\n break;\n case \"site\":\n resultType = NHQO.RESULTS_AS_SITE_QUERY;\n sortingMode = NHQO.SORT_BY_TITLE_ASCENDING;\n break;\n case \"day\":\n default:\n resultType = NHQO.RESULTS_AS_DATE_QUERY;\n break;\n }\n\n if (aInput) {\n query.searchTerms = aInput;\n if (gHistoryGrouping != \"visited\" && gHistoryGrouping != \"lastvisited\") {\n sortingMode = NHQO.SORT_BY_TITLE_ASCENDING;\n resultType = NHQO.RESULTS_AS_URI;\n }\n }\n\n options.sortingMode = sortingMode;\n options.resultType = resultType;\n\n // call load() on the tree manually\n // instead of setting the place attribute in history-panel.xul\n // otherwise, we will end up calling load() twice\n gHistoryTree.load([query], options);\n}\n\nwindow.addEventListener(\"SidebarFocused\",\n function()\n gSearchBox.focus(),\n false);\n"], ["package accounting\n\nimport (\n\t\"math/big\"\n\t\"testing\"\n\n\t\"github.com/cockroachdb/apd\"\n\t\"github.com/shopspring/decimal\"\n)\n\nfunc TestFormatNumber(t *testing.T) {\n\tAssertEqual(t, FormatNumber(123456789.213123, 3, \",\", \".\"), \"123,456,789.213\")\n\tAssertEqual(t, FormatNumber(123456789.213123, 3, \".\", \",\"), \"123.456.789,213\")\n\tAssertEqual(t, FormatNumber(-12345.123123, 5, \",\", \".\"), \"-12,345.12312\")\n\tAssertEqual(t, FormatNumber(-1234.123123, 5, \",\", \".\"), \"-1,234.12312\")\n\tAssertEqual(t, FormatNumber(-123.123123, 5, \",\", \".\"), \"-123.12312\")\n\tAssertEqual(t, FormatNumber(-12.123123, 5, \",\", \".\"), \"-12.12312\")\n\tAssertEqual(t, FormatNumber(-1.123123, 5, \",\", \".\"), \"-1.12312\")\n\tAssertEqual(t, FormatNumber(-1, 3, \",\", \".\"), \"-1.000\")\n\tAssertEqual(t, FormatNumber(-10, 3, \",\", \".\"), \"-10.000\")\n\tAssertEqual(t, FormatNumber(-100, 3, \",\", \".\"), \"-100.000\")\n\tAssertEqual(t, FormatNumber(-1000, 3, \",\", \".\"), \"-1,000.000\")\n\tAssertEqual(t, 
FormatNumber(-10000, 3, \",\", \".\"), \"-10,000.000\")\n\tAssertEqual(t, FormatNumber(-100000, 3, \",\", \".\"), \"-100,000.000\")\n\tAssertEqual(t, FormatNumber(-1000000, 3, \",\", \".\"), \"-1,000,000.000\")\n\tAssertEqual(t, FormatNumber(1, 3, \",\", \".\"), \"1.000\")\n\tAssertEqual(t, FormatNumber(10, 3, \",\", \".\"), \"10.000\")\n\tAssertEqual(t, FormatNumber(100, 3, \",\", \".\"), \"100.000\")\n\tAssertEqual(t, FormatNumber(1000, 3, \",\", \".\"), \"1,000.000\")\n\tAssertEqual(t, FormatNumber(10000, 3, \",\", \".\"), \"10,000.000\")\n\tAssertEqual(t, FormatNumber(100000, 3, \",\", \".\"), \"100,000.000\")\n\tAssertEqual(t, FormatNumber(1000000, 3, \",\", \".\"), \"1,000,000.000\")\n\tAssertEqual(t, FormatNumber(1000000, 10, \" \", \".\"), \"1 000 000.0000000000\")\n\tAssertEqual(t, FormatNumber(1000000, 10, \" \", \".\"), \"1 000 000.0000000000\")\n\tAssertEqual(t, FormatNumber(uint(1000000), 3, \",\", \".\"), \"1,000,000.000\")\n\n\tAssertEqual(t, FormatNumber(big.NewRat(77777777, 3), 3, \",\", \".\"), \"25,925,925.667\")\n\tAssertEqual(t, FormatNumber(big.NewRat(-77777777, 3), 3, \",\", \".\"), \"-25,925,925.667\")\n\tAssertEqual(t, FormatNumber(big.NewRat(-7777777, 3), 3, \",\", \".\"), \"-2,592,592.333\")\n\tAssertEqual(t, FormatNumber(big.NewRat(-777776, 3), 3, \",\", \".\"), \"-259,258.667\")\n\n\tAssertEqual(t, FormatNumber(apd.New(123456789213123, -6), 3, \",\", \".\"), \"123,456,789.213\")\n\tAssertEqual(t, FormatNumber(apd.New(-12345123123, -6), 5, \",\", \".\"), \"-12,345.12312\")\n\tAssertEqual(t, FormatNumber(apd.New(-1234123123, -6), 5, \",\", \".\"), \"-1,234.12312\")\n\tAssertEqual(t, FormatNumber(apd.New(-123123123, -6), 5, \",\", \".\"), \"-123.12312\")\n\tAssertEqual(t, FormatNumber(apd.New(-12123123, -6), 5, \",\", \".\"), \"-12.12312\")\n\tAssertEqual(t, FormatNumber(apd.New(-1123123, -6), 5, \",\", \".\"), \"-1.12312\")\n\n\td1 := decimal.New(123456789213123, -6)\n\td2 := decimal.New(-12345123123, -6)\n\td3 := decimal.New(-1234123123, 
-6)\n\td4 := decimal.New(-123123123, -6)\n\td5 := decimal.New(-12123123, -6)\n\td6 := decimal.New(-1123123, -6)\n\n\tAssertEqual(t, FormatNumber(d1, 3, \",\", \".\"), \"123,456,789.213\")\n\tAssertEqual(t, FormatNumber(d2, 5, \",\", \".\"), \"-12,345.12312\")\n\tAssertEqual(t, FormatNumber(d3, 5, \",\", \".\"), \"-1,234.12312\")\n\tAssertEqual(t, FormatNumber(d4, 5, \",\", \".\"), \"-123.12312\")\n\tAssertEqual(t, FormatNumber(d5, 5, \",\", \".\"), \"-12.12312\")\n\tAssertEqual(t, FormatNumber(d6, 5, \",\", \".\"), \"-1.12312\")\n\n\tAssertEqual(t, FormatNumber(&d1, 3, \",\", \".\"), \"123,456,789.213\")\n\tAssertEqual(t, FormatNumber(&d2, 5, \",\", \".\"), \"-12,345.12312\")\n\tAssertEqual(t, FormatNumber(&d3, 5, \",\", \".\"), \"-1,234.12312\")\n\tAssertEqual(t, FormatNumber(&d4, 5, \",\", \".\"), \"-123.12312\")\n\tAssertEqual(t, FormatNumber(&d5, 5, \",\", \".\"), \"-12.12312\")\n\tAssertEqual(t, FormatNumber(&d6, 5, \",\", \".\"), \"-1.12312\")\n\n\tfunc() {\n\t\tdefer func() {\n\t\t\trecover()\n\t\t}()\n\t\tFormatNumber(false, 3, \",\", \".\") // panic: Unsupported type - bool\n\t}()\n\tfunc() {\n\t\tdefer func() {\n\t\t\trecover()\n\t\t}()\n\t\tFormatNumber(big.NewInt(1), 3, \",\", \".\") // panic: Unsupported type - *big.Int\n\t}()\n\tfunc() {\n\t\ttype demo struct {\n\t\t\tValue int\n\t\t}\n\t\tdefer func() {\n\t\t\trecover()\n\t\t}()\n\t\tFormatNumber(demo{Value: 1}, 3, \",\", \".\") // panic: Unsupported type - *big.Int\n\t}()\n}\n\nfunc TestFormatNumberInt(t *testing.T) {\n\tAssertEqual(t, FormatNumberInt(-1, 3, \",\", \".\"), \"-1.000\")\n\tAssertEqual(t, FormatNumberInt(-10, 3, \",\", \".\"), \"-10.000\")\n\tAssertEqual(t, FormatNumberInt(-100, 3, \",\", \".\"), \"-100.000\")\n\tAssertEqual(t, FormatNumberInt(-1000, 3, \",\", \".\"), \"-1,000.000\")\n\tAssertEqual(t, FormatNumberInt(-10000, 3, \",\", \".\"), \"-10,000.000\")\n\tAssertEqual(t, FormatNumberInt(-100000, 3, \",\", \".\"), \"-100,000.000\")\n\tAssertEqual(t, FormatNumberInt(-1000000, 3, 
\",\", \".\"), \"-1,000,000.000\")\n\tAssertEqual(t, FormatNumberInt(1, 3, \",\", \".\"), \"1.000\")\n\tAssertEqual(t, FormatNumberInt(10, 3, \",\", \".\"), \"10.000\")\n\tAssertEqual(t, FormatNumberInt(100, 3, \",\", \".\"), \"100.000\")\n\tAssertEqual(t, FormatNumberInt(1000, 3, \",\", \".\"), \"1,000.000\")\n\tAssertEqual(t, FormatNumberInt(10000, 3, \",\", \".\"), \"10,000.000\")\n\tAssertEqual(t, FormatNumberInt(100000, 3, \",\", \".\"), \"100,000.000\")\n\tAssertEqual(t, FormatNumberInt(1000000, 3, \",\", \".\"), \"1,000,000.000\")\n\tAssertEqual(t, FormatNumberInt(1000000, 10, \" \", \".\"), \"1 000 000.0000000000\")\n\tAssertEqual(t, FormatNumberInt(1000000, 10, \" \", \".\"), \"1 000 000.0000000000\")\n}\n\nfunc TestFormatNumberFloat64(t *testing.T) {\n\tAssertEqual(t, FormatNumberFloat64(123456789.213123, 3, \",\", \".\"), \"123,456,789.213\")\n\tAssertEqual(t, FormatNumberFloat64(-12345.123123, 5, \",\", \".\"), \"-12,345.12312\")\n\tAssertEqual(t, FormatNumberFloat64(-1234.123123, 5, \",\", \".\"), \"-1,234.12312\")\n\tAssertEqual(t, FormatNumberFloat64(-123.123123, 5, \",\", \".\"), \"-123.12312\")\n\tAssertEqual(t, FormatNumberFloat64(-12.123123, 5, \",\", \".\"), \"-12.12312\")\n\tAssertEqual(t, FormatNumberFloat64(-1.123123, 5, \",\", \".\"), \"-1.12312\")\n}\n\nfunc TestFormatNumberBigRat(t *testing.T) {\n\tAssertEqual(t, FormatNumberBigRat(big.NewRat(77777777, 3), 3, \",\", \".\"), \"25,925,925.667\")\n\tAssertEqual(t, FormatNumberBigRat(big.NewRat(-77777777, 3), 3, \",\", \".\"), \"-25,925,925.667\")\n\tAssertEqual(t, FormatNumberBigRat(big.NewRat(-7777777, 3), 3, \",\", \".\"), \"-2,592,592.333\")\n\tAssertEqual(t, FormatNumberBigRat(big.NewRat(-777776, 3), 3, \",\", \".\"), \"-259,258.667\")\n}\n\nfunc TestFormatNumberBigDecimal(t *testing.T) {\n\tAssertEqual(t, FormatNumberBigDecimal(apd.New(123456789213123, -6), 3, \",\", \".\"), \"123,456,789.213\")\n\tAssertEqual(t, FormatNumberBigDecimal(apd.New(-12345123123, -6), 5, \",\", \".\"), 
\"-12,345.12312\")\n\tAssertEqual(t, FormatNumberBigDecimal(apd.New(-1234123123, -6), 5, \",\", \".\"), \"-1,234.12312\")\n\tAssertEqual(t, FormatNumberBigDecimal(apd.New(-123123123, -6), 5, \",\", \".\"), \"-123.12312\")\n\tAssertEqual(t, FormatNumberBigDecimal(apd.New(-12123123, -6), 5, \",\", \".\"), \"-12.12312\")\n\tAssertEqual(t, FormatNumberBigDecimal(apd.New(-1123123, -6), 5, \",\", \".\"), \"-1.12312\")\n}\n\nfunc TestFormatNumberDecimal(t *testing.T) {\n\tAssertEqual(t, FormatNumberDecimal(decimal.New(123456789213123, -6), 3, \",\", \".\"), \"123,456,789.213\")\n\tAssertEqual(t, FormatNumberDecimal(decimal.New(-12345123123, -6), 5, \",\", \".\"), \"-12,345.12312\")\n\tAssertEqual(t, FormatNumberDecimal(decimal.New(-1234123123, -6), 5, \",\", \".\"), \"-1,234.12312\")\n\tAssertEqual(t, FormatNumberDecimal(decimal.New(-123123123, -6), 5, \",\", \".\"), \"-123.12312\")\n\tAssertEqual(t, FormatNumberDecimal(decimal.New(-12123123, -6), 5, \",\", \".\"), \"-12.12312\")\n\tAssertEqual(t, FormatNumberDecimal(decimal.New(-1123123, -6), 5, \",\", \".\"), \"-1.12312\")\n}\n"], ["/*\n * Copyright (c) 2018 THL A29 Limited, a Tencent company. All Rights Reserved.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing,\n * software distributed under the License is distributed on an\n * \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n * KIND, either express or implied. 
See the License for the\n * specific language governing permissions and limitations\n * under the License.\n */\n\nnamespace TencentCloud.Billing.V20180709.Models\n{\n using Newtonsoft.Json;\n using System.Collections.Generic;\n using TencentCloud.Common;\n\n public class BillTagInfo : AbstractModel\n {\n \n /// <summary>\n /// \u5206\u8d26\u6807\u7b7e\u952e\n /// </summary>\n [JsonProperty(\"TagKey\")]\n public string TagKey{ get; set; }\n\n /// <summary>\n /// \u6807\u7b7e\u503c\n /// </summary>\n [JsonProperty(\"TagValue\")]\n public string TagValue{ get; set; }\n\n\n /// <summary>\n /// For internal usage only. DO NOT USE IT.\n /// </summary>\n internal override void ToMap(Dictionary<string, string> map, string prefix)\n {\n this.SetParamSimple(map, prefix + \"TagKey\", this.TagKey);\n this.SetParamSimple(map, prefix + \"TagValue\", this.TagValue);\n }\n }\n}\n\n"]]
{"results": {"pile_github": {"word_perplexity": 1.0006622624493737, "byte_perplexity": 1.0000635880965316, "bits_per_byte": 6.358607489435376e-05}}, "versions": {"pile_github": 0}}
{"results": {"pile_github": {"bits_per_byte": 9.540627613754646e-05, "byte_perplexity": 1.0000954108274611, "word_perplexity": 1.0009643183931227}}, "versions": {"pile_github": 0}}
02a559f74a9105145e7d4d9c5ddea372b5b4938f5368dc8ffafc39cbe3b4c7ef
{"results": {"pile_gutenberg": {"word_perplexity": 1.0000060538585933, "byte_perplexity": 1.000001039910508, "bits_per_byte": 1.0399099672152258e-06}}, "versions": {"pile_gutenberg": 0}}
{"results": {"pile_gutenberg": {"bits_per_byte": 1.2443606332351536e-06, "byte_perplexity": 1.0000012443614075, "word_perplexity": 1.0000072174665404}}, "versions": {"pile_gutenberg": 0}}
ec1082ee5a5326e0d57aa4e73b634937140c1de9af95f154e8ab57b05d9b422b
[["\nAsk HN: What have you completed in 2015 - lakeeffect\nWhat have you completed in 2015 or are looking to release in 2016. I use completed loosely. Would love to hear some progress to motivate me into the new year.\n======\nionicabizau\nA lot of open-source stuff. The best ones were:\n\n\\- [https://github.com/IonicaBizau/git-\nstats](https://github.com/IonicaBizau/git-stats) (January, 2015) \\-\n[https://github.com/IonicaBizau/node-\ncobol](https://github.com/IonicaBizau/node-cobol) (October, 2015) \\-\n[https://github.com/IonicaBizau/gridly](https://github.com/IonicaBizau/gridly)\n(December, 2015)\n\nWorking part-time (~100 hours / month) for the company I work for, I still\nhave enough \"free\" time when I do: open-source stuff, JavaScript training,\nplaying piano, playing with high-voltage (!), playing with chemistry\nexplosions and experiments (in fact making fire almost anywhere, anytime).\n\nOne of this year goals was to drop out of college. I did it two months ago.\nSince officially I'm still a student (my documents are at the university), I'm\nstill getting loans because of my good results from my previous year. But I\ndon't regret this decission at all (at least, until now!).\n\nAll the thanks go to God! I enjoy being a Jesus follower. I believe this world\nis not our home. God prepares for us a better world. Until then, I'm happy to\nlove Him. Actually, being a believer and web developer is a nice combination.\n\nWish you a happy 2016! :-)\n\n"], ["\nGoogle relents slightly on blocking ad-blockers \u2013 for paid-up enterprise Chrome - nachtigall\nhttps://www.theregister.co.uk/2019/05/29/google_webrequest_api/\n======\nlern_too_spel\nNow Chrome will have the crippled adblocking capabilities of Safari. This is\nthe browser equivalent of removing the headphone jack \u2014 removing a feature\nmany people use to get some security benefit for a few. 
The problem for Google\nis that unlike Apple's customers, you can't pee on a Chrome or Android user's\nback and tell them it's raining. No Android user was happy the headphone jack\nwent away just because there are wireless options.\n\nSince Google controls the extension distribution system, it could just as\neasily plastered extensions that use this API with scary warnings, so only\nusers who knew what they were getting into would install them. It's not like\nusers install so many extensions that use this API that they would start to\nignore the warnings.\n\n------\nohpls\nAt least whatever Google decides to do I'll still have my Pi-hole blocking ads\nand trackers\n\n~~~\ndanShumway\nDomain-based filtering isn't enough to block all ads and trackers -- unless\nPi-hole is doing more than just acting as a DNS server nowadays; I haven't\nchecked in a while.\n\nIn particular, using Pi-hole forces you to decide globally what domains you'll\nblock -- so you can't (for example) block Twitter/Facebook on 3rd-party\ndomains but allow it when you directly visit them. DNS blocking also can't\nhandle individual URLs within a domain -- so you won't be able to block ads on\nsites like Youtube or Facebook.\n\nAside from lacking granularity for when domains are allowed or disallowed, Pi-\nhole also won't protect you from the majority of first-party tracking. That's\nless of a concern though because (at least for now) the V3 manifest isn't\nstopping extensions from blocking tracking cookies or disabling features like\nCanvas, so you can still rely on them for that.\n\nTypically though, I advise people to prefer extensions like UMatrix and Ublock\nOrigin, and to fall back on Pi-hole as a backup strategy when nothing else is\navailable. It's useful (particularly to help with native apps and IOT\ndevices), but I don't think it's a substitute for a good browser-based ad\nblocker.\n\n------\niamthatiam\nDoes this impact chromium? 
Will Brave Browser continue to function?\n\n~~~\nrasz\nYes it does, and will put a burden on forks to maintain their own patch tree.\nVivaldi already semi declared unwillingness to do it.\n\n------\nanfilt\nThat does not really seem like a \"relent\"...\n\n------\nm-p-3\nIf that's not a good reason enough to switch back to Firefox, then I don't\nknow what will.\n\n------\nwinkeltripel\nStill looks like scummy behaviour to me.\n\n"], ["\n\nPosterous (YC S08) launches group blogs that are also email lists - rantfoil\nhttp://mashable.com/2009/05/05/posterous-email-lists/\n\n======\njonas_b\nI'm not sure, but this, or some evolution of this feature, could turn out to\nbe a real revolution when it comes to group collaboration and sharing.\n\nOr it might just be another feature, or that I'm trying to see something that\nisn't there.\n\nI've been searching for a merge between this, Chatterous and possibly etherpad\nfor small-group collab. Alas, it eludes me still.\n\n~~~\nrantfoil\nWe're definitely excited about the potential for this feature to grow into its\nown product!\n\nYou can expect improvements to this coming fast and furious.\n\n------\nzaidf\nThis is ridiculously awesome and the exact feature we needed few months ago.\n\nWe started a blog for our larger extended family(50+ people) so there is a\nsimple place for all our family communication. Yet a lot of the people in our\nfamily know only rudimentary use of the computer--and that means email. So the\nblog idea didn't quite work out and we're back to emailing--which is\ndisorganized but works.\n\nWith this we can get the best of both worlds! Communicate via email, archive\non a blog!\n\n~~~\nrantfoil\nWould love for you to use Posterous for this, and would love to chat with you\noffline about how it works out for your team and how we can get better. My\ncontact info is in my profile. =)\n\n------\nhboon\nI'm more interested in how you got Mashable to cover this (so quickly).\nAnything you could share there? 
(yes, PR-related question).\n\nAwesome feature, I'd imagine you will be using this mechanism to allow\ndevelopers against your API to track API changes? :)\n\n~~~\nrantfoil\nFor a brand that's gaining momentum, getting coverage is a bit easier. You\nmight already follow the writers on twitter or vice versa, or if you're in\nSF/the valley you might even grab coffee with them.\n\nFor a company starting out, you need to either have a product so thoroughly\nbadass that it trounces something else that is hot, or you need a connection /\nintro to someone who knows them well.\n\nIt doesn't have to be Michael Arrington himself -- in fact, he's obviously\nsuch a busy and important guy that it's almost impossible to get his\nattention. However, at each of these blogs there are staff writers. They're\nthe ideal person to reach out to -- they're looking for great stories, and\nhey, you've got one.\n\nFinal tip: use bullet points. Include screenshots if you can. If you make it\nso compelling and so obvious that it's a story, and you practically write it\nfor them, you make their life easier and that makes it a no-brainer for them\nto write about you.\n\nAs for API -- that sounds very cool. We're very psyched about becoming a more\nopen platform to let people build apps on top of Posterous.\n\n------\njcbozonier\nI'm in love. May I have permission to marry your daughter?\n\n[I just realized we don't do humor here. Mod me down :(]\n\n------\ntybris\nWoah! I can see how this is different from an e-mail list that gets published\non the web. ;-)\n\nInteresting to see how a new layer of paint can bring old concepts to the\nmasses.\n\n~~~\nmadh\nAbsolutely. Great ideas never die.\n\n------\njoepestro\nPosterous is great. 
I'm interested to know - was this something that you've\nhad planned for a while, or did it come as a natural evolution of the product\n/ feedback from users?\n\n~~~\nrantfoil\nIt's something we talked about even back during last summer when we first\nlaunched Posterous. It's great to finally be able to put it out there.\n\nOnce we launched group blogs though -- we did start hearing a lot of requests\nfor this feature too. There's absolutely a validation piece to it. When users\nask for it, you know there's something there.\n\n------\nqeorge\nI've got a Google Group with old college friends which we've always wanted to\nbe richer, but we don't want to lose the casual members by trying to migrate\nand change their habits. I think hooking it up to a group posterous account\nmight be the right way to please everyone.\n\nVery cool.\n\n------\nthorax\nI wonder-- are they stripping out the reply/quotations somehow, or are those\ngoing to be showing up on the blog, too? Long, long streams of\n>>>>>>>>>>>>>||||>>>> make for crummy blogging.\n\n~~~\njoepestro\nGood point. I did an experiment a few weeks ago to see how this was handled on\nposterous by sending an email to a friend and cc'ing post@posterous.com.\n\nIt was handled well on their end with a solid line on the left side of the\nreply. So it looks like something they are already prepared for.\n\n------\nanigbrowl\nI like it, but it fails to load 3 out of 5 times when I click on a Posterous\nlink. First day traffic blues?\n\n"], ["\nThe Number of New Bitcoin Accounts Is Skyrocketing - petethomas\nhttps://www.bloomberg.com/news/articles/2017-11-27/new-crypto-accounts-proliferate-as-bitcoin-flirts-with-10-000\n======\nblackflame7000\nAt what point does it become speculation? 
Perhaps someone with more knowledge\ncan shed some light on what exactly people are buying when they buy a bitcoin.\n\n"], ["\nSymbolic expressions can be automatically differentiated too - objections\nhttp://h2.jaguarpaw.co.uk/posts/symbolic-expressions-can-be-automatically-differentiated/\n======\njohnbender\nOne can also calculate the derivative of a context free grammar with respect\nto a given terminal.\n\n[http://matt.might.net/articles/parsing-with-\nderivatives/](http://matt.might.net/articles/parsing-with-derivatives/)\n\n~~~\nApanatshka\nThat's also a really cool article. Thanks for sharing it!\n\n------\ndelluminatus\nGreat post, as an AD tutorial and as a (an?) Haskell exercise. Having known\nnothing about AD before, I feel like I have a good understanding of what it is\n-- as he says, it's so simple -- but I don't understand _why_ the algorithm is\nso much faster. Just looking at the differentiator function and the AD\nfunction, it actually appears that the AD should take longer because it does\nmore computation per step (both the function and the derivative). But it seems\nlike every article or paper is talking about how to implement AD, not why the\nalgorithm is so efficient. Does anyone happen to know of a good article or\npaper about that? Ideally, one just as nice and comprehensible as this!\n\n~~~\nvidarh\nThe first alternative builds a large tree structure, and then evaluates the\nwhole tree structure afterwards.\n\nSo first it blows up the size of the expression to process and _then_ it\ncalculates the result. A lot of those calculations will be redundant\n\nThe second one not only avoids evaluating the tree separately, but \"prunes\" a\nlot of the evaluation automatically by effectively short-circuiting the whole\nprocess. 
Consider (with the caveat that my Haskell understanding is\nrudimentary at best and it was about 20 years since I last did anything\ninvolving symbolic differentiation) :\n\n \n \n Product e e' -> Product e (diff e') `Sum` Product (diff e) e'\n \n\nFollowed by a separate run with:\n\n \n \n Product e e' -> ev e * ev e'\n \n\nvs\n\n \n \n Product e e' -> let (ex, ed) = ev e\n (ex', ed') = ev e'\n in (ex * ex', ex * ed' + ed * ex')\n \n\n(I pick the \"Product\" rule as an example because it is one of the ones that\nblows up the size of the tree)\n\nLets say you do something simple like Product X X. You get Sum (Product X One)\n(Product One X) out, and then you have to evaluate each node.\n\nIn the second case, you match Product e e'. You process X and assign (x,1) to\n(ex,ed), and process the second X and assign (x,1) to (ex', ed'), and then\nreturn (ex * ex', ex * ed' \\+ ed * ex').\n\nIn the first case, you've first differentiated 3 nodes (Product + 2x \"X\"),\nthen evaluated the 7 nodes that was produced as output, for a total of ten\nnodes processed.\n\nIn the second you've evaluated/differentatiated 3 nodes in one go without the\nintermediate step of having to evaluate a much larger tree.\n\nIn a large example, the number of nodes in the differentiated output quickly\nexplodes and subsequent evaluation would increase rapidly in cost.\n\n~~~\namelius\nFrom an asymptotic complexity viewpoint, I don't see any difference between\nthe two algorithms (AD versus building an expression tree and doing it\nsymbolically, then evaluating). Both are linear in the \"size\" of the\nexpression. 
So I don't understand what you mean by \"quickly explodes\".\n\n~~~\ntome\nSee here:\n\n[http://h2.jaguarpaw.co.uk/posts/why-is-naive-symbolic-\ndiffer...](http://h2.jaguarpaw.co.uk/posts/why-is-naive-symbolic-\ndifferentiation-slow/)\n\nIn summary, yes both functions are linear, but the size of the symbolic\nderivative is quadratic.\n\n------\nkazinator\nI would swear I read some Lisp-related paper about this with some nice Lisp\nadvocacy in it, too.\n\nAha, here it is:\n[http://www.cs.berkeley.edu/~fateman/papers/ADIL.pdf](http://www.cs.berkeley.edu/~fateman/papers/ADIL.pdf)\n\n> _For fans of Lisp, there is no question that one motivation is to show how\n> easy it is to implement in Lisp. Lisp provides a natural representation for\n> programs as data and a natural form for writing programs that write\n> programs, which is what we do in ADIL. The code is short, and is in ANSI\n> standard Common Lisp. Since it is not \"naive\" code written in just the\n> obvious idioms of introductory Lisp, it illustrates, for those with only a\n> cursory familiarity, that Lisp is more than CAR and CDR. In fact we did not\n> use those routines at all._\n\nAh, but you did use A and D. cAr-tomatic cDr-ivation!\n\n:)\n\n------\ndavexunit\nSurprised to not see SICP's excellent section on symbolic differentiation not\nmentioned here:\n[http://sarabander.github.io/sicp/html/2_002e3.xhtml#g_t2_002...](http://sarabander.github.io/sicp/html/2_002e3.xhtml#g_t2_002e3_002e2)\n\n------\n33a\nAlgebraically, automatic differentiation is the same as adding a nilpotent\nelement e, such that e^2=0 to your algebra. You can continue this pattern out\nto get higher order derivatives. For example, if you also add an element f\nwhere f^3=0, the coefficient of f is proportional to the second derivative.\n\n~~~\namelius\nSounds interesting. 
But could you explain this so that people without a\npostgraduate degree in mathematics can understand this?\n\n~~~\neru\nThe idea might be similar to [https://en.wikipedia.org/wiki/Non-\nstandard_analysis](https://en.wikipedia.org/wiki/Non-standard_analysis)\n\nEdit: Actually,\n[https://en.wikipedia.org/wiki/Smooth_infinitesimal_analysis](https://en.wikipedia.org/wiki/Smooth_infinitesimal_analysis)\nseems much closer.\n\n~~~\namelius\nInteresting. But...\n\n> by denying the law of the excluded middle, e.g., NOT (a \u2260 b) does not imply\n> a = b\n\nOuch, that is where my brain starts hurting.\n\n------\nturkishrevenge\nConal Elliot's paper on the subject is a really good starting point:\n[http://conal.net/papers/beautiful-\ndifferentiation/beautiful-...](http://conal.net/papers/beautiful-\ndifferentiation/beautiful-differentiation.pdf)\n\n------\nnwhitehead\nThis example makes it really clear what's going on. Could someone translate it\nto do reverse automatic differentiation? That's the one I never quite\nunderstand.\n\n~~~\nAnimats\nYou mean integration? That's much harder, but has been done automatically.\nSymbolic differentiation is easy, because you can just keep blindly applying a\nset of rewrite rules until none of them apply. That process converges on a\nunique result. Symbolic integration doesn't converge in that way. More\nstrategy is required, and you're not guaranteed a closed form solution.\nMathematica has a good symbolic integrator.\n\n~~~\ncperciva\n_Symbolic integration doesn 't converge in that way. More strategy is\nrequired, and you're not guaranteed a closed form solution._\n\nHowever, if a closed-form solution exists which can be expressed in terms of\nthe operations + - * / exp log, then it is guaranteed to be found.\n\n------\neru\nGreat article! 
It might benefit from a comparison with Oleg's Typed tagless-\nfinal interpretations ([http://okmij.org/ftp/tagless-\nfinal/course/](http://okmij.org/ftp/tagless-final/course/)).\n\n------\njfoutz\nThe thing that's great about the typeclass approach, you can do anything you\nwant behind the implementation. you can numerically evaluate the expression,\nbut even cooler, you can recover the parse tree. I never could sort out how to\ndeal with 'if', because it's not typeclassed. if it was, boy could you do some\namazing stuff. partial differentiation, tree rewriting, with the LLVM stuff\nyou could runtime compile arbitrary functions. super neat trick.\n\n~~~\nmpweiher\nThat's also what's great about pure dynamic systems like Smalltalk. 'ifTrue:'\nis just a message-send, the class \"False\" doesn't evaluate the block\nparameter, the class \"True\" does. And yes, you can then recover the parse\ntree, for example the Gemstone OODB lets you use arbitrary blocks (closures)\nas queries, recovers the parse tree and then creates an optimized query from\nit. Quite neat.\n\n------\nfinin\nTHE LISP DIFFERENTIATION DEMONSTRATION PROGRAM, K. 
Kaling, Artificial\nIntelligence Project, RLE and MIT Computation Center, AI Memo 10, 1959.\n\n[https://archive.org/stream/bitsavers_mitaiaimAI_878286/AIM-0...](https://archive.org/stream/bitsavers_mitaiaimAI_878286/AIM-010_djvu.txt)\n\nftp://publications.ai.mit.edu/ai-publications/pdf/AIM-010.pdf\n\n------\ndeepnet\nCould Automatic Differentiation learn by applying learnable weights to the\ncompiled S-expression atoms - Backpropagating Errors applied with Stochastic\nGradient Descent ?\n\nA program would constrain a task with functional statements, which is then\ncompiled to weighted s-expressions which learn the specific task from training\ndata.\n\nA sort of Neural-net functional program hybrid.\n\n------\nnotthemessiah\nDual numbers are just a means of formalizing the properties of an epsilon (a\ntiny change in calculus), and are the means of preserving enough information\nto think of the function and its derivative at the same time. EG: (x + \u03b5)\u00b2 =\nx\u00b2 + 2\u03b5 + \u03b5\u00b2, but \u03b5\u00b2 = 0, so we get x\u00b2 + 2\u03b5, (a tiny change squared becomes an\ninsignificantly tiny change)\n\n------\nplatz\nForward-mode AD doesnt't really scale. Reverse kode AD is useful for the\nbackpropigation algo in machinelearning however\n\n~~~\nguest1539\nWhat part doesn't scale?\n\n~~~\nkxyvr\nI'm a little late to the party, but hopefully this'll explain.\n\nBasically, you have to be careful about what it means to scale or not scale.\nIf all you want is a derivative with respect to a single variable, forward\nmode scales just fine, great in fact. However, if you want the gradient, or\nthe derivative with respect to every variable, then the forward mode does not\nscale well at all with respect to the number of variables. Specifically,\nassume we have m variables. In order to calculate the derivative of an\nexpression with respect to 1 variable is 2 times the cost of a function\nevaluation, 2 * eval. 
In order to see this, it's easiest to note that we don't\nneed an expression tree for forward mode AD like the article uses. Really, we\ncan get away with just a tuple that contains the function evaluation as the\nfirst element and the partial derivative as the second element. Then, all of\nthe rules are basically the same as the article, but we're always doing one\noperation on the first element, whatever the function is, and a different\noperation on the second element for the partial derivative. This is twice to\nwork, so 2 * eval. Since we have m variables, this becomes 2 * m * eval. And,\nyes, memory layouts, fewer optimizations for algebraic data types compared to\nfloats, etc. mean that it's actually slower, but, honestly, it's pretty fast.\n\nThe reverse mode is different because it turns out that it can calculate the\nentire gradient, or all m partial derivatives, with 4 * eval cost. Note, this\nis independent of the number of variables. Proving this is a pain, so I can't\ngive a good explanation here. Realistically, source code transformation tools\nperform around 10-20 * eval. Operator overloading tools perform around 20-30 *\neval, so it's slower in practice, but pretty damn good.\n\nNow, unlike the forward mode, where we really only need a tuple to carry\ninformation, the reverse mode does require an expression tree. In order to\nunderstand why, it helps to note that the forward mode is really a directional\n(Gatteaux) derivative and the reverse mode is the total (Frechet) derivative.\nThis affects how the chain rule manifests. Specifically, the forward mode\nrepeatedly applies two rules\n\n(f o g)'(x) dx = f'(g(x)) g'(x) dx\n\n(f o (g,h))'(x) dx = f'(g(x),h(x)) (g'(x)dx,h'(x)dx)\n\nBasically, in the function evaluation, we do some operation g before f. In\norder to figure out the derivative, we also do the g derivative operation\nbefore the f derivative operation. 
The first rule is for unary operations like\nnegation and the the second rule is for binary operations like addition.\nAnyway, the reverse mode takes the Hilbert adjoint of this. Specifically:\n\n(f o g)'(x)^* = g'(x)^* f'(g(x))^*\n\n(f o (g,h))'(x)^* = [g'(x)^* h'(x)^* ]f'(g(x),h(x))^*\n\nWe care about the adjoint because of a trick from the Riesz representation\ntheorem. Specifically,\n\nf'(x)dx =\n\n(f'(x)dx)1 =\n\n<f'(x)dx,1> =\n\n<dx,f'(x)^* 1> =\n\n<dx,grad f(x)>\n\nwhere <.,.> denotes the inner product. Anyway, basically the gradient of f is\nthe adjoint of the total derivative of f applied to 1. Therefore, if we knew\nthe adjoint of a computation applied to 1, we'd get the gradient. In other\nwords, we can rewrite the chain rule above as\n\ngrad (f o g)(x) = g'(x)^* grad f(g(x))\n\ngrad (f o (g,h))(x) = [g'(x)^* h'(x)^* ]grad f(g(x),h(x))\n\nThat's the core of reverse mode AD. Note, many, if note most descriptions of\nreverse mode AD talk about doing the chain rule in reverse and then they add\ndual variables, etc. That may be a description that's helpful for some, but\nnot for me. In truth, it's just a bunch of adjoints applied two one and\nknowing the Riesz representation trick.\n\nNow, the reverse mode AD does require an expression tree to be kept. The\nreason for this is that the computation about did g before f. However, if we\nlook at the chain rule we have\n\ngrad (f o g)(x) = g'(x)^* grad f(g(x))\n\nThis means that in order to calculate the gradient of the composition, we need\nto know the gradient of f first even though we did the evaluation of g first.\nHowever, we need to know the evaluation of g in order to calculate the\ngradient of f. The way we resolve this is that we evaluate the functions in\norder, but keep an expression tree of what we did. This gives all of the g(x),\nf(g(x)), etc. Then, we run over that expression tree backward to calculate all\nof the gradients. 
Because we run over the expression tree backwards, we call\nthis the reverse mode.\n\nHow we run over the expression tree backwards is important and tricky to do\nright. The way that we can sort of see that we can do everything in 4 * eval\ncost is that the trick is not to create multiple vectors to store the gradient\nwhen running over the tree, but to have 1 vector and to update this vector\nwith the new derivative information when required. Basically, we're just\ninserting information in the right spots, which can be done efficiently. In\npractice, storing the expression tree in memory can be really expensive. For\nexample, imagine a for-loop that had 10 billion loops. That's a really long\nexpression tree to hold in memory. Now, source code transformation tools are\nreally clever and don't actually store all of those expressions in memory, but\njust run back the for loop, which is why they're more efficient. Operator\noverloading techniques (algebraic data types) can technically optimize this as\nwell by doing some interesting caching techniques. However, the overall idea\nis that it can be expensive and there are lots of ways to do this wrong, but\nalso lots of places to do things right and be creative.\n\nAs aside to a comment left above, back propagation is indeed just reverse mode\nAD combined with a nonglobally convergent version of steepest descent. I've\nnever seen a paper that worked this out, but it's something that's known\nwithin the AD community. Someone, someday, should really write that down.\n\nAnyway, that's probably a much too long response to your simple question. In\nshort, forward mode doesn't scale when calculating gradients because the cost\nis 2 * m * eval whereas the reverse mode can do it in 4 * eval. For a single\nvariable, or an entire directional derivative, the forward mode scales fine\nand in fact works better than the reverse mode for this case.\n\nEdit: This formatting is killing me. 
Hopefully, it all looks fine now.\n\n"], ["\n\nIs the Solar System Really a Vortex? - swamp40\nhttp://www.universetoday.com/107322/is-the-solar-system-really-a-vortex/\n\n======\nswamp40\nIt's not often you get to hear an astrophysicist say \"...then we\u2019re all\nbuggered.\"\n\n"], ["\nTheoretical Motivations for Deep Learning - rndn\nhttp://rinuboney.github.io/2015/10/18/theoretical-motivations-deep-learning.html\n======\nchriskanan\nThere is a recent 5 page theoretical paper on this topic that I thought was\npretty interesting, and it tackles both deep nets and recurrent nets:\n[http://arxiv.org/abs/1509.08101](http://arxiv.org/abs/1509.08101)\n\nHere is the abstract:\n\nThis note provides a family of classification problems, indexed by a positive\ninteger k, where all shallow networks with fewer than exponentially (in k)\nmany nodes exhibit error at least 1/6, whereas a deep network with 2 nodes in\neach of 2k layers achieves zero error, as does a recurrent network with 3\ndistinct nodes iterated k times. The proof is elementary, and the networks are\nstandard feedforward networks with ReLU (Rectified Linear Unit)\nnonlinearities.\n\n------\narcanus\n1) I am curious about learning more about the statement: \"Deep learning is a\nbranch of machine learning algorithms based on learning multiple levels of\nrepresentation. The multiple levels of representation corresponds to multiple\nlevels of abstraction. \"\n\nWhat evidence exists that the 'multiple levels of representation', which I\nunderstand to generally be multiple hidden layers of a neural network,\nactually correspond to 'levels of abstraction'?\n\n2) I'm further confused by, \"Deep learning is a kind of representation\nlearning in which there are multiple levels of features. These features are\nautomatically discovered and they are composed together in the various levels\nto produce the output. 
Each level represents abstract features that are\ndiscovered from the features represented in the previous level. \"\n\nThis implies to me that this is \"unsupervised learning\". Are deep learning\nnets all unsupervised? Most traditional neural nets are supervised.\n\n~~~\njoe_the_user\nThe whole presentation seems very hand-wavy, which I think is pretty much the\nlevel most motivational discussions of deep learning are at.\n\nI think the presentations by Yann Lecun and Leon Bottou are more interesting -\nand tend to involve more uncertainty and fewer pronouncements.\n\nsee:\n[https://news.ycombinator.com/item?id=9878047](https://news.ycombinator.com/item?id=9878047)\n\n~~~\narcanus\nThis was fascinating and greatly informative. As you said, the authors were\nnot afraid to show the real warts and bleeding edge, as a good scientist\nshould. Thanks for the link.\n\n------\ndnautics\nI wonder if \"lots of data\" is wrong. If I show you say twenty similar-looking\nChinese characters in one person's handwriting, and the same twenty in another\nperson's handwriting, you'll probably do a good job (though maybe not an easy\ntime) classifying them with very little data.\n\n~~~\nwebmasterraj\nBecause I've seen lots of other handwriting, even if in another language. I\nhave very strong priors.\n\nThe problem is that a computer comes in without knowing anything about\ntangential phenomenon. So it needs lots of data to catch up to me and my years\nof forming associative connections about other handwriting I've seen.\n\nIf I showed you alien (ie not human) handwritten samples, you'd probably\nstuggle too.\n\n------\nilurk\nWhat tools did you use to make those nice pictures?\n\n(didn't read it yet though, will do when I have time)\n\n------\nmemming\nNice. Very well organized.\n\n"], ["\nApply HN:Programmable matter - YuriyZ\nGoal: creation of programmable matter, consisting of many microscopic particles (c-atoms). 
Which can be manipulated to create a user programmed 3d form.<p>Achievements. Verified experimentally:<p>- ways to connect c-atoms with each other;<p>- the movement of c-atom relative to other c-atoms.<p>The experiments were conducted with models of c-atoms in the macro scale. The size of c-atoms models was 3 * 4 cm.<p>Tasks:\n- development of software capable of managing an array of c-atoms;\n- repeating experiments in micro-scale with size of c-atoms - less than one millimeter.\n======\npjlegato\nWhat are the possible commercial applications of this technology?\n\nHow will your company make money from this?\n\n~~~\nYuriyZ\n\\- Programmable matter will replace 3d prototyping, which is now carried out\nby 3d printers. \\- Will be used in telecommunications. The effect of presence\n- Pario. \\- The technology will be used in medicine. The surgeon will be able\nto operate on the patient by manipulating programmable matter, which will be\nan enlarged, precise, copy of the operated area. \\- Toys (gadgets)\ntransformers. The company will make money by selling and renting devices from\nthe programmable matter.\n\n"], ["\nAsk HN: Good Resources for Data Engineering - fargo\nI am looking for some example case studies&#x2F;exercises in order to learn play with some libraries, is there a book or website you can recommend?\n======\niso1337\nkleppmann\u2019s book: designing data-intensive applications.\n\nIt\u2019s very well written, but maybe doesn\u2019t have as much in the way of\nexercises.\n\n~~~\nfargo\nThanks for the excellent recommendation, I have been through kleppmann's book\nand it's a must for anyone who wants to be serious about data engineering (or\nwhatever it's called these days). I am looking however for something more\npractical and less technical, maybe something like projecteuler or cracking\nthe cracking the coding interview but for data\n\n~~~\niso1337\nIMHO data eng is too niche and new for that kind of content. 
But I would love\nto see if there is anything out there like that.\n\nIs the goal here to get through system design interviews or something like\nthat? You can check out pramp.com if so.\n\nIf it\u2019s for learning, then reading some of the original Google papers behind a\nlot of the big data technologies has been very rewarding for me. You could try\nreimplementing the paxos algorithm for example.\n\n~~~\nfargo\nI am a bit rusty with spark and I have a practical interview where I will be\ngiven various datasets to extract insights from them.\n\n"], ["\n\nAsk HN: What is happening on Mt. Gox right now? - ljd\n\nhttp:&#x2F;&#x2F;www.bitcoin.clarkmoody.com&#x2F;\nI&#x27;m not sure if anyone has been watching but someone is buying 729.2489 BTC at 850. Which wouldn&#x27;t be unusual if it wasn&#x27;t following by an exact buy of 135 of 775 right after. This cycle has happened, with the exact same amounts for the past 30 minutes and I don&#x27;t know what to make of it. It&#x27;s just in a loop. The market won&#x27;t move either way.<p>I know there is some kind of gaming going on, I just don&#x27;t know what it is, yet. Any ideas?\n======\nwashedup\nI can confirm this. I have watched the cycle happen from ~877 to ~829 ten\ntimes now, with a bid size of 729 at 850 every single time. As soon as a price\naround 829 is filled, it shoots back up to 877. Each time the cycle lasts\nroughly 5 minutes.\n\n------\nChrisClark\nIt's a bug. Mt. Gox had the same repeating bug before.\n\nBasically, don't trade on Mt. Gox. It's not a good idea.\n\n"]]
\ No newline at end of file
{"results": {"pile_hackernews": {"word_perplexity": 1.0007891841996295, "byte_perplexity": 1.0001263619959735, "bits_per_byte": 0.00012635401296887477}}, "versions": {"pile_hackernews": 0}} {"results": {"pile_hackernews": {"bits_per_byte": 0.00010170276359193358, "byte_perplexity": 1.0001017079354932, "word_perplexity": 1.0006273924348839}}, "versions": {"pile_hackernews": 0}}
\ No newline at end of file
520ea6e04e8a39dc0b5f63a837429a78a40e63d39d109096101feb8c5b2cf8d8
[["Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a major advance in the field of biological mass spectrometry, providing high ion yields for low quantities of biopolymers, even in the presence of buffer and other biochemical components that would preclude analysis by other mass spectral techniques. Much of the research in the field to date has concentrated on peptides, proteins, and oligonucleotides, yet the technique also holds considerable promise for oligosaccharide analyses beyond the limited work that has already been demonstrated. This application seeks to develop methods to increase the applicability of MALDI-MS to challenging problems in glycobiology. The research will be directed towards four goals: (1) maximizing the sensitivity for biopolymers, particularly oligosaccharides and glycoconjugates and their derivatives; (2) increasing the information content of the spectra through sample pretreatments that induce mass shifts; (3) developing experimental methods to enhance fragmentation and obtain systematics to help determine the rules of post-source decay fragmentation; and (4) exploring approaches to the analysis of two-dimensional surfaces, both those that occur in biological or biomimetic systems, and those that result from analytical separations or the generation of synthetic product arrays."], ["The proposed research will continue investigations on: Alterations in the metabolic pool at the site of both aqueous humor and cerebrospinal fluid production in groups of unoperated animals and in animals in which bilateral superior cervical sympathetic ganglionectomy will be performed. In addition, a third group of sham-operated animals will be used. That supersensitivity to catecholamines takes place in sympathectomized animals has been shown previously in this and other laboratories. 
Ocular fluid dynamics in all animals will be determined by repeated weekly testing of ocular pressure (IOP) (with the Mackay-Marg tonometer), as well as measurement of pupil size. A base-line of measurements will have been made for a period of about two weeks prior to the performance of any surgery. Again, prior to surgery, this base-line will be followed by a period of testing of ocular responses to phenoxybenzamine (PBA), norepinephrine (NE) and prostaglandins (PGs). At the termination of a study, the animals will be anesthetized, the eyes cannulated and outflow resistance and IOP measured. The animals will then be sacrificed and tissues and fluids removed. The metabolism and/or concentrations of cyclic adenylic acid (cyclic-AMP) and the PGs will be of particular interest. We shall, in other words, look closely at the metabolic and physiological nature of ocular and non-ocular tissues in animals which would be made catecholamine-supersensitive (via the above surgical procedure) at sites of sympathetic innervation as compared with sham-operated and unoperated animals. BIBLIOGRAPHIC REFERNCES: Waitzman, M.B. and Law, M.L., \"Changes in Prostaglandin Concentration in Blood Subjected to Repetitive Freezing and Thawing\", Prostaglandins 10:949-962, 1975. Jackson R.T., Waitzman, M.B., Pickford, L. and Nathanson, S.E., \"Prostaglandins in Human Middle Ear Effusions\", Prostaglandins 10:365-371, 1975."], ["This loss of membrane-bound beta-adrenergic receptor recognition sites in frog erythrocytes during subsensitivity induced by exposure of cells to isoproterenol is due at least in part to the internalization of beta-receptor recognition sites. When the beta-adrenergic receptors are resensitized, the amounts of internalized receptor recognition sites are also returning to the normal. The internalized recognition sites of beta-adrenergic receptors have properties identical to those of membrane-bound receptors. 
Glycoprotein seems to play an important role in triggering this event. A series of tricyclic antidepressant drugs was found to inhibit the receptor internalization; these effects can be attributed to an inhibition of these drugs of the agonist binding to beta-receptor recognition sites in erythrocyte membranes. Inhibitors of transglutaminase diminish the extent of recognition site internalization with a concomitant prevention of the loss of membrane-bound receptors elicited by isoproterenol. There is an excellent correlation between the inhibition of recognition site internalization and the potency of these compounds to block the transglutaminase activity in vitro."], ["Activation of cellular tumor suppressor pathways is the cell's major defense against cancer induced by activated oncogenes. The ARF-p53 tumor suppressor pathway, which is one of the most important in mammalian cells, can be activated by a number of viral and cellular oncogenes. Activated ARF induces a p53-mediated block to cell division via a cell cycle arrest or apoptotic cell death. How oncogenic stress activates ARF remains to be elucidated. Polyoma virus middle T-antigen (PyMT) is a potent oncogene able to bind a number of key regulatory cellular proteins and activate a number of important cellular signaling pathways including the ARF-p53 tumor suppressor pathway. We will use PyMT as a model oncogene in order to better understand how ARF is being activated. We hypothesize that PyMT induces ARF by the inappropriate activation of one or more cellular signaling pathways that also mediate the ability of PyMT to transform cells. Our plan is to identify the cellular signaling pathways induced by PyMT that results in activation of ARF. REF52 cells differ from most other established cell lines in containing an intact ARF-p53 tumor suppressor pathway and are distinct in resembling primary cells in their requirement for oncogene cooperation for their transformation. 
PyMT activates the ARF-p53 pathway, blocking REF52 cell division and will not transform REF52 cells in the absence of a co-operating oncogene. We plan to take advantage of these unique properties of REF52 cells to isolate PyMT and cell mutants that are involved in the activation of ARF. Three interrelated aims will be pursued. Aim-1 will be to use previously isolated PyMT mutants to identify which domains of PyMT are required to activate ARF. Aim-2 will be to use mutagenized PyMT to identify sequences in both defined and undefined regions of PyMT that are required for the activation of the ARF-p53 pathway in REF52 cells. Aim-3 will be to isolate and define REF52 cellular mutants in which PyMT signaling fails to activate the ARF-p53 tumor suppressor surveillance pathway. We believe that such cell mutants will help to differentiate between normal and oncogene activation of important cellular signal transduction pathways. Understanding the mechanism(s) of oncogenic activation of the ARF-p53 tumor pathway will help in designing better drugs and therapies for the treatment of cancer."], ["Current studies have indicated that antibodies directed against the stalk region of CD23 cause enhancement of IgE synthesis in both the human in vitro and mouse in vivo systems. CD23 transgenic mice, which overexpress CD23 on all lymphocytes and FDCs, exhibit drastically reduced IgE production in both helminth and alum/ag models. The data suggest a model where the role of CD23 is initially to serve as a component of innate immunity to signal for IgE production by becoming destabilized and cleaved and later by overexpressing at the cell surface and modulating IgE production. This continuation application proposes to investigate the mechanism of these effects. Aim#1 examines the mouse system where the destabilizing mab 19G5 gives enhanced IgE synthesis in vivo. In the current funding, the metalloprotease, ADAM10 has been identified as the primary CD23 sheddase in mouse and humans. 
The role of ADAM10 in allergic disease will be modeled by making transgenic mouse that overexpress ADAM10 or dominant negative ADAM10. In addition, we will examine the mechanism for the 19G5-induced IgE production by investigating the association of CD23 with another negative signaling molecule, LAX, which has recently been shown to both modulate CD23 expression and regulate IgE levels. Aim#2 will investigate the affect of CD23 overexpression and CD23 destabilization on the mouse asthma model with respect to both modulation and exacerbation of disease. We will utilize both IgE and the new ADAM10 transgenics in order to evaluate the mechanism of the suppression of eosinophilia as well as the capacity of CD23 to modulate the asthma phenotype. Aim#3 will investigate the human in vitro IgE synthesis models with respect to the mechanisms involved in IgE synthesis enhancement, seen with anti-stalk antibodies and synthesis suppression, seen with certain anti-lectin mabs. The importance of ADAM10 in human CD23 cleavage and IgE production will also be explored as will the involvement of LAX. Finally, we will determine if IgE production by B cells obtained from normal and allergic subjects is affected differently by destabilization or stabilization of CD23. In summary, these studies examine the mechanism of action of a natural regulator of IgE production, CD23, with the objective of developing protocols to enhance CD23 expression and thereby regulate IgE, and by analogy, allergic disease in which IgE plays a dominant role.Project Narrative: This project examines mechanisms involved in control of IgE synthesis by a natural regulator. The latter is CD23, a low affinity receptor for IgE. Accumulated evidence indicates that cleavage of CD23 by the metalloprotease ADAM10 increases IgE production in both mouse and humans. 
This application proposes to study mechanisms involved in this regulation in order to develop new protocols to control allergic disease."], ["The population in the U.S. is aging. Data are sparse on cardiovascular disease in older adults, especially the oldest old, who are expected to number 18 million by 2050. The age at first myocardial infarction (MI) is increasing substantially, now to about age 71, and most of the cardiovascular disease (CVD) deaths in the U.S. occur in older individuals. CVD is the single most important cost of medical care in the US. In the early 1990s, the Cardiovascular Health Study (CHS), an NHLBI-supported cohort study of risk factors for coronary heart disease (CHD) and stroke in adults 65 years or older, recruited 5888 participants, who underwent extensive examinations at baseline and annually until 1999. Examinations included traditional risk factors as well as measures of subclinical disease. Since 1990, CHS has made major contributions to the understanding of CVD in older individuals, including the risks associated with subclinical disease, inflammation, diabetes, hypertension and renal disease. In 2008, CHS has the unique opportunity of evaluating clinical CHD in the oldest old-- MI, angina, heart failure (HF), atrial fibrillation-- and their contributions to both morbidity and mortality. In this competing continuation, the primary aim is to evaluate the incidence and determinants of cardiovascular disease and health in 1964 surviving participants aged 80 or older. The incidence of CVD in the oldest old will be related to demographic variables;measures of disability, physical functioning, and cognitive function;measures of subclinical disease;and traditional and novel clinical risk factors as well as their change over time. The data collection proposed in this application will increase the number of events in those 80 years and older by about 50% to 75% for stroke, CHD and HF. 
Power for assessing associations with stroke, CHD and HF is increased by 15 to 30%. The continued assessment of cardiovascular events is essential to identify determinants of successful cardiovascular aging among the oldest old. PUBLIC HEALTH RELEVANCE: Additional knowledge about the determinants of cardiovascular health and disease in older adults will help physicians and their patients make diagnostic, prognostic, and therapeutic decisions. PROJECT/"], ["Despite the detailed understanding of signals 1, 2, and 3 and the critical function of T cells and dendritic cells (DC), clinical transplantation tolerance is almost never achieved. In transplantation, models of immune function, and interventions designed to promote immune regulation, have been based on simplified receptor-ligand and cell-cell interaction models. There has been a lack of studies that place immunological recognition within anatomic contexts and evaluate the critical role of anatomic microdomains in the regulation of the immune response. Our preliminary data now demonstrate that during tolerization there is alloantigen specific clustering of T cells and plasmacytoid DC (pDC) in the T cell areas of the lymph node (LN) near the abluminal surface of the high endothelial venules (HEV). Within these clusters T cells undergo either priming or development into de novo CD4+CD25+ regulatory T cells (Treg). Importantly, B cells presenting specific alloantigen are present in these cell clusters and contribute to Treg development. We hypothesize these multicellular clustered interactions in the LN are key to the induction and maintenance of tolerance. Specifically, we hypothesize that the precise interaction of T-APC (pDC, B cells) determines the outcome of T cell migration, positioning, proliferation, and maturation, and ultimately whether rejection or tolerance are induced. 
This hypothesis integrates many of the known receptor-ligand and cell-cell interactions, and places these interactions in the context of secondary lymphoid organ structure. To investigate the role of these cellular and structural elements in LN clustered interactions, we propose the following specific aims: Specific Aim 1. What are the T-APC-LN interactions that are important for tolerance? Using a transplant model that allows the tracking of alloantigen specific T cells, specific alloantigen presenting APC, and the positioning of the cells with respect to HEV, we will characterize specific receptor-ligand interactions between T-HEV and pDC-HEV that regulate tolerance, and define the interactions between T-pDC and other DC that determine the generation of Treg and inhibition of effector T cells within the LN. Specific Aim 2. How is the priming of effector T cells altered in the LN during tolerance? This aim will study the specific interaction between effector T cells and Treg in the LN clusters during tolerization. We will determine how Treg-T effector interactions regulate effector T cell migration, proliferation, and differentiation. We will characterize specific receptor-ligand interactions between Treg and T effector cells that determine whether T cell priming and rejection versus suppression and tolerance predominate in the immune response. Specific Aim 3. Determine the role of B cells in tolerance in the LN We will investigate the roles of B cells in the LN during the T-APC clustered interaction that results in tolerization. Analyses will focus on B cell APC function, chemokine production, and immunoglobulin production. PUBLIC HEALTH RELEVANCE: The research will investigate and define the cellular and molecular interactions that are important for generating regulatory suppressor T cells and antigen specific tolerance. 
The ability to define these interactions and achieve tolerance is important for achieving good graft survival and patient survival for transplant recipients."], ["Population-based surveys estimate that the prevalence of methamphetamine (meth) use is 20 times higher among men who have sex with men (MSM) compared to the general population. Meth-associated sexual risk behavior is also a driving force in the MSM HIV epidemic: meth use is consistently associated with high-risk sexual behavior and sexually transmitted diseases, including HIV. Despite these alarming data, relatively few interventions have been tested among meth-using MSM, and no studies have tested the efficacy of pharmacologic interventions in reducing meth use in this population. Pilot studies demonstrate that aripiprazole (Abilify), an FDA-approved, well-tolerated antipsychotic and partial dopamine agonist, reduces the effects of meth in humans, decreases meth craving, and exhibits an excellent safety profile. Partial agonists - - ligands with target receptor affinity but low intrinsic activity - - have already been shown to be effective in treating opiate and nicotine dependence. In response to the compelling evidence supporting aripiprazole and the partial agonist approach, we propose conducting a randomized, double-blind, placebo- controlled trial of intermediate size (60 participants) and length (90 days of follow-up) to assess the efficacy of aripiprazole in reducing meth use among meth-dependent, sexually active MSM. Our specific aims are: 1) To test the hypothesis that aripiprazole 20 mg daily will reduce meth use significantly more than placebo among meth-dependent MSM, as determined by the proportion of meth-negative urines and by self- report of meth use in the aripiprazole versus placebo group. 2) To measure the acceptability of aripiprazole and placebo among meth-dependent MSM, by determining (via electronic pill caps and self-report) medication adherence to aripiprazole and placebo. 
3) To measure the safety and tolerability of aripiprazole and placebo among meth-dependent MSM, as determined by the number of adverse clinical events in the aripiprazole and placebo arms. All participants will receive HIV risk-reduction counseling and brief substance use counseling. If promising, we anticipate that study results will be used to design a phase III study to determine if aripiprazole's effects on reducing meth use lead to significant reductions in meth-associated sexual risk and, if the trial sample size is appropriate, HIV incidence. [unreadable] [unreadable] [unreadable]"], ["DESCRIPTION (taken from the application: The concept of orthopaedic research encompasses a very broad area spanning the field of musculoskeletal research from the clinical aspect to both biological and biomechanical basic sciences. The Bioengineering and Orthopaedic Science Gordon Research Conference has been the premier conference for investigators to combine the three disciplines of orthopaedic sciences. The format of these conference, held in a small private school in New Hampshire every other summer, provides a unique environment for the exchange of new and unpublished information. The cross fertilization that can only come from a conference were clinicians, engineers and biologists meet, is more vital to this field than ever before. For the first time, biological principles are being used in the treatment of orthopaedic injuries and diseases and these treatments must confirm to and meet the biomechanical needs of the skeleton. For example, investigators and clinicians are now beginning to utilize molecules such as parathyroid hormone; parathyroid hormone related peptide, insulin-like growth factors, fibroblast growth factor and the hedgehogs. In addition, physical parameters such as ultrasounds and electrical stimulation are currently being used to stimulate bone growth. 
Bioengineering has moved from the mechanical design of joint implants to the problems associated with long-term use of implant materials and a general understanding of the biological effects of these materials in the living system. For the first time, a session will be held on skeletal tumors. Two noteworthy changes will be made in the format from previous meetings. In the comments on the 1996 Gordon Conference Program, many attendees felt that the junior members of the audience did not feel comfortable speaking out and that the discussion were dominated by only a few senior scientists. In an effort to increase the comfort level of the junior scientists and to increase their participation in general, the following changes were implemented: (1) five to ten scholarships will be designated for junior faculty and postdoctoral fellows and their availability will be announced in conference literature; (2) One evening will be dedicated to talks by junior scientists. Employing these options to encourage the participation of junior scientists has worked very well for other Gordon Conferences such as the Proteoglycan Gordon Conference which uses the scholarship mechanism and Molecular Genetics which uses the junior scientists evening option."], ["In the past year, effort in nuclear magnetic resonance imaging has been expanded on several research fronts. A. Following the failure of the magnet manufacturer to install successfully the clinical NMR imaging system in the Department of Radiology, Clinical Center, the research group took over the homogenizing of the magnetic field necessary for clinical imaging, and, using a new technique, achieved the specified homogeneity, thereby enabling the Radiology Department to produce high quality images. B. Considerable controversy exists within the NMR image community over the choice of an optimal field strength for imaging. 
Bench methods were developed for accurately assessing the signal-to-noise ratio from, and radio-frequency power deposition in, the human body at any frequency used for imaging. It is hoped these results will help resolve the matter. C. A pulse for highly selective spin population inversion has been discovered. The result is of considerable experimental and theoretical importance for it represents only the second known analytical solution of the non-linear differential equations governing the motion of an NMR spin system. Further, above a critical threshold, the inversion is independent of applied power. D. A so-called \"quadrature\" probe system has been invented which reduced radio-frequency power dissipation in the body by almost a factor of 2 while improving signal-to-noise ratio by almost 40%. E. The electronic upper frequency limit for adult head imaging has been pushed from 84 MHz to 130 MHz with the aid of a novel phased-array receiving coil."]]
\ No newline at end of file
{"results": {"pile_nih-exporter": {"word_perplexity": 1.0012351076976023, "byte_perplexity": 1.0001828284896763, "bits_per_byte": 0.00018281177858479804}}, "versions": {"pile_nih-exporter": 0}} {"results": {"pile_nih-exporter": {"bits_per_byte": 0.00024394433346975716, "byte_perplexity": 1.0002439740903082, "word_perplexity": 1.0016712202288802}}, "versions": {"pile_nih-exporter": 0}}
\ No newline at end of file
0f1c23a1f4ddec0c2b1ff34de8d1505b0eb9e2868d8edbcc1b6de13d02f32036
\ No newline at end of file
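The 64-character hex strings in this commit (such as the one above) are consistent with SHA-256 digests, which fits the commit message "Use hashed version stability test instead": comparing a stored digest instead of the raw fixture keeps expected-output files small. A minimal sketch of how such a check could work — the function name and serialization choices here are hypothetical, not the repository's actual helpers:

```python
import hashlib
import json

def stability_digest(samples) -> str:
    """Hash a deterministic serialization of test data so a stored
    64-char digest can stand in for the full (possibly huge) fixture."""
    blob = json.dumps(samples, sort_keys=True, ensure_ascii=True).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()

# The digest is reproducible for identical data and 64 hex chars long,
# so a stability test can assert equality against a checked-in string.
expected = stability_digest([["some fixture text"]])
assert stability_digest([["some fixture text"]]) == expected
assert len(expected) == 64
```

Determinism is the crux of the design: `sort_keys=True` and `ensure_ascii=True` pin the serialization so the digest only changes when the underlying data does.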
{"results": {"pile_opensubtitles": {"word_perplexity": 1.0000686899705005, "byte_perplexity": 1.0000122723022777, "bits_per_byte": 1.2272226973582225e-05}}, "versions": {"pile_opensubtitles": 0}} {"results": {"pile_opensubtitles": {"bits_per_byte": 1.5213441136639177e-05, "byte_perplexity": 1.0000152135568616, "word_perplexity": 1.0000856162053249}}, "versions": {"pile_opensubtitles": 0}}
\ No newline at end of file
5d6c19665f429ab1ccbe027da67f42bdaf219f819ab093673976eee55e015ff4
\ No newline at end of file
[[".org Jingles\n\nHere you will find the complete text of the original Burma-Shave jingles, as archived in Frank Rowsome Jr.'s book The Verse by the Side of the Road."], ["Abandoned Ontario Mansion Hidden in the Woods\n\nEvery year it seems that we cannot top the amazing abandoned places that were found and discovered that year until something like this incredible abandoned Ontario Mansion comes along.\n\nIt is mind blowing to imagine that with all of the abandoned enthusiasts, urban explorers and curious adventurous photographers out there that this place hasn\u2019t been discovered until now (as far as I know)\n\nThe story of how this place came to be known is that of luck and a fluke, a young explorers grandfather used to work in the area of this house and he knew of it, but had no reason to mention it. Then he learned of his grandsons hobby of exploring and photographing old abandoned houses, the grandfather told his grandson where to find it and to see if it\u2019s still there. He followed the directions provided, hiked through a deep forest and finally found himself in front of a large and overgrown mansion.\n\n\n\n\n\n\n\n\n\nAs much as I would love to provide a deeper history of this amazing abandoned mansion, that would open up the location to people who have less than ideal intentions \u2013 and that is something I\u2019m not willing to do.\n\nI will say, the home had a number of wings and very long corridors, entryways and foyers, Each wing was connected by a unique round room, one of them serving as a bar. The home has a large solarium and ballroom, a courtyard with a (now filled in) swimming pool, a basement gym and Jacuzzi. There were many bedrooms, a wine cellar, kitchen, prep area and large eating area. 
I could go on and on about the features of the house, but now let\u2019s allow the photographs to do the talking.\n\nAt the bottom of the page I have included a number of videos from outside of the house as a rain storm came through the area as well as some short and fun walk through videos.\n\n\n\n\n\n\n\n\n\nExterior Photos of this Abandoned Ontario Mansion Hidden in the Woods\n\nThe exterior of a large abandoned ontario mansion, tucked away and hidden deep in a wooded area in Ontario The view out the window of the bedroom of a large abandoned ontario mansion, tucked away and hidden deep in a wooded area in Ontario\n\n\n\n\n\n\n\n\n\nAbandoned Ontario Mansion Hidden in the Woods\n\nRound Rooms in Abandoned Ontario Mansion Hidden in the Woods\n\nAbandoned Ontario Mansion Hidden in the Woods\n\nKitchen of a large abandoned ontario mansion, tucked away and hidden deep in a wooded area in Ontario\n\n\n\n\n\n\n\n\n\nPhotoSensory Clips"], ["Steam Keys\n\nTo get your Steam key, just select the game you have bought from us, enter your registered e-mail address and click 'submit'.\n\nIf you lose your key just enter your e-mail again and we'll resend it.\n\nIf you need to update your e-mail address in our database, please contact us and we'll sort it out as soon as we can."], ["The number one place for Gambino Merch. Whether you call him Donald, Gambino, Troy or Earn, we all love this guy."], ["Rescue workers help a trapped woman in Point Vernon after cutting the roof off the four-wheel driver she was a passenger in.\n\nRescue workers help a trapped woman in Point Vernon after cutting the roof off the four-wheel driver she was a passenger in. 
Roderick Makim\n\nA WOMAN was taken to hospital after a crash at a notorious traffic black spot in Hervey Bay.\n\nThe crash occurred at the intersection of Martin and Tooth Sts in Point Vernon.\n\nA white Ford four-wheel drive allegedly failed to give way and crashed into a black Hyundai Excel.\n\nNearby residents rushed to help seven people involved in the crash - four in the Ford and three in the Hyundai.\n\nA woman was trapped in the Ford and firefighters had to cut the roof off before she could be taken by ambulance to Hervey Bay Hospital.\n\nA police spokesman said the woman's injuries were believed to be minor.\n\nRon Berrell was one of the people to help passengers out of the Ford, including three teenage children.\n\n\"There were quite a few people helping out,\" he said.\n\nThere had been many crashes at the intersection in the years he had lived nearby, Mr Berrell said.\n\nHe and other residents told the Chronicle there was a need for traffic lights or a roundabout at the intersection.\n\nThe Fraser Coast Regional Council was already looking into the possibility.\n\n\"It has been deemed a problem area,\" Mayor Gerard O'Connell said.\n\nLeonard Gleeson, acting station officer from the Hervey Bay fire station, said the crash was \"a reminder for drivers to take care around school times\".\n\nPolice said the male driver of the Ford was given a traffic infringement notice for failing to give way."], ["\u201cDoes everyone know where the bike\u2019s back break is?\u201d\n\nI breathe a sigh of relief. 
So far my introductory course to downhill mountain biking at Winter Park Resort\u2019s Trestle Bike Park in Colorado is manageable.\n\nDespite knowing the answers to the preliminary questions, however, my heart is beating out of its chest and my blood pressure feels like one of those strong man carnival games where you pound a puck with a mallet to hit the top bell.\n\nI\u2019m out of breath and we haven\u2019t even left the bike shop yet.\n\nThe full-face helmet, knee and shin guards, gloves, and full-body padding warn me the day is about to get more adventurous \u2026 by a lot.\n\nDespite riding my bike almost daily around NYC and exploring trails around the USA like Idaho\u2019s Route of the Hiawatha trail, downhill mountain biking has always been a fear of mine. It just looks so out of control, with so many possibilities for what could go wrong:\n\nFalling off a mountain, bashing into a tree, tumbling over the handlebars, simply looking like an idiot.\n\nDeep breathing and lots of water help calm me down, and I vow not to let myself back out.\n\nFunny enough, I didn\u2019t sign up for the lesson because I thought I would be the best \u2014 although one can hope; instead, I booked this excursion because I wanted to face a fear. I wanted to use travel for one of its most powerful purposes: growth.\n\nMy previous days in Colorado\u2019s Grand County had been full of Rocky Mountain hiking, high altitude kayaking and dude ranch barbecuing \u2014 all things I loved, and knew I would love when deciding to do them.\n\nDownhill mountain biking was something that could very well bring me to tears, break my bones or just be something I hated. It could also be something I excelled at, made it through with a smile and absolutely loved. We would soon find out.\n\nWorst came to worst, I would leave with some gnarly scars and a great story to tell. 
See, I like to think positive.\n\nDownhill Biking: Learning The Basics\n\nAfter a chat about safety and how to properly downhill bike, our group of eight, six students and two instructors, Christy and John, spend some time on the bunny hill practicing our strong stance \u2014 standing tall without locked knees, arms wide on the bars with elbows out, body slightly back, and pedals flat on an even plane.\n\nWe also talk about our experience with downhill biking. Let\u2019s just say my title of \u201curban commuter cyclist\u201d wasn\u2019t going to get me any street mountain cred with these folks.\n\nI feel pretty good about my stance, although will every passing moment I feel myself getting farther from the bunny hill and closer to the day\u2019s terrifying adventure.\n\nWhich begins with the lift.\n\nI actually enjoy heights, and thus enjoy chairlifts. While I\u2019ve taken chairlifts skiing, the downhill bike chairlifts alternate bike and human chairs.\n\nTo get my bike on, I follow the chair as it drifts away from me and run my bike up a metal ramp slot until both wheels are secure.\n\nAs I use all my strength \u2014 which I\u2019ll admit isn\u2019t much \u2014 to get the bike into the slot the front wheel awkwardly turns away from where it needs to go. Luckily, John is there to grab the bike and fix it.\n\nShake it off, Jessie.\n\nThe next chairlift picks me up, and I forget fear during the 10-minute flight, views of lodgepole pines, Parry Peak (which looks oddly like a catcher\u2019s mitt), and the Continental Divide captivating my mind.\n\nWhen I get off the lift, I\u2019m standing at 10,500 feet (3,200 meters).\n\nIt\u2019s from this high altitude that I\u2019ll begin downhill biking.\n\nThe Adventure Begins\n\nTrestle Bike Park features over 40 miles (64 kilometers) of mountain bike trails, although there is only one green (beginner) trail. 
Like skiing, trails are labeled by colors \u2014 green, blue and black, along with special blue blacks and double blacks \u2014 with much of the park\u2019s terrain designated for advanced riders.\n\nThe single-track green trail I\u2019d be embarking on, called Green World, is about 5.5 miles (nine kilometers) long, with plenty of action along the way to satisfy any adrenaline junkie.\n\nJohn leads while Christy pulls up the rear, with students cycling over a downhill wooden rollover \u2014 I have a bit of a panic attack with this \u2014 as well as rough and rolling terrain.\n\nI\u2019m not used to the bumps and rocks, and find myself staring down at the ground as well as at the many pines I don\u2019t want to crash into.\n\nI can\u2019t stress enough how bad this is. When downhill mountain biking you need to look at the path ahead, focusing on where you want the bike to go.\n\nYour bike tends to follow your eyes, so if you\u2019re staring at the scary rocks and cliff ledges you don\u2019t want to interact with you may find yourself doing just that.\n\nWhile the instructors have told us this, it\u2019s much easier said than done. Especially when we get to the trail turns.\n\nWe\u2019re told to look ahead and to gently brake before the turn \u2014 not during or things can get hairy. Braking is possibly one of the most important skills to master for downhill biking.\n\nIt should be gentle with one finger, assuming your bike has good brakes, which it should. Brake too hard with your rear break \u2014 the one on the right (in the US, different for motorcycles) \u2014 and you\u2019ll just damage the bike and the trail.\n\nBrake too hard with your front brake and you\u2019ll likely find yourself flipped over the handlebars when the bike halts to a standstill. It\u2019s truly an art to master.\n\nAnd I do, albeit shakily, and not without yelping like a wounded animal. 
I\u2019m sweating so much from anxiety I need to remove my sunglasses and pour water on my face.\n\nI\u2019ve skydived, bungee jumped, zip-lined, rock climbed, swam cage-less with hammerheads and gone canyoning without a second thought; but, for some reason, this downhill biking seriously freaks me out.\n\nThere\u2019s something about being in control of your own life using a skill you\u2019re not sure you possess that\u2019s seriously terrifying.\n\nThe thing is, I\u2019ve now gone up the chairlift and am in the middle of the woods, on a downhill single-track trail that still has five miles to go. There\u2019s no turning back. I\u2019ve either got to suck it up, believe in myself and get on with it, or ask to be air rescued.\n\nLuckily, I\u2019ve got a little too much pride for the latter.\n\nWhen Being The Worst Is A Good Thing\n\nDespite my worries of being the worst in the group, it\u2019s soon something I really just have to own, especially when John splits us up into two groups \u2014 one of five, and one of one. Me. He\u2019s trying so hard to be nice about the fact I\u2019m 10 times slower than everyone else that I have to laugh.\n\n\u201cIt\u2019s okay, I\u2019m fine with being the worst,\u201d I laugh. \u201cI\u2019m glad for the one-on-one practice.\u201d\n\nThe rest of the group is supportive, wishing me luck and telling me I\u2019m doing great. Soon, they\u2019re on their way and I\u2019m left alone with my awesome instructor for the day, Christy.\n\nI\u2019m not sure why I was so afraid of being the worst, as it\u2019s actually landed me a private lesson. I already feel more comfortable simply not having other eyes on me.\n\nI\u2019m still a bit shaky, and Christy tells me that I\u2019m doing everything right, technically, but that I need to relax. 
I\u2019m rigid and fearful, and that alone is what is affecting my performance.\n\nMy confidence in my abilities doesn\u2019t come quickly or easily; however, after mile two, my heart has stopped beating out of my ears and I\u2019m actually having fun.\n\nAnd I\u2019m downhill biking! I\u2019m really doing it.\n\nFinding My Confidence\n\nCertain sections are harder than others, and we go over numerous segments with wide sharp turns, steep declines, huff-inducing inclines and huge rocks that just that morning I never would have believed I could do.\n\nAt one point we reach a particularly precious looking turn that I just can\u2019t bring myself to do \u2014 literally stopping short at the top each time I try \u2014 and end up walking my bike around it, instantly feeling angry at myself for giving up.\n\n\u201cThere\u2019s another one coming up that\u2019s slightly different than that one, so you\u2019ll have a chance to try again,\u201d says Christy.\n\nWhen we get to the second turn, I give myself a head start with pedaling, take a deep breath, utter a \u201cyou can do this\u201d mantra and \u2026 go.\n\nI easily make it around the scary curve unscathed and with a smile.\n\nChristy smirks, \u201cRemember how I said that turn was slightly different? It was steeper and longer. I just didn\u2019t want to freak you out. You did it!\u201d\n\nAfter that my confidence shoots through the roof. I feel invincible, simply because I successfully tackled something I wasn\u2019t quite sure I could master.\n\nThe last leg of the trail introduces me to rollers, essentially a pattern of up and downhill slopes. For this, Christy teaches me how to \u201cpump\u201d the rollers, which means moving your body up and down in tune with the slopes, providing speed and smoothness for the ride. 
While I wouldn\u2019t say I mastered them, I feel like I got the hang of them to the point where I was proud.\n\nThe trail ends with level mountain trails, until the very end where there\u2019s a long downward slope where I think I might bite the dust. But, I don\u2019t. And I finish in one piece.\n\nI could have easily spent the day at Winter Park Resort riding the alpine slide or doing the maze \u2014 which I do, and are awesome \u2014 but I also wanted an experience that would take me out of my comfort zone and help me grow as a person.\n\nDownhill biking at Trestle Bike Park provided exactly that in an active and fun way, and introduced me to something I wasn\u2019t great at when I started, but feel I could somewhat master with more practice.\n\nLooks like I have a new activity to incorporate into my upcoming travels.\n\nHave you ever tried downhill mountain biking in Colorado?\n\nMy trip was sponsored by Visit Colorado. I was not required to write this post nor was I compensated for it. As always, all opinions are my own, and all writings are based on my personal experiences in the destination.\n\nLogistics: Getting There & Around: Travelers can fly into Denver International Airport. From here, it\u2019s about a two-hour drive to Winter Park Resort. If you just plan on spending time at the resort and exploring their mountain sports and onsite rides and restaurants, there\u2019s no need for a car, especially in winter when their free shuttle takes you around town, as well; however, summer travelers wanting to explore beyond the resort or those wanting to do a longer trip should rent a car. I recommend Home James Transportation for a shuttle service, and Avis for car rentals. For the trip between Denver International Airport and Winter Park Resort the cost was $69. Health: Because Colorado is a high altitude state, make sure to hydrate before and during your trip. Eight glasses is not enough, and altitude sickness can creep up on you without warning. 
Travel Insurance: I recommend taking out a plan with Allianz Global Assistance. Language: English Currency: US Dollars Fun Fact: Charles Lindbergh and his wife, author Anne Morrow, spent time in Grand County with friend, Harry Knight, whose place is now covered by Granby Reservoir.\n\n\n\nBonus Colorado Adventure Resources:\n\nEpic Adventures In Colorado\u2019s Grand County\n\n5 Ways To Experience Aspen Beyond Skiing\n\nConquering My Fear Of Heights At Portland Creek Canyon, Colorado\n\nThese Photos Will Make You Want To Hike Vail Mountain Right Now"], ["(Reuters) - Haitian leaders pleaded for calm on Saturday as violent protests over fuel price increases entered a second day and U.S. airlines canceled flights to the Caribbean nation.\n\nPrime minister Jack Guy Lafontant listends to President Jovenel Moise's speech during Lafontant's presentation in the National Palace of Port-au-Prince, Haiti, February 24, 2017. REUTERS/Andres Martinez Casares\n\nPrime Minister Jack Guy Lafontant announced the temporary suspension of double-digit government hikes to prices for gasoline, diesel and kerosene on Saturday afternoon - just a day after they were announced.\n\nBut as local television footage showed, the government\u2019s decision to back down did not keep angry residents from taking to the streets. Some demonstrators erected flaming roadblocks, while others attacked hotels and businesses.\n\n\u201cThe poor people want to be able to eat,\u201d one masked protester told Reuters TV as a car blazed behind him. \u201cI want to tell (President) Jovenel (Mo\u00efse) that Haiti is not for him and his family. Haiti is for every Haitian. He needs to leave the country and leave the country to us so we can live.\u201d\n\nIn a statement, Lafontant said the government strongly condemns the acts of violence and vandalism.\n\nU.S. 
carriers American Airlines AAL.O, JetBlue JBLU.O and Spirit Airlines SAVE.N announced flight cancellations to the capital, Port-au-Prince, on Saturday, citing civil unrest.\n\n\u201cDue to concerns over safety from unrest in the area, Spirit Airlines felt it necessary to temporarily suspend service to Port-au-Prince, Haiti Saturday,\u201d the airline said in a statement. \u201cWe apologize for the inconvenience this has caused, but the safety of our guests and crew is paramount.\u201d\n\nA spokesman for the airline said it was not yet clear when flights would resume.\n\nThe U.S. Embassy in Haiti advised personnel and Americans in the country to shelter in place.\n\nThe U.S. State Department said separately that it was aware of vandalism at a Best Western hotel, where media reports said Americans were staying, and at an American Airlines office in downtown Port-au-Prince.\n\n\u201cAt this time, we have not received any reports of U.S. citizens injured in the incidents,\u201d the State Department said in a statement.\n\nOn Friday, Haiti\u2019s Commerce and Economic ministries announced that fuel price increases, including a 38 percent jump for gasoline and 47 percent for diesel, would take effect at midnight.\n\nThe now-suspended decision by Mo\u00efse\u2019s government to raise prices was part of an agreement with the International Monetary Fund, which requires the country to enact a range of austerity measures."], ["[Haskell-cafe] Interest in a Mathematics & AI strike force ?\n\nHello -cafe, When I started learning Haskell, I saw the AI page [1], which aimed at creating a sound, uniform and handy framework for AI programming in Haskell. I added my name to it and thought a bit about it. I even wrote a first version of HNN [2], a neural network library, quite early in my Haskell days. I found the idea great but did not see any actual effort around it. So, I'm now thinking again about that and even enlarging it to mathematics & AI. 
Thus, I would like to have an idea of the number of people interested in being involved in such an effort. There are several tools out there on Hackage, but they aren't very uniform, nor do they play nicely together. I'm pretty convinced this could be improved, and as a mathematics student I'm highly interested in that. If enough people are interested, we could, for example, set up a mailing list and a Trac to organize the effort, and then people could just discuss and write Haskell modules when time permits. Any comments, ideas, reactions, interest? [1] http://www.haskell.org/haskellwiki/AI [2] http://www.haskell.org/haskellwiki/HNN -- Alp Mestanogullari http://alpmestan.wordpress.com/ http://alp.developpez.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.haskell.org/pipermail/haskell-cafe/attachments/20100503/fdf28735/attachment.html
In 1994, she prepared for The News of the World\u2019s interview with James Hewitt, a paramour of Princess Diana, by reserving a hotel suite and hiring a team to \u201ckit it out with secret tape devices in various flowerpots and cupboards,\u201d Piers Morgan, her former boss and now a CNN talk show host, writes in his memoir \u201cThe Insider.\u201d\n\nOn another occasion in her early days, furious that the paper was about to be scooped by The Sunday Times\u2019s serialization of a biography of Prince Charles, Ms. Brooks disguised herself as a Times cleaning woman and hid for two hours in a bathroom, according to Mr. Morgan. When the presses started rolling, she ran over, grabbed a newly printed copy of The Sunday Times, and brought it back to The News of the World \u2014 which proceeded to use the material, verbatim, in its own paper the next day.\n\nSuch tales have passed into tabloid legend, as has Ms. Brooks\u2019s uncanny knack for cultivating the powerful. She was a confidante of Cherie Blair, the wife of Prime Minister Tony Blair, at a time when Mr. Murdoch supported the Labour government. She broke the story of Mrs. Blair\u2019s pregnancy during her husband\u2019s premiership. When the political winds changed and Mr. Murdoch decided to back the Conservative Party, many of Ms. Brooks\u2019s Labour friends felt she had betrayed them and stopped socializing with her. Ms. Brooks simply switched friends.\n\nNow, she and her husband, Charlie Brooks, a former horse trainer, are part of a high-powered coterie that includes Prime Minister David Cameron and his wife, Samantha; Mr. Murdoch\u2019s daughter Elisabeth and her husband, the public relations executive Matthew Freud; and James Murdoch, Mr. 
Murdoch\u2019s heir apparent, and his wife."], ["| By\n\nOff the microphone of RE\n\nFollow us on Twitter @doomstead666\n\nFriend us on Facebook\n\nAired on the Doomstead Diner on July 22, 2015\n\nVisit the New Diner News Page for Daily Updates from around the Collapse Blogosphere\n\nGet the End of More on Amazon.com\n\nDiscuss this article at the Podcast Table inside the Diner\n\nRecently, we had the opportunity to talk with Norman Pagett, one of the Authors of the End of More, an excellent Primer for people new to the world of Industrial Civilization Collapse and Population Overshoot. Norman resides in Shropshire, England, right at the heart of where the Industrial Revolution began in the early 1700s with the invention of the Steam Engine, and its early application in pumping the water out of Coal Mines.\n\nIn this first part of our discussions with Norman, we go over the early history of the Industrial Revolution and its expansion in the early years.\n\nMuch more to come in future episodes. We have a few hours of collapse chat still to wade through and edit here. Meanwhile, enjoy our Collapse analysis of the day here on the Doomstead Diner.\n\nRE\n\nSnippet:\n\nRE: \u2026I don't know how much you followed any of the old Dickens stories about the dirty state of London back in the early nineteenth century as a result of coal burning?\n\nNorman: Yes, I do. In fact, two things expanded London and other cities as well: the go-to transport, that's rail transport, and the output of sewage. Because if you've got a city with a million people in it you've got an awful lot of sewage and you've got to get rid of it, and the only way you could get rid of it was building a sewage system, which could only be built with bricks, and the heat needed in vast quantities could only come from coal. 
So the coal firms were about sixty or seventy miles from London, where the bricks were fired, and they had to be transported into the city by train; from those they used the six million bricks to build the London sewage system, which carried the sewage out from the centre of London right to the estuary on the North Sea, where the big engines pumped it into the sea and it was just discharged and got rid of. Now again, you're talking not just about pumping the water out of the coal mines; you're talking about pumping fresh water into the city in some way and then taking sewage and pumping it out of the city. So those two processes enabled cities to start growing to much larger sizes than they ever had before, and that was the process that allowed the system to take off\u2026\n\nFor the rest, LISTEN TO THE INTERVIEW!!!\n\n"]]