USA vs. Canada women's hockey: Best images from an Olympic classic


Team USA ends Canada's gold-medal streak with a 3-2 shootout victory in Pyeongchang. Relive the thrilling matchup through these images selected by SN.


Four years after losing the gold-medal game in heartbreaking fashion to its archrival, the U.S. Olympic women's hockey team earned redemption with an exhilarating 3-2 shootout victory over Canada in the 2018 gold-medal game in Pyeongchang, South Korea.

The main heroes were Jocelyne Lamoureux-Davidson, who scored the game-winner in the shootout; her twin sister, Monique Lamoureux-Morando, whose goal tied the game in the third period; Amanda Kessel, whose shootout goal kept the game going; and Maddie Rooney, the 20-year-old goalie who held Canada at bay after giving up the lead in the second period.

MORE: U.S. winning goal has a familiar name

The victory secured the U.S. women's first Olympic gold medal since 1998 and ended Canada's run of consecutive Olympic golds at four.

Sporting News put together a collection of the top 13 images from the latest showdown between the two best teams in international womens hockey.

Hilary Knight had been seeking redemption ever since her penalty led to Canada's gold medal-winning overtime goal in the 2014 gold-medal game. She earned some in the first period of the 2018 championship match: Her tip-in goal on a U.S. power play opened the scoring.

Canadian captain Marie-Philip Poulin put her squad ahead 2-1 with a goal in the second period. It was Poulin's fifth goal in Olympic gold-medal games, and it looked for a long time as though it might be the decider in this one.

Poulin then became the center of controversy with this contact to the head of Team USA's Brianna Decker, denying Decker a scoring chance in the third period. Poulin was not penalized (U.S. coach Robb Stauber screamed to officials that Poulin should have been given a five-minute major). Decker was able to stay in the game.

Team USA pressed for the tying goal for most of the third period; it finally succeeded at 13:39 of the frame. Monique Lamoureux-Morando took advantage of a bad line change by the Canadians and beat goalie Shannon Szabados on a breakaway. The score remained 2-2 until the shootout.

Melodie Daoust was set to be the hero after she gave Canada a 2-1 lead in Round 4 of the shootout. Daoust beat Rooney with a Forsberg, a move made famous by Hockey Hall of Famer Peter Forsberg.

Amanda Kessel got that goal right back for Team USA. The sister of Penguins star Phil Kessel beat Szabados to the glove side in Round 4 to tie the shootout 2-2.

"Oops, I did it again" is the name of the move by Jocelyne Lamoureux-Davidson that produced the go-ahead goal in Round 6 of the shootout. Forehand deke, backhand deke, forehand, score. Szabados had no chance.

Rooney made Lamoureux-Davidson's goal stand up when she denied Meghan Agosta in Round 6. The 20-year-old Rooney made 29 saves in regulation and OT, and kept four more pucks out of the net in the shootout.

Rooney's teammates rushed onto the ice after she made the game-ending save. The Americans had ended their rivals' Olympic dominance and ended a 20-year gold drought, to boot.

The champs displayed Old Glory after earning the gold. Team USA's love of country created a controversy prior to the start of the tournament: The International Olympic Committee debated whether to order the removal of artwork depicting the Statue of Liberty from the masks of goalies Nicole Hensley and Alex Rigsby. The IOC eventually allowed the artwork to remain on the masks.

Kendall Coyne's fiancé, Chargers offensive lineman Michael Schofield, gives Coyne the touchdown-celebration treatment after Team USA's victory.

Angela Ruggiero, who played on the first U.S. women's gold-medal team in '98, puts a 2018 gold medal around the neck of Team USA hero Monique Lamoureux-Morando. Ruggiero is now a member of the IOC.

Members of the championship squad sing The Star-Spangled Banner as the American flag is raised above the ice during the medal ceremony.

KRd vs Rd

KYPROLIS® (carfilzomib) is indicated in combination with dexamethasone or with lenalidomide plus dexamethasone for the treatment of patients with relapsed or refractory multiple myeloma who have received one to three lines of therapy.

KYPROLIS® is indicated as a single agent for the treatment of patients with relapsed or refractory multiple myeloma who have received one or more lines of therapy.

Kd = KYPROLIS® (carfilzomib) and dexamethasone; KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; Rd = lenalidomide and dexamethasone; Vd = VELCADE® (bortezomib) and dexamethasone.


Kd = KYPROLIS® (carfilzomib) and dexamethasone; KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone.

PLEASE SEE THE IMPORTANT SAFETY INFORMATION IN THE SECTION BELOW.

In clinical studies, nearly all patients (90%) taking Vectibix® experienced skin rash or other skin reactions. Severe or life-threatening skin reactions have been reported.

Skin reactions included (but were not limited to):

Some of these patients had severe skin reactions, which involved pain, disfigurement, ulceration, or loss of outer layers of skin. Some patients who developed severe skin reactions also developed infections in the blood, skin, fat, or tissue that sometimes resulted in death.

Your doctor may decrease your dose, delay your next dose, or stop Vectibix® treatment altogether to manage your side effects. It is important that you tell your doctor right away if you have any skin reactions or any signs of infection (such as chills, fever, or increased redness or swelling of an existing skin reaction).

Patients with RAS-mutant mCRC should not take Vectibix®. Investigations of the clinical studies that used Vectibix® showed that Vectibix® exposed patients with RAS-mutant mCRC to serious side effects without working to treat the cancer.

Some patients who were taking Vectibix® developed low levels of certain electrolytes, including:

Your doctor may check the levels of these electrolytes in your blood while you are on treatment and for 2 months after you finish treatment. Your doctor may add other oral or intravenous medications to your Vectibix® treatment.

Vectibix® is given by infusion into a vein. Some patients may develop an infusion reaction, which can be severe and in rare cases has resulted in death. Infusion reactions developed in 4% of patients in one clinical trial, and 1% of patients experienced serious infusion reactions. Infusion reactions included:

Depending on how severe the reaction is, your doctor may decide to slow the rate of the infusion, stop the infusion, or stop your Vectibix® treatment completely.

Tell your doctor right away if you experience severe diarrhea or dehydration. Some patients treated with Vectibix® and chemotherapy developed kidney failure or other complications because of severe diarrhea and dehydration.

Lung disease, including fatal lung disease, occurred in 1% or less of patients who had taken Vectibix®. Tell your doctor if you have problems breathing, wheezing, or a cough that doesn't go away or keeps coming back. If you have been told that you have had lung problems in the past, be sure to tell your doctor. Your doctor may decide to stop Vectibix® treatment.

Being in the sun may make skin reactions worse. Wear sunscreen and protective clothing (like a hat), and avoid direct sunlight while you are on treatment with Vectibix®. Tell your doctor if you have new or worsening skin reactions.

Inflammation of the eye and injury to the cornea have been reported. Tell your doctor if you have any vision changes or eye problems.

Patients treated with Avastin® (bevacizumab) and Vectibix® together did not live as long and had more serious side effects, such as acne-like rash, diarrhea, dehydration, painful ulcers and mouth sores, and low levels of potassium and magnesium in the blood. Some patients developed blood clots that can travel to the lungs, which can be very serious or even fatal. Do not take Avastin® with Vectibix®.

Use effective birth control to avoid pregnancy while taking Vectibix® and for 6 months after the last dose. It is possible for a pregnant patient to transfer Vectibix® to an unborn child, which could be harmful to the unborn child.

Vectibix® could also be transferred to a child through breast milk. Your doctor may tell you that you should not nurse your baby during Vectibix® therapy and for 2 months after your last dose of Vectibix®.

Women who become pregnant during Vectibix® treatment are encouraged to enroll in Amgen's Pregnancy Surveillance Program. Women who are nursing during Vectibix® treatment are encouraged to enroll in Amgen's Lactation Surveillance Program. Call 1 (800) 772-6436 to enroll.

In clinical studies using Vectibix® alone, the most common side effects were severe skin reactions, nail infections, lack of energy, nausea, and diarrhea. The most common serious side effects were general declining health and blockage of the bowel.

In clinical studies using Vectibix® with FOLFOX, the most commonly reported side effects for wild-type KRAS patients were diarrhea, painful mouth swelling, swelling/redness of the inner lining of the mouth, lack of energy, nail infection, lack of hunger, unusual magnesium and potassium levels in the blood, rash, acne-like rash, severe itching, and dry skin. The most serious side effects reported in Vectibix®-treated wild-type KRAS patients were diarrhea and dehydration.

Tell your doctor right away if you have any side effects such as worsening skin problems, eye problems, fever, chills, breathing problems (such as a cough that doesn't go away or keeps coming back, wheezing, or shortness of breath), if you develop diarrhea or become dehydrated, or if you become pregnant.

Do not change or stop any medications you may be taking (including over-the-counter drugs or supplements you can buy without a prescription) without first speaking with your doctor.

Please read the full Prescribing Information and discuss it with your doctor.

Superior median progression-free survival: 26.3 months for KRd vs 17.6 months with Rd

Adding KYPROLIS® to Rd significantly increased median progression-free survival by 8.7 months

Primary endpoint: progression-free survival

Carfilzomib (KYPROLIS®) in combination with lenalidomide and dexamethasone (KRd) has a category 1 designation in the NCCN Clinical Practice Guidelines in Oncology (NCCN Guidelines®) for Multiple Myeloma (Version 4.2018) for previously treated multiple myeloma.1,2

CI = confidence interval; HR = hazard ratio; KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; NCCN = National Comprehensive Cancer Network; PFS = progression-free survival; Rd = lenalidomide and dexamethasone.

Post hoc analysis: demonstration of progression-free survival at 18 months was not a study objective.

KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; PFS = progression-free survival; Rd = lenalidomide and dexamethasone.

Post hoc analysis: demonstration of progression-free survival efficacy within these subgroups was not a study objective. The study was not powered to evaluate progression-free survival efficacy within each of these subgroups.

KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; PFS = progression-free survival; Rd = lenalidomide and dexamethasone.

KRd demonstrated a 9.1-month improvement in median progression-free survival over Rd alone in patients who had received 2 or 3 previous lines of therapy

Post hoc analysis: demonstration of progression-free survival within these subgroups was not a study objective. The study was not powered to evaluate progression-free survival efficacy within each of these subgroups.

KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; PFS = progression-free survival; Rd = lenalidomide and dexamethasone.

KRd significantly increased median overall survival by 7.9 months vs Rd alone (48.3 months with KRd vs 40.4 months with Rd alone)

KYPROLIS® and Rd reduced the risk of death by 21% compared to Rd alone

Secondary endpoint: overall survival

HR (KRd/Rd) = 0.79 (95% CI: 0.67-0.95);

CI = confidence interval; HR = hazard ratio; KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; OS = overall survival; Rd = lenalidomide and dexamethasone.

Overall survival (OS) was a prespecified key secondary efficacy endpoint. The significance level of the preplanned OS second interim analysis is determined by the O'Brien-Fleming type alpha spending function based on the number of OS events observed by the analysis time. Per protocol, patients received 18 cycles of KYPROLIS® with Rd and then continued treatment with Rd alone to progression.5,6

The KRd vs Rd OS results have not yet been reviewed by FDA, and inclusion in the final, FDA-approved label for KYPROLIS®has yet to be determined.

Adding K to Rd tripled the patient's chance of achieving a complete response or better (KRd vs Rd)

Secondary endpoint: responses by category

Stringent complete response: CR as defined below plus normal free light chain (FLC) ratio* and absence of clonal cells in bone marrow biopsy by immunohistochemistry

Complete response (CR): Negative immunofixation on the serum and urine and disappearance of any soft tissue plasmacytomas and <5% plasma cells in bone marrow aspirates

Very good partial response (VGPR): Serum and urine M-protein detectable by immunofixation but not on electrophoresis or ≥90% reduction in serum M-protein plus urine M-protein level <100 mg per 24 hours

Partial response (PR): ≥50% reduction of serum M-protein and reduction in 24-hour urinary M-protein by ≥90% or to <200 mg per 24 hours

If the serum and urine M-protein are unmeasurable, a ≥50% decrease in the difference between involved and uninvolved FLC levels is required in place of the M-protein criteria

If serum and urine M-protein are unmeasurable, and the serum free light chain assay is also unmeasurable, a ≥50% reduction in plasma cells is required in place of M-protein, provided the baseline bone marrow plasma cell percentage was ≥30%. In addition to these criteria, if present at baseline, a ≥50% reduction in the size (SPD) of soft tissue plasmacytomas is also required

Adapted by permission from Elsevier Limited: Lancet Oncol. 2016;17:e328-e346. © 2016

*All recommendations regarding clinical uses relating to serum FLC levels or FLC ratio are based on results obtained with the validated Freelite test (Binding Site, Birmingham, UK).

Presence/absence of clonal cells on immunohistochemistry is based upon the kappa/lambda ratio. An abnormal kappa/lambda ratio by immunohistochemistry requires a minimum of 100 plasma cells for analysis. An abnormal ratio reflecting presence of an abnormal clone is kappa/lambda of >4:1 or <1:2.

Plasmacytoma measurements should be taken from the CT portion of the PET/CT, or MRI scans, or dedicated CT scans where applicable. For patients with only skin involvement, skin lesions should be measured with a ruler. Measurement of tumor size will be determined by the SPD.

CR = complete response or better; CT = computed tomography; IMWG = International Myeloma Working Group; KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; MRI = magnetic resonance imaging; ORR = overall response rate (CR or better + VGPR + PR); PET = positron-emission tomography; PR = partial response; Rd = lenalidomide and dexamethasone; SPD = sum of the products of the maximal perpendicular diameters of measured lesions; VGPR = very good partial response or better.

Median time from treatment start to CR was 6.7 months for KRd patients and 8.3 months for Rd patients

Post hoc analysis: demonstration of CR over time was not a study objective.

CR = complete response; KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; Rd = lenalidomide and dexamethasone.

complete response rates at first relapse (KRd vs Rd study)

In patients who had received 2 or 3 previous lines of therapy, rates of CR or better were 30.2% for KRd vs 10.9% for Rd

Post hoc analysis: demonstration of CR over time was not a study objective.

CR = complete response; KRd = KYPROLIS® (carfilzomib), lenalidomide, and dexamethasone; Rd = lenalidomide and dexamethasone.

KRd vs Rd study: A phase 3, randomized, open-label, multicenter superiority study evaluated KYPROLIS® in combination with lenalidomide and dexamethasone (KRd) vs lenalidomide and dexamethasone (Rd) in patients with relapsed or refractory multiple myeloma who had received 1 to 3 prior lines of therapy. 792 patients were randomized in a 1:1 ratio (396 patients to KRd, 396 to Rd). Patients received their randomized study treatment in 28-day cycles until disease progression or unacceptable toxicity occurred. KYPROLIS® was discontinued after Cycle 18. The primary endpoint was progression-free survival (PFS); select secondary endpoints included overall survival (OS), overall response rate (ORR), duration of response (DoR), and safety. The significance level of the preplanned OS second interim analysis is determined by the O'Brien-Fleming type alpha spending function based on the number of OS events observed by the analysis time.1,5,6

1. KYPROLIS® (carfilzomib) prescribing information, Onyx Pharmaceuticals Inc., an Amgen Inc. subsidiary.
2. Referenced with permission from the NCCN Clinical Practice Guidelines in Oncology (NCCN Guidelines®) for Multiple Myeloma V.4.2018. © National Comprehensive Cancer Network, Inc. 2018. All rights reserved. Accessed February 27, 2018. To view the most recent and complete version of the guideline, go online to . NATIONAL COMPREHENSIVE CANCER NETWORK®, NCCN GUIDELINES®, and all other NCCN Content are trademarks owned by the National Comprehensive Cancer Network, Inc.
3. Data on file, Amgen; 2014.
4. Dimopoulos MA, Stewart AK, Masszi T, et al. Carfilzomib-lenalidomide-dexamethasone vs lenalidomide-dexamethasone in relapsed multiple myeloma by previous treatment. Blood Cancer J. 2017;7:e554.
5. Siegel DS, Dimopoulos MA, Ludwig H, et al. Improvement in overall survival with carfilzomib, lenalidomide, and dexamethasone in patients with relapsed or refractory multiple myeloma. J Clin Oncol. 2018;36:728-734.
6. Stewart AK, Rajkumar SV, Dimopoulos MA, et al. Carfilzomib, lenalidomide, and dexamethasone for relapsed multiple myeloma. N Engl J Med. 2015;372:142-152.
7. Kumar S, Paiva B, Anderson KC, et al. International Myeloma Working Group consensus criteria for response and minimal residual disease assessment in multiple myeloma. Lancet Oncol. 2016;17:e328-e346.
8. Dimopoulos M, Wang M, Maisnar V, et al. Response and progression-free survival according to planned treatment duration in patients with relapsed multiple myeloma treated with carfilzomib, lenalidomide, and dexamethasone (KRd) versus lenalidomide and dexamethasone (Rd) in the phase III ASPIRE study. J Hematol Oncol. 2018;11:49.

NCCN makes no warranties of any kind whatsoever regarding their content, use, or application and disclaims any responsibility for their applications or use in any way.

New onset or worsening of pre-existing cardiac failure (e.g., congestive heart failure, pulmonary edema, decreased ejection fraction), restrictive cardiomyopathy, myocardial ischemia, and myocardial infarction including fatalities have occurred following administration of KYPROLIS. Some events occurred in patients with normal baseline ventricular function. Death due to cardiac arrest has occurred within one day of KYPROLIS administration.

Monitor patients for clinical signs or symptoms of cardiac failure or cardiac ischemia. Evaluate promptly if cardiac toxicity is suspected. Withhold KYPROLIS for Grade 3 or 4 cardiac adverse events until recovery, and consider whether to restart KYPROLIS at 1 dose level reduction based on a benefit/risk assessment.

While adequate hydration is required prior to each dose in Cycle 1, monitor all patients for evidence of volume overload, especially patients at risk for cardiac failure. Adjust total fluid intake as clinically appropriate in patients with baseline cardiac failure or who are at risk for cardiac failure.

In patients ≥75 years of age, the risk of cardiac failure is increased. Patients with New York Heart Association Class III and IV heart failure, recent myocardial infarction, conduction abnormalities, angina, or arrhythmias may be at greater risk for cardiac complications and should have a comprehensive medical assessment (including blood pressure control and fluid management) prior to starting treatment with KYPROLIS and remain under close follow-up.

Cases of acute renal failure, including some fatal renal failure events, and renal insufficiency adverse events (including renal failure) have occurred in patients receiving KYPROLIS. Acute renal failure was reported more frequently in patients with advanced relapsed and refractory multiple myeloma who received KYPROLIS monotherapy. Monitor renal function with regular measurement of the serum creatinine and/or estimated creatinine clearance. Reduce or withhold dose as appropriate.

Cases of Tumor Lysis Syndrome (TLS), including fatal outcomes, have occurred in patients receiving KYPROLIS. Patients with multiple myeloma and a high tumor burden should be considered at greater risk for TLS. Adequate hydration is required prior to each dose in Cycle 1, and in subsequent cycles as needed. Consider uric acid lowering drugs in patients at risk for TLS. Monitor for evidence of TLS during treatment and manage promptly. Withhold KYPROLIS until TLS is resolved.

Acute Respiratory Distress Syndrome (ARDS), acute respiratory failure, and acute diffuse infiltrative pulmonary disease such as pneumonitis and interstitial lung disease have occurred in patients receiving KYPROLIS. Some events have been fatal. In the event of drug-induced pulmonary toxicity, discontinue KYPROLIS.

Pulmonary arterial hypertension (PAH) was reported in patients treated with KYPROLIS. Evaluate with cardiac imaging and/or other tests as indicated. Withhold KYPROLIS for PAH until resolved or returned to baseline and consider whether to restart KYPROLIS based on a benefit/risk assessment.

Dyspnea was reported in patients treated with KYPROLIS. Evaluate dyspnea to exclude cardiopulmonary conditions including cardiac failure and pulmonary syndromes. Stop KYPROLIS for Grade 3 or 4 dyspnea until resolved or returned to baseline. Consider whether to restart KYPROLIS based on a benefit/risk assessment.

Hypertension, including hypertensive crisis and hypertensive emergency, has been observed with KYPROLIS. Some of these events have been fatal. It is recommended to control hypertension prior to starting KYPROLIS. Monitor blood pressure regularly in all patients. If hypertension cannot be adequately controlled, withhold KYPROLIS and evaluate. Consider whether to restart KYPROLIS based on a benefit/risk assessment.

Venous thromboembolic events (including deep venous thrombosis and pulmonary embolism) have been observed with KYPROLIS. Thromboprophylaxis is recommended for patients being treated with the combination of KYPROLIS with dexamethasone or with lenalidomide plus dexamethasone. The thromboprophylaxis regimen should be based on an assessment of the patient's underlying risks.

Patients using oral contraceptives or a hormonal method of contraception associated with a risk of thrombosis should consider an alternative method of effective contraception during treatment with KYPROLIS in combination with dexamethasone or lenalidomide plus dexamethasone.

Infusion reactions, including life-threatening reactions, have occurred in patients receiving KYPROLIS. Symptoms include fever, chills, arthralgia, myalgia, facial flushing, facial edema, vomiting, weakness, shortness of breath, hypotension, syncope, chest tightness, or angina. These reactions can occur immediately following or up to 24 hours after administration of KYPROLIS. Premedicate with dexamethasone to reduce the incidence and severity of infusion reactions. Inform patients of the risk and of symptoms of an infusion reaction and to contact a physician immediately if they occur.

Fatal or serious cases of hemorrhage have been reported in patients receiving KYPROLIS. Hemorrhagic events have included gastrointestinal, pulmonary, and intracranial hemorrhage and epistaxis. Promptly evaluate signs and symptoms of blood loss. Reduce or withhold dose as appropriate.

KYPROLIS causes thrombocytopenia with recovery to baseline platelet count usually by the start of the next cycle. Thrombocytopenia was reported in patients receiving KYPROLIS. Monitor platelet counts frequently during treatment with KYPROLIS. Reduce or withhold dose as appropriate.

Hepatic Toxicity and Hepatic Failure

Cases of hepatic failure, including fatal cases, have been reported during treatment with KYPROLIS. KYPROLIS can cause increased serum transaminases. Monitor liver enzymes regularly regardless of baseline values. Reduce or withhold dose as appropriate.

Cases of thrombotic microangiopathy, including thrombotic thrombocytopenic purpura/hemolytic uremic syndrome (TTP/HUS), some with a fatal outcome, have occurred in patients receiving KYPROLIS. Monitor for signs and symptoms of TTP/HUS. Discontinue KYPROLIS if the diagnosis is suspected. If the diagnosis of TTP/HUS is excluded, KYPROLIS may be restarted. The safety of reinitiating KYPROLIS therapy in patients previously experiencing TTP/HUS is not known.

Posterior Reversible Encephalopathy Syndrome (PRES)

Cases of PRES have occurred in patients receiving KYPROLIS. PRES was formerly known as Reversible Posterior Leukoencephalopathy Syndrome. Consider a neuro-radiological imaging (MRI) for onset of visual or neurological symptoms. Discontinue KYPROLIS if PRES is suspected and evaluate. The safety of reinitiating KYPROLIS therapy in patients previously experiencing PRES is not known.

Increased Fatal and Serious Toxicities in Combination with Melphalan and Prednisone in Newly Diagnosed Transplant-ineligible Patients

In a clinical trial of transplant-ineligible patients with newly diagnosed multiple myeloma comparing KYPROLIS, melphalan, and prednisone (KMP) vs bortezomib, melphalan, and prednisone (VMP), a higher incidence of serious and fatal adverse events was observed in patients in the KMP arm. KYPROLIS in combination with melphalan and prednisone is not indicated for transplant-ineligible patients with newly diagnosed multiple myeloma.

KYPROLIS can cause fetal harm when administered to a pregnant woman based on its mechanism of action and findings in animals.

Females of reproductive potential should be advised to avoid becoming pregnant while being treated with KYPROLIS. Males of reproductive potential should be advised to avoid fathering a child while being treated with KYPROLIS. If this drug is used during pregnancy, or if pregnancy occurs while taking this drug, the patient should be apprised of the potential hazard to the fetus.

The most common adverse reactions occurring in at least 20% of patients treated with KYPROLIS in the combination therapy trials: anemia, neutropenia, diarrhea, dyspnea, fatigue, thrombocytopenia, pyrexia, insomnia, muscle spasm, cough, upper respiratory tract infection, hypokalemia.

The most common adverse reactions occurring in at least 20% of patients treated with KYPROLIS in monotherapy trials: anemia, fatigue, thrombocytopenia, nausea, pyrexia, dyspnea, diarrhea, headache, cough, edema peripheral.

Please see full Prescribing Information.


© 2018 Amgen Inc. All rights reserved. 04/18

This website is intended for US healthcare professionals only.


Prior probability

From Wikipedia, the free encyclopedia

Not to be confused with a priori probability.


In Bayesian statistical inference, a prior probability distribution, often simply called the prior, of an uncertain quantity is the probability distribution that would express one's beliefs about this quantity before some evidence is taken into account. For example, the prior could be the probability distribution representing the relative proportions of voters who will vote for a particular politician in a future election. The unknown quantity may be a parameter of the model or a latent variable rather than an observable variable.

Bayes' theorem calculates the renormalized pointwise product of the prior and the likelihood function, to produce the posterior probability distribution, which is the conditional distribution of the uncertain quantity given the data.
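As a minimal numerical sketch of this "renormalized pointwise product" on a discrete grid (the candidate voter proportions, prior weights, and poll counts below are invented for illustration):

```python
from math import comb

import numpy as np

# Hypothetical discrete prior over three candidate values of a voter
# proportion theta (cf. the election example above).
theta = np.array([0.3, 0.5, 0.7])
prior = np.array([0.25, 0.50, 0.25])      # prior beliefs; sums to 1

# Invented data: 6 supporters observed in a poll of 10 voters.
# Binomial likelihood of that data under each candidate theta.
likelihood = np.array([comb(10, 6) * t**6 * (1 - t)**4 for t in theta])

# Bayes' theorem: pointwise product of prior and likelihood, renormalized.
posterior = prior * likelihood
posterior /= posterior.sum()
```

The posterior shifts weight toward the values of theta most compatible with the poll; the normalization step is what turns the product back into a proper probability distribution.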

Similarly, the prior probability of a random event or an uncertain proposition is the unconditional probability that is assigned before any relevant evidence is taken into account.

Priors can be created using a number of methods.[1] (pp. 27–41) A prior can be determined from past information, such as previous experiments. A prior can be elicited from the purely subjective assessment of an experienced expert. An uninformative prior can be created to reflect a balance among outcomes when no information is available. Priors can also be chosen according to some principle, such as symmetry or maximizing entropy given constraints; examples are the Jeffreys prior or Bernardo's reference prior. When a family of conjugate priors exists, choosing a prior from that family simplifies calculation of the posterior distribution.

Parameters of prior distributions are a kind of hyperparameter. For example, if one uses a beta distribution to model the distribution of the parameter p of a Bernoulli distribution, then p is a parameter of the underlying system, while the α and β of the beta distribution are parameters of the prior distribution, i.e. hyperparameters.

Hyperparameters themselves may have hyperprior distributions expressing beliefs about their values. A Bayesian model with more than one level of prior like this is called a hierarchical Bayes model.
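The Beta–Bernoulli case above can be sketched in a few lines; with a conjugate Beta prior, updating on Bernoulli data just adds the observed counts to the hyperparameters (the numbers below are made up):

```python
# Beta(alpha, beta) prior on the Bernoulli parameter p; the posterior is
# again a Beta, with the hyperparameters incremented by observed counts.
alpha, beta = 2.0, 2.0                  # prior hyperparameters (invented)
data = [1, 0, 1, 1, 0, 1, 1]            # Bernoulli observations (invented)

successes = sum(data)                   # 5
failures = len(data) - successes        # 2

alpha_post = alpha + successes          # 7.0
beta_post = beta + failures             # 4.0

# Posterior mean of p under Beta(alpha_post, beta_post).
posterior_mean = alpha_post / (alpha_post + beta_post)
```

This "add the counts" update is exactly the convenience the conjugate-prior paragraph refers to: the posterior stays in the same family, so no integration is needed.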

An informative prior expresses specific, definite information about a variable. An example is a prior distribution for the temperature at noon tomorrow. A reasonable approach is to make the prior a normal distribution with expected value equal to today's noontime temperature, with variance equal to the day-to-day variance of atmospheric temperature, or a distribution of the temperature for that day of the year.

This example has a property in common with many priors, namely, that the posterior from one problem (today's temperature) becomes the prior for another problem (tomorrow's temperature). Pre-existing evidence which has already been taken into account is part of the prior and, as more evidence accumulates, the posterior is determined largely by the evidence rather than by any original assumption, provided that the original assumption admitted the possibility of what the evidence is suggesting. The terms "prior" and "posterior" are generally relative to a specific datum or observation.
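This posterior-becomes-prior chaining can be sketched with a conjugate normal model for the temperature example, assuming a known observation variance (all numbers are invented):

```python
def normal_update(prior_mean, prior_var, obs, obs_var):
    """Posterior for a normal mean after one normal observation
    (conjugate normal-normal update with known observation variance)."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Prior: yesterday's posterior for the noon temperature, in degrees (invented).
mean, var = 20.0, 25.0

# Each day's reading updates the belief; the posterior becomes the next prior.
for obs in [22.0, 21.0, 23.0]:
    mean, var = normal_update(mean, var, obs, obs_var=4.0)
```

With each observation the posterior variance shrinks, so the final belief is dominated by the data rather than by the original prior, just as the paragraph above describes.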

A weakly informative prior expresses partial information about a variable. An example is, when setting the prior distribution for the temperature at noon tomorrow in St. Louis, to use a normal distribution with mean 50 degrees Fahrenheit and standard deviation 40 degrees, which very loosely constrains the temperature to the range (10 degrees, 90 degrees) with a small chance of being below -30 degrees or above 130 degrees. The purpose of a weakly informative prior is regularization, that is, to keep inferences in a reasonable range.

An uninformative prior or diffuse prior expresses vague or general information about a variable. The term "uninformative prior" is somewhat of a misnomer. Such a prior might also be called a not very informative prior, or an objective prior, i.e. one that's not subjectively elicited.

Uninformative priors can express "objective" information such as "the variable is positive" or "the variable is less than some limit". The simplest and oldest rule for determining a non-informative prior is the principle of indifference, which assigns equal probabilities to all possibilities. In parameter estimation problems, the use of an uninformative prior typically yields results which are not too different from conventional statistical analysis, as the likelihood function often yields more information than the uninformative prior.

Some attempts have been made at finding a priori probabilities, i.e. probability distributions in some sense logically required by the nature of one's state of uncertainty; these are a subject of philosophical controversy, with Bayesians being roughly divided into two schools: "objective Bayesians", who believe such priors exist in many useful situations, and "subjective Bayesians", who believe that in practice priors usually represent subjective judgements of opinion that cannot be rigorously justified (Williamson 2010). Perhaps the strongest arguments for objective Bayesianism were given by Edwin T. Jaynes, based mainly on the consequences of symmetries and on the principle of maximum entropy.

As an example of an a priori prior, due to Jaynes (2003), consider a situation in which one knows a ball has been hidden under one of three cups, A, B or C, but no other information is available about its location. In this case a uniform prior of p(A) = p(B) = p(C) = 1/3 seems intuitively like the only reasonable choice. More formally, we can see that the problem remains the same if we swap around the labels (A, B and C) of the cups. It would therefore be odd to choose a prior for which a permutation of the labels would cause a change in our predictions about which cup the ball will be found under; the uniform prior is the only one which preserves this invariance. If one accepts this invariance principle then one can see that the uniform prior is the logically correct prior to represent this state of knowledge. This prior is "objective" in the sense of being the correct choice to represent a particular state of knowledge, but it is not objective in the sense of being an observer-independent feature of the world: in reality the ball exists under a particular cup, and it only makes sense to speak of probabilities in this situation if there is an observer with limited knowledge about the system.
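The permutation-invariance argument is easy to verify mechanically; a minimal sketch:

```python
from itertools import permutations

# Uniform prior over the three cups: per the invariance argument, the
# only assignment unchanged by relabeling A, B and C.
prior = {"A": 1/3, "B": 1/3, "C": 1/3}

# Every relabeling of the cups yields exactly the same prior, so
# predictions cannot depend on which cup is called "A".
invariant = all(
    {new: prior[old] for old, new in zip("ABC", perm)} == prior
    for perm in permutations("ABC")
)
print(invariant)  # True
```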

Priors can be constructed which are proportional to the Haar measure if the parameter space X carries a natural group structure which leaves invariant our Bayesian state of knowledge (Jaynes, 1968). This can be seen as a generalisation of the invariance principle used to justify the uniform prior over the three cups in the example above. For example, in physics we might expect that an experiment will give the same results regardless of our choice of the origin of a coordinate system. This induces the group structure of the translation group on X, which determines the prior probability as a constant improper prior. Similarly, some measurements are naturally invariant to the choice of an arbitrary scale (e.g., whether centimeters or inches are used, the physical results should be equal). In such a case, the scale group is the natural group structure, and the corresponding prior on X is proportional to 1/x. It sometimes matters whether we use the left-invariant or right-invariant Haar measure. For example, the left and right invariant Haar measures on the affine group are not equal. Berger (1985, p. 413) argues that the right-invariant Haar measure is the correct choice.
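The scale-invariant prior proportional to 1/x assigns to an interval a mass that depends only on the ratio of its endpoints, so changing units (say inches to centimeters) changes nothing. A quick numerical check:

```python
import math

def scale_prior_mass(a, b):
    """Mass the (improper) 1/x prior assigns to [a, b]:
    the integral of dx/x, i.e. log(b/a)."""
    return math.log(b / a)

# Rescaling by any positive constant c (2.54 converts inches to cm)
# maps [a, b] to [c*a, c*b] and leaves the prior mass unchanged.
a, b, c = 1.0, 5.0, 2.54
print(abs(scale_prior_mass(a, b) - scale_prior_mass(c * a, c * b)) < 1e-12)
```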

Another idea, championed by Edwin T. Jaynes, is to use the principle of maximum entropy (MAXENT). The motivation is that the Shannon entropy of a probability distribution measures the amount of information contained in the distribution. The larger the entropy, the less information is provided by the distribution. Thus, by maximizing the entropy over a suitable set of probability distributions on X, one finds the distribution that is least informative in the sense that it contains the least amount of information consistent with the constraints that define the set. For example, the maximum entropy prior on a discrete space, given only that the probability is normalized to 1, is the prior that assigns equal probability to each state. And in the continuous case, the maximum entropy prior given that the density is normalized with mean zero and variance unity is the standard normal distribution. The principle of minimum cross-entropy generalizes MAXENT to the case of updating an arbitrary prior distribution with suitable constraints in the maximum-entropy sense.
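The discrete claim, that with only a normalization constraint the uniform distribution maximizes entropy, can be spot-checked directly:

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# The uniform distribution on four states versus some alternatives
# satisfying the same normalization constraint.
uniform = [0.25, 0.25, 0.25, 0.25]
others = [[0.7, 0.1, 0.1, 0.1], [0.4, 0.3, 0.2, 0.1], [1.0, 0.0, 0.0, 0.0]]

# The uniform prior carries the least information (largest entropy).
print(all(entropy(uniform) > entropy(q) for q in others))  # True
```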

A related idea, reference priors, was introduced by José-Miguel Bernardo. Here, the idea is to maximize the expected Kullback–Leibler divergence of the posterior distribution relative to the prior. This maximizes the expected posterior information about X when the prior density is p(x); thus, in some sense, p(x) is the "least informative" prior about X. The reference prior is defined in the asymptotic limit, i.e., one considers the limit of the priors so obtained as the number of data points goes to infinity. In the present case, the KL divergence between the prior and posterior distributions reduces asymptotically to

−∫ p(x) log( p(x) / √I(x) ) dx,

where I(x) denotes the Fisher information of the likelihood function.

This is a quasi-KL divergence ("quasi" in the sense that the square root of the Fisher information may be the kernel of an improper distribution). Due to the minus sign, we need to minimise this in order to maximise the KL divergence with which we started. The minimum value of the last expression occurs where the two distributions in the logarithm argument, improper or not, do not diverge. This in turn occurs when the prior distribution is proportional to the square root of the Fisher information of the likelihood function. Hence in the single-parameter case, reference priors and Jeffreys priors are identical, even though Jeffreys has a very different rationale.

Reference priors are often the objective prior of choice in multivariate problems, since other rules (e.g., Jeffreys' rule) may result in priors with problematic behavior.

Objective prior distributions may also be derived from other principles, such as information or coding theory (see e.g. minimum description length) or frequentist statistics (see frequentist matching). Such methods are used in Solomonoff's theory of inductive inference.

Philosophical problems associated with uninformative priors are associated with the choice of an appropriate metric, or measurement scale. Suppose we want a prior for the running speed of a runner who is unknown to us. We could specify, say, a normal distribution as the prior for his speed, but alternatively we could specify a normal prior for the time he takes to complete 100 metres, which is proportional to the reciprocal of the first prior. These are very different priors, but it is not clear which is to be preferred. Jaynes' often-overlooked method of transformation groups can answer this question in some situations.

Similarly, if asked to estimate an unknown proportion between 0 and 1, we might say that all proportions are equally likely, and use a uniform prior. Alternatively, we might say that all orders of magnitude for the proportion are equally likely, the logarithmic prior, which is the uniform prior on the logarithm of the proportion. The Jeffreys prior attempts to solve this problem by computing a prior which expresses the same belief no matter which metric is used. The Jeffreys prior for an unknown proportion p is p^(−1/2)(1 − p)^(−1/2), which differs from Jaynes' recommendation.
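For a single Bernoulli trial the Fisher information is I(p) = 1/(p(1 − p)), and taking the square root reproduces the stated Jeffreys prior; a small numerical confirmation:

```python
import math

def fisher_info_bernoulli(p):
    """Fisher information of one Bernoulli(p) observation."""
    return 1.0 / (p * (1.0 - p))

def jeffreys(p):
    """Jeffreys prior: proportional to sqrt of the Fisher information."""
    return math.sqrt(fisher_info_bernoulli(p))

# Matches p^(-1/2) (1 - p)^(-1/2) from the text, up to rounding.
ok = all(
    abs(jeffreys(p) - p ** -0.5 * (1 - p) ** -0.5) < 1e-9
    for p in [0.1, 0.3, 0.5, 0.7, 0.9]
)
print(ok)  # True
```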

Priors based on notions of algorithmic probability are used in inductive inference as a basis for induction in very general settings.

Practical problems associated with uninformative priors include the requirement that the posterior distribution be proper. The usual uninformative priors on continuous, unbounded variables are improper. This need not be a problem if the posterior distribution is proper. Another issue of importance is that if an uninformative prior is to be used routinely, i.e., with many different data sets, it should have good frequentist properties. Normally a Bayesian would not be concerned with such issues, but it can be important in this situation. For example, one would want any decision rule based on the posterior distribution to be admissible under the adopted loss function. Unfortunately, admissibility is often difficult to check, although some results are known (e.g., Berger and Strawderman 1996). The issue is particularly acute with hierarchical Bayes models; the usual priors (e.g., Jeffreys' prior) may give badly inadmissible decision rules if employed at the higher levels of the hierarchy.

If Bayes' theorem is written as

P(Ai | B) = P(B | Ai) P(Ai) / Σj P(B | Aj) P(Aj),

then it is clear that the same result would be obtained if all the prior probabilities P(Ai) and P(Aj) were multiplied by a given constant; the same would be true for a continuous random variable. If the summation in the denominator converges, the posterior probabilities will still sum (or integrate) to 1 even if the prior values do not, and so the priors may only need to be specified in the correct proportion. Taking this idea further, in many cases the sum or integral of the prior values may not even need to be finite to get sensible answers for the posterior probabilities. When this is the case, the prior is called an improper prior. However, the posterior distribution need not be a proper distribution if the prior is improper. This is clear from the case where event B is independent of all of the Aj.
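The point that an improper prior can still give a proper posterior can be illustrated numerically: with a flat (improper) prior on a mean mu and a normal likelihood for one hypothetical observation, the unnormalized posterior integrates to a finite constant, so it can be normalized. A sketch, with made-up values:

```python
import math

x, sd = 2.0, 1.0   # one hypothetical observation, known sd

def unnorm_posterior(mu):
    """Posterior up to a constant: flat prior (= 1) times likelihood."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2)

# Crude Riemann sum over a wide range; the tails are negligible there.
lo, hi, n = -10.0, 14.0, 100000
h = (hi - lo) / n
total = sum(unnorm_posterior(lo + i * h) for i in range(n)) * h

# Finite, so the posterior is proper: total is close to sqrt(2*pi).
print(round(total, 3))  # 2.507
```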

Statisticians sometimes use improper priors as uninformative priors. For example, if they need a prior distribution for the mean and variance of a random variable, they may assume p(m, v) ∝ 1/v (for v > 0), which would suggest that any value for the mean is "equally likely" and that a value for the positive variance becomes "less likely" in inverse proportion to its value. Many authors (Lindley, 1973; De Groot, 1937; Kass and Wasserman, 1996) warn against the danger of over-interpreting those priors since they are not probability densities. The only relevance they have is found in the corresponding posterior, as long as it is well-defined for all observations. (The Haldane prior is a typical counterexample.)

Examples of improper priors include:

Beta(0,0), the beta distribution for α = 0, β = 0.

The uniform distribution on an infinite interval (i.e., a half-line or the entire real line).

The logarithmic prior on the positive reals.

Carlin, Bradley P.; Louis, Thomas A. (2008). Bayesian Methods for Data Analysis (3rd ed.). CRC Press.

This prior was proposed by J. B. S. Haldane in "A note on inverse probability", Mathematical Proceedings of the Cambridge Philosophical Society 28, 55–61, 1932. See also J. Haldane, "The precision of observed values of small frequencies", Biometrika, 35:297–300, 1948, doi:10.2307/2332350, JSTOR 2332350.

Jaynes (1968), p. 17; see also Jaynes (2003), chapter 12. Note that chapter 12 is not available in the online preprint but can be previewed via Google Books.

Christensen, Ronald; Johnson, Wesley; Branscum, Adam; Hanson, Timothy E. (2010). Bayesian Ideas and Data Analysis: An Introduction for Scientists and Statisticians. Hoboken: CRC Press. p. 69.

Rubin, Donald B.; Gelman, Andrew; Carlin, John B.; Stern, Hal (2003). Bayesian Data Analysis (2nd ed.). Boca Raton: Chapman & Hall/CRC. ISBN 1-58488-388-X. MR 2027492.

Berger, James O. (1985). Statistical Decision Theory and Bayesian Analysis. Berlin: Springer-Verlag. ISBN 0-387-96098-8. MR 0804611.

Berger, James O.; Strawderman, William E. (1996). "Choice of hierarchical priors: admissibility in estimation of normal means". Annals of Statistics 24 (3): 931–951. doi:10.1214/aos/1032526950. MR 1401831. Zbl 0865.62004.

Bernardo, Jose M. (1979). "Reference Posterior Distributions for Bayesian Inference". Journal of the Royal Statistical Society, Series B 41 (2): 113–147. JSTOR 2985028. MR 0547240.

Berger, James O.; Bernardo, José M.; Sun, Dongchu (2009). "The formal definition of reference priors".

Jaynes, Edwin T. (Sep 1968). "Prior Probabilities". IEEE Transactions on Systems Science and Cybernetics 4 (3): 227–241. doi:10.1109/TSSC.1968.300117. Reprinted in E. T. Jaynes: Papers on Probability, Statistics, and Statistical Physics. Boston: Kluwer Academic Publishers. pp. 116–130. ISBN 90-277-1448-7.

Jaynes, Edwin T. (2003). Probability Theory: The Logic of Science. Cambridge University Press. ISBN 0-521-59271-2.

Williamson, Jon (2010). "Review of Bruno di Finetti. Philosophical Lectures on Probability". Philosophia Mathematica 18 (1): 130–135. doi:10.1093/philmat/nkp019.


This page was last edited on 25 June 2018, at 19:36


Year to Date in Previous/Prior Year

YTD is easy with standard calendars, but the previous year's equivalent is a non-obvious formula.

What's this, you say?? Yeah, it's Rob, and I'm here with an actual formula post! Woo woo! Starting the year off right, as they say.

We've got a lot going on early this year, so expect to see a number of announcements and reminders in the coming weeks (one of which later THIS week, and it's super exciting for me at least).

So with that in mind, we're ALSO renewing our commitment to meaty posts: the stuff that brings you here in the first place.

Hey, this is a pretty common question, and a pretty common need. But there's no DAX function that just DOES this.

For instance, getting a Year to Date calculation for, say, Total Sales, is pretty straightforward:

CALCULATE([Total Sales], DATESYTD(Calendar[Date]))

Very straightforward, and it gives us this:

Even the magical DATESYTD function comes with pre-requisites:

1) Your business must run on what we call a Standard calendar. What's a standard calendar, you ask? Well, the key question here is: in your business, what qualifies as a "month"? For instance, if you want to compare April of this year to April of last year, what defines April? Can you just look at a regular wall calendar as a reference for what defines April? If so, you have a Standard calendar, even if your year ends in June rather than December. But if not, and your "months" aren't really months at all, but you have "periods" instead, well, you have a Custom Calendar.

DATESYTD will not help you if you have a Custom calendar, and neither will this blog post, sadly, because for the moment, I only have time to cover the standard calendar version.

(For more on custom calendars in the meantime, try our newly-released 2nd edition book, which has been updated quite a bit since the best-selling 1st edition.)

2) You must have a well-constructed Calendar/Dates table. And what qualifies as well-constructed? Turns out this is covered quite succinctly in the reference card, which you can download for free, so I won't go into it here.

2a) OK, and you ALSO need to then USE that Calendar/Dates table on your pivot. There's a great post on this topic, so make sure you glance over it if you aren't familiar with this concept.

Back to the point of this post: it turns out that you can nest a DATESYTD inside of a DATEADD!

CALCULATE([Total Sales], DATEADD(DATESYTD(Calendar[Date]),-1,Year))

Bam! That's What We're Talking About, Willis!

Basically, what happens in this formula is that the DATESYTD runs first, and finds the dates in the calendar starting from the beginning of the year up through the max date in the current filter context of the pivot. Then that range of dates is handed to DATEADD, which then shifts that range of dates back by one year.

A simple formula once you know you can nest these two.
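To make the two-step mechanics concrete outside of DAX, here is a small Python sketch (with made-up sales data, not from the post) that mimics DATESYTD picking the year-to-date window and DATEADD shifting it back a year:

```python
from datetime import date

# Hypothetical daily sales keyed by date (illustration only).
sales = {
    date(2015, 1, 10): 100, date(2015, 3, 5): 150, date(2015, 7, 20): 80,
    date(2016, 1, 12): 120, date(2016, 3, 9): 90,
}

def total_sales(start, end):
    """Sum sales for dates in [start, end], like CALCULATE over a range."""
    return sum(v for d, v in sales.items() if start <= d <= end)

# Step 1 (the DATESYTD part): Jan 1 up to the max date in filter context.
max_in_context = date(2016, 3, 31)
ytd_start = date(max_in_context.year, 1, 1)

# Step 2 (the DATEADD ... -1, YEAR part): shift the window back a year.
prev_start = ytd_start.replace(year=ytd_start.year - 1)
prev_end = max_in_context.replace(year=max_in_context.year - 1)

print(total_sales(ytd_start, max_in_context))  # 2016 YTD: 210
print(total_sales(prev_start, prev_end))       # prior-year YTD: 250
```

In real DAX the shift is applied date-by-date rather than to the window's endpoints, but over a contiguous range the effect is the same.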

Let's say you're halfway through a given month. For example, pretend today's date is July 15, 2016, and I want to know how we're doing, year to date, versus the same time period of 2015.

In that case, I don't want ALL of July 2015 included in the comparison! That would be unfair to 2016, because I'd have less time in 2016 than the comparison period in 2015.

But if my Calendar table contains all of the dates for July 2016 (it goes slightly into the future and already has rows for July 16, 17, and so on), well, guess what? The formula above WILL include the entirety of July 2015 in the comparison.

I've never liked this about the built-in date intelligence functions, and this is why I always prefer Calendar tables to be "trimmed", meaning the latest date included in the Calendar table matches the latest date for which I have real data (like Sales transactions, for instance). A trimmed calendar avoids this problem, and limits my 2015 comparison period to properly match my partial month in 2016.

But for some reason, every other DAX pro I've talked to prefers Calendar tables to run through the end of the current year. I don't quite get why they prefer it that way, but hey, I know when I'm outnumbered that I should at least entertain the possibility that I am wrong. So if you follow the advice of others, you will not have trimmed calendars (which are admittedly tricky to pull off, since they need to update every day as you get new data), and you will need another fix.

Never fear, in this case you just nest a third function:

CALCULATE([Total Sales], DATEADD(FILTER(DATESYTD(Calendar[Date]), [Total Sales] > 0), -1, Year))

Basically, all we did was add an intermediate step, via the FILTER function.

Find the range of dates YTD according to your Calendar table (which may include dates for which you don't yet have data).

Then FILTER those dates to exclude dates for which you do NOT have data: this trims those future dates, like July 16 in our example.

NOW shift those remaining dates back one year.

Of course, even THIS has problems, because if you have legitimate historical dates for which there is no data, the FILTER will remove those too, and then your DATEADD will blow up on you.

Geez, wouldn't it be better to just have a trimmed calendar, folks?

LASTNONBLANK(Calendar[Date], [Total Sales])

One of the founding engineers behind Power Pivot during his 14-year career at Microsoft, and creator of the world's first cloud Power Pivot service, Rob is one of the foremost authorities on self-service business intelligence and next-generation spreadsheet technology.

I was just working on something similar today and used the TOTALYTD time intelligence function to get the period YTD, e.g. for a company with year end 31/03/yy that would be:

[Fiscal YTD Sales] := TOTALYTD ( [Sales Amount], Date[Date], "31/03" )

The "31/03" is because my local language settings are UK. You should amend this to your own settings.

Without using DAX built-in time intelligence functions this could have also been done for the current period, using DATESYTD as in your post, like this:

[Fiscal YTD Sales] := CALCULATE ( [Sales Amount], DATESYTD ( Date[Date], "31/03" ) )

Then with this base measure [Fiscal YTD Sales] I then used the function SAMEPERIODLASTYEAR

[PY Fiscal YTD Sales] := CALCULATE ( [Fiscal YTD Sales], SAMEPERIODLASTYEAR ( Date[Date] ) )

Is this correct, and if so, how does it compare with the version in your blog, and why?

PS: for beginners, link the date in your Sales fact table to the field [Date] of a full Date dimension table, which should be a sequential list of all the daily dates within your dataset range.

Hi David, see my reply to Cwayne below. TOTALYTD and DATESYTD are both time intel functions, and I believe them to be identical under the hood. (TOTALYTD still ends up using a CALCULATE behind the scenes).

I also personally dislike using a CALCULATE function on top of another, because the impact on filter context becomes murkier. I prefer to do the Last Year YTD measure all at once, rather than piling CALCULATE on CALCULATE. Stylistic perhaps. If your approach is working for you, I encourage you to continue 🙂

Rob, you mentioned not liking a CALCULATE inside a CALCULATE. I have to admit I actually do that, mainly when one measure is a natural extension of another.

E.g. if I already have a [Bike Sales] measure that uses CALCULATE, and someone comes along and asks me for a Green Bike Sales measure, I would typically do CALCULATE([Bike Sales], [Color]="Green") rather than doing CALCULATE([Sales], [Product]="Bike" && [Color]="Green").

Obviously a simple example, but there definitely seems to be a pro/con trade-off here. By wrapping my green bike sales around [Bike Sales], I ensure that any changes to the definition of [Bike Sales] propagate down to derived formulas (portable formulas!). But on the flip side, I am layering CALCULATE inside CALCULATE in a way I would never do in a single formula. I haven't run into performance issues, but your comment has me thinking about what the best practice here might be (if indeed there is one).

Leonard, those cases don't trigger my sense of unease. Column=Value filters, the kind I call "simple" filters: the interaction of piling those on top of each other is pretty clear to me. I would do exactly the same thing as you describe, in the case of Green Bike Sales.

What I *don't* like to do is perform advanced manipulation of filter context in multi-layered fashion. A DATESYTD measure nested inside a SAMEPERIODLASTYEAR measure, for instance. Even that, though, is likely just a personal quirk of mine rather than a best practice rooted in good reasons.

Thank you for the awesome post. Question: Why refrain from using TOTALYTD & SAMEPERIODLASTYEAR functions?

The thing about the time intel functions is that many of them are just re-skinned versions of the others. TOTALYTD(measure,) , as far as I can tell, is exactly the same as CALCULATE(measure, DATESYTD())

To keep things simple for myself and students, I tend to stick with the more primitive versions of the functions, but that's a stylistic choice as much as anything.

And sometimes these alternate functions contain interesting, quirky, extra logic that I can't clearly see because it's hiding behind the scenes. I think PARALLELPERIOD was one of them. SAMEPERIODLASTYEAR is probably more straightforward, but again I haven't personally used it in a long time. Stylistic choice, again, and I see nothing wrong with using the other functions. Primitives are just more my style.

Rob, in the [Prev Yr YTD Sales Trimmed] there is an issue with the use of LASTNONBLANK. Such a function is a table function that performs a context transition, so calling it within an iterator (the FILTER) evaluates only one date at a time. In practice, by writing

LASTNONBLANK ( Calendar[Date], [Total Sales] )

you are getting the same result as if you write:

IF ( NOT ISBLANK ( [Total Sales] ), Calendar[Date] )

Maybe you get the same result as you want in this way, but if you have some date with a blank value in the middle of two days with some value, the DATEADD function will not work (because it does not support non-contiguous selections)

Thanks Marco, this is interesting, and always glad to see your name in the comments feed! A few quick comments/thoughts:

1) Yeah, serves me right for leaving the world of trimmed calendars and trying to do things the other way, all at the end of a post that I planned to end long before I dragged myself into all of the special cases.

2) That said, I did test the formula and it did seem to be working properly; I will try that again when I get home.

3) Are you sure you guys don't want to partner up in some way where you guys are the Graduate level and I'm the Intro level? 🙂

Let me know if you find the issue. I tried the formula and it was having the issue I described. For the rest: Alberto will be at MS Data Insight Summit and I will be at PASS BA Conference, so we'll have time to talk 🙂

You can calculate the last relevant date using a variable to prevent the non-contiguous date range.

VAR LastRelevantDate = LASTNONBLANK ( Calendar[Date], [Total Sales] )

RETURN FILTER (
    DATESYTD ( Calendar[Date] ),
    Calendar[Date] <= LastRelevantDate
)

I tried Rob's formulas and LYMTD and LYYTD work fine, but when I update the report my LYQTD brings the error you mentioned:

The query did not run or the Data Model could not be accessed. Here's the error message we got:

MdxScript(Model) (8,5) Calculation error in measure Data[LY QTD Sales]: Function DATEADD only works with contiguous date selection

2) Write your own DAX without using Time Intelligence functions (for example

3) Use Power BI (or maybe a recent version of Excel 2016, which shouldn't have that issue)

First of all, thank you for your post as it really helped me out when trying to deal with my report.

LYMTD and LYYTD work fine, but when I update the report my LYQTD brings the error Marco might be talking about:

The query did not run or the Data Model could not be accessed. Here's the error message we got:

MdxScript(Model) (8,5) Calculation error in measure Data[LY QTD Sales]: Function DATEADD only works with contiguous date selection

Rob, the subject was just what I was looking for but using your solution I did get the non-contiguous error. Do you have a different solution I can try?

While I have not tried this, the following should work:

SAMEPERIODLASTYEAR ( DATESBETWEEN ( Calendar[Date], STARTOFYEAR ( Calendar[Date] ),
LASTNONBLANK ( Calendar[Date], [Total Sales] ) ) )

You may want to enter some error testing/handling to be sure LASTNONBLANK returns a value because if it returns blank DATESBETWEEN will extend the dates returned to the last date in your calendar table. Probably not what you want.

Also be sure to "Mark as Date Table" your calendar table so the time intelligence functions work correctly.

Personally I would never use a trimmed calendar, especially if I am using the time intelligence functions. They work by shifting the Calendar[Date] column values visible in the filter context, not by performing math. So with a trimmed calendar you would need to be careful not to shift into the non-existent future or past. Since they do shift dates in filter context, you need to include all the dates for the prior year in your calendar table. I also think getting used to and using the specialized time intelligence functions improves code readability and therefore maintenance, even though they are mostly syntactic sugar built on variations of the date function primitives. Having said that, I agree that it is a personal style preference.

Hi Matthew, I seem to have gotten your formula working; it seems to be just what I needed. Thank you.

I'm far from an expert, but days in my calendar table were making the formula work strangely, and I believe it had to do with the LASTNONBLANK part you discussed above.

I added the IF ISBLANK portion as per below, and it seems to have eliminated the data appearing for a previous year that shouldn't have. Seems to be working as intended for me right now.

Prev Yr YTD Sales Trimmed:=CALCULATE([Total Sales],

DATESBETWEEN(dCalendar[date],STARTOFYEAR(dCalendar[Date]),

IF(ISBLANK(LASTNONBLANK(dCalendar[Date],[Total Sales])),STARTOFYEAR(dCalendar[Date]),LASTNONBLANK(dCalendar[Date],[Total Sales])))))

How can we calculate YOY% change if we don't have a date field? I have a year and month field as my only time fields in a given data set. Can we calculate YOY change with Power Pivot?

What if, historically, you only have transactions on a Saturday in November and December but not throughout the rest of the year. Will those blank dates throw this off?

I was actually working on the same issue last week and ended up going with something that looked like this:

DATESBETWEEN ( Dates[Date],
SAMEPERIODLASTYEAR ( FIRSTDATE ( Dates[Date] ) ),
SAMEPERIODLASTYEAR ( LASTDATE ( Dates[Date] ) ) )

Not sure if this is efficient or the best way to do it, but it worked for me.

Syntactically it will work, but logically it doesn't solve the point Rob is bringing up. The code above would include the last date of the current filter context, which (unless you are using one of Rob's trimmed calendars) will always be the last day of the month. Rob wants more apples-to-apples granularity, meaning the previous-year comparison covers the same exact number of days as the current YTD total (mid-period).

However, the validity of Rob's comparison for periods prior to the current one is debatable. The formula he wrote and the one I posted above are logically questionable and probably not what would be desired by management.

The DATESBETWEEN combined with SAMEPERIODLASTYEAR is the clearest for me and works for non-standard calendar years like a fiscal year. I just adapted it for a full previous year's (12 months) sales, replacing FIRSTDATE & LASTDATE with STARTOFYEAR & ENDOFYEAR respectively.

Maybe not the most efficient for the DAX engine, but good.

I just had problems with PARALLELPERIOD, which appears to only work on calendar years and not fiscal years. Has anyone managed to use PARALLELPERIOD for non-standard calendar years?

Cool dive into the workings of DAX. Conditional formatting on the result definitely added a refreshing pizazz to the presentation!

Rob, the [Last YTD/MTD Sales Trimmed] formula is EXACTLY what I'm looking for right now, but when I try your formula I'm getting the dreaded "DATEADD only works with contiguous date selections" error. Any ideas what's causing the error?

You need a separate date table. If you're in Excel 365 or 2016, try =CALENDARAUTO()

Great info, very helpful. I created a budget vs. actual report using Power Pivot. I manually imported the budget data, and through a series of relationships I can view both the actual and budget by month, department and account. The issue arises when I try to show the variance between the actual and the budget. If I manually create a Pivot Table, I can use the calculated field function, which will be budget minus actual, and I get the variance.

I am struggling here so any help would be greatly appreciated.

What happens if last year was a leap year? Will the extra day in February just work or be left out? The official reference says DATEADD "Returns a table that contains a column of dates, shifted either forward or backward in time by the specified number of intervals from the dates in the current context." So there is no Feb 29 in the current context.

How do we use Quarter to Date. Previous quarter to date and Previous year same quarter to date?

I am using this for my YTD calculations. Anything that i could modify to get the quarter values.

DATESBETWEEN(Dates[Dates],DATEADD(LASTDATE(Year_Period[Next_Month_Start_Date]),MAX(Year_Period[Fiscal_Period])*-1,MONTH),LASTDATE(Year_Period[Month_End_Date]))

Why not use time intelligence functions? Much easier to read and maintain:

QTD := TOTALQTD ( [Total Sales], Calendar[Date] )

Previous QTD := TOTALQTD ( [Total Sales], PREVIOUSQUARTER(Calendar[Date] ) )

Previous Year Same Quarter := TOTALQTD ( [Total Sales], SAMEPERIODLASTYEAR( Calendar[Date] ) )

I've been banging on this exact question for a couple of days, wondering what type of convoluted logic I needed (and creating some horrific Frankendax experiments in the process).

Thank you for surfacing the problems and solutions with such clear and helpful explanations.

Personally, I'm a big fan of all the Custom Calendar and calculated-column tricks on Custom Calendars. A little more setup on the backside, but then writing the Previous YTD Sales or any other similar function is a snap.

My Custom Calendar Table includes a column for the number of the Day in the year (Day of Year) (e.g. today May 8th is the 129th Day of 2016). First I created a measure that returns the Day of Year for the latest sale of the current year,

Latest Day of This Year:=CALCULATE(LASTNONBLANK(Calendar[Day of Year], [Total Sales]), Calendar[YEAR]=YEAR(TODAY()))

then a calculated column on my Calendar table: Is Prev YTD Date = IF ( YEAR ( TODAY () ) - 1 = [YEAR] && [Day of Year] <= [Latest Day of This Year], TRUE, FALSE )

Then my Previous YTD Sales measure is simply = CALCULATE([Total Sales], Calendar[Is Prev YTD Date])

(I used to include an = TRUE, as in Calendar[Is Prev YTD Date] = TRUE, but found I didn't even need to include that, since the column itself returns only TRUE or FALSE values.)

(My fiscal year starts Jan 1, but this could be easily modified to account for a Fiscal Year start date other than Jan 1)

What do you think of this approach? My understanding is that using calculated columns on Lookup Tables (Like the Calendar table) isnt significantly detrimental to performance or size if the cardinality on the calculated field is low. Am I wrong about this? I used to wrestle (and usually lose!) with all the time intelligence functions until I started this approach and now I have tons of calculated columns on my Calendar Table for all sorts of custom time periods that I need and then my measures are simply = CALCULATE([Total Measure], Calendar[Custom Time Window]).

I realized that this approach works as long as date fields aren't on my axes. The rows on the table in my report are my list of customers, and some of them don't have any YTD sales, so I was running into the problems mentioned above with non-contiguous dates. I use Today's Day of Year instead of Latest Day of This Year, via CALCULATE ( MAX ( Calendar[Day of Year] ), Calendar[Date] = TODAY () ), when I need the report to include dates on rows. But perhaps it could still work if a FILTER ( ALL (... or some such approach were used in the Latest Day of This Year measure? I personally haven't had a need for this, but if I did, that's where I'd start playing around.

Interesting, but if you already have a Calendar table with contiguous dates for the range you need to report on (this year plus prior and future dates), I'm not sure how that is easier than:

YTD Total Sales := TOTALYTD ( [Total Sales], Calendar[Date] ), or the equivalent: CALCULATE ( [Total Sales], DATESYTD ( Calendar[Date] ) )

YTD Total Sales Prior Year := TOTALYTD ( [Total Sales], SAMEPERIODLASTYEAR ( Calendar[Date] ) ), or the equivalent: CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( DATESYTD ( Calendar[Date] ) ) )

I think any custom date shifting you want to do is best put in the measure rather than in the table. But that's a personal preference, perhaps.

I have a problem with a balance sheet. I have some account balances as of 31/12/2012. Now I have to find a formula that gives me the amount for the balance sheet (BS) as of 31/05/2015: I have to sum all account amounts from 31/12/2012 through 31/05/2015. Which formula is best to achieve this?
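A common pattern for balance-sheet style running totals (a sketch only; Accounts and Dates are placeholder table names for your model) is to sum everything up to the last date visible in the current context:

```dax
-- Running balance: ignore the date filter, then re-apply "up to and including
-- the last visible date", so opening balances from 31/12/2012 are included
Balance to Date :=
CALCULATE (
    SUM ( Accounts[Amount] ),
    FILTER (
        ALL ( Dates[Date] ),
        Dates[Date] <= MAX ( Dates[Date] )
    )
)
```

With 31/05/2015 selected on a slicer (or on rows), this measure sums every transaction from the start of the data through that date, which is the usual balance-sheet semantics.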

I have a slightly different requirement. I work in a long-running project environment, say 3-4 year projects, and we monitor completion to date, with t0 being in the earlier years. I need to calculate completion to date. Is there a way I can do this?

Hello 🙂 Do you have the Excel files for these examples, by any chance? Thanks.

If you are a brave soul, the measure formulas can be applied in the workbook ch14_TimeIntelligence.xlsx, available in the downloads for the book Power Pivot and Power BI. If nobody posts more, at least this can get you started.

That's a great post, Rob! Exactly what I was looking for to get my Same Date Last Year figure. However, when I drop the measure into my pivot table, I get an error message: "Function DATEADD only works with contiguous date selections."

CALCULATE ( [SUM RN], DATEADD ( FILTER ( DATESYTD ( DateDimension[Date] ), DateDimension[Date] <= LASTNONBLANK ( DateDimension[Date], [SUM RN] ) ), -1, YEAR ) )

There is a PowerPivotPro forum where questions like yours can get answers:

Hey Rob, I have tried your way of getting prior year-to-date through a specific date in a month (August 9, 2015), but it only works partially. I'm comparing this year's YTD sales (through August 9, 2016) and prior YTD sales (through August 9, 2015) day by day, and it works well; it's just that at the last row (August 9), it shows me the sales for the whole of August 2015, not the sales through August 9, 2015 as expected. I modified the first part of your function, because if I use the [Total Sales] measure, I don't just get the data incrementally through August 9, I also get a total value for the 2015 and 2016 years.

Prev Yr YTD Sales Trimmed = CALCULATE ( SUM ( Sales[Sales] ), Calendar[Date], DATEADD ( FILTER ( DATESYTD ( Calendar[Date] ), Calendar[Date] <= LASTNONBLANK ( Calendar[Date], Sales[Total Sales] ) ), -1, YEAR ) )

Why not simply identify the current date, identify the first date of the current year, do the same for the prior year, and then use the DATESBETWEEN function to define each YTD period accordingly?
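That suggestion might be sketched roughly like this (placeholder measure names; the "as of" date is taken here as the last sale date in the current context):

```dax
Last Sale Date := MAX ( Sales[Date] )

-- Prior-year YTD built explicitly from DATESBETWEEN, per the suggestion above
Prev Yr YTD Sales v2 :=
CALCULATE (
    [Total Sales],
    DATESBETWEEN (
        Calendar[Date],
        DATE ( YEAR ( [Last Sale Date] ) - 1, 1, 1 ),
        DATE ( YEAR ( [Last Sale Date] ) - 1, MONTH ( [Last Sale Date] ), DAY ( [Last Sale Date] ) )
    )
)
```

One caveat to watch: DATE() rolls overflow days forward, so if the latest sale falls on Feb 29 of a leap year, DATE ( year - 1, 2, 29 ) evaluates to March 1 of the prior year, quietly extending the comparison window by a day.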

I hit a wall with one particular calculation of YoY growth rates (GR).

I have a measure which calculates GRs, and I want to figure out:

how to add a slicer with ranges of GR (e.g. 5-10%) to quickly filter only the outperforming months?

what if I have a list of 100 products sold each month, and I would like to have a column with the number of products for which YoY GR is in the range of the slicer? Say for Jun '16, 25 products had a YoY GR of 5-10%; having this number of 25, I could double-click on it to see the exact items. Useful.

Would appreciate if you could share your thoughts.
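One common way to attack both parts of this question (a sketch, not from the article: GRBands is an assumed disconnected table with Low and High columns that drives the slicer, and [YoY GR] is your existing growth-rate measure) is a banding measure:

```dax
-- Count products whose YoY growth rate falls inside the band selected
-- on the disconnected GRBands slicer
Products in GR Band :=
COUNTROWS (
    FILTER (
        VALUES ( Sales[Product] ),
        [YoY GR] >= MIN ( GRBands[Low] )
            && [YoY GR] < MAX ( GRBands[High] )
    )
)
```

Because GRBands has no relationship to the fact table, selecting a band only changes the MIN/MAX thresholds; the measure is re-evaluated per product via context transition, so for Jun '16 with the 5-10% band selected it would return your count of 25.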

It seems I'm encountering the issue you mentioned, where my YTD comparisons sum to the end of the period/month when the current date is mid-month. I created my date table using CALENDARAUTO() in this report. It seems I should have used a Power Query date table, so that the first and last dates in my date table match the respective dates in my fact table. Is there a way I can trim my current date table without creating a new date table from scratch?
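If this is a Power BI calculated table, one option (a sketch; Dates and Sales are placeholder names) is simply to edit the table's defining expression, swapping CALENDARAUTO() for CALENDAR() bounded by the fact table, so the table name and its relationships stay intact:

```dax
-- Replaces CALENDARAUTO(): spans exactly the fact table's date range
Dates =
CALENDAR ( MIN ( Sales[Date] ), MAX ( Sales[Date] ) )
```

The trade-off to keep in mind is that trimming the calendar to partial years can itself upset some time intelligence functions that expect complete years, so the trimmed-measure approach from the article may still be the safer fix.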

Your article helped me solve a calculation I was having trouble with.

I just found out I still have an issue come 2/1/2017. For this particular report we have a rolling 13 months of data, so on Feb 1, 2017, when this report runs, I am going to have an issue with the Rev Adj % calc in my pivot table when I select any date for 2016 on the date slicer. How can I create a calculation that is = DIVIDE ( [Month chosen in slicer] - [Prior year same month as selected in slicer], [Prior year same month as selected in slicer] )?

The Rev Adj % calc below only works in my pivot table if I select a month in the current year on the date slicer.

Revenue YTD:=CALCULATE([Total Revenue],DATESYTD(D_EDMDATE[CAL_MTH_NM]))

Revenue Prior Year:=CALCULATE([Revenue YTD],SAMEPERIODLASTYEAR(D_EDMDATE[CAL_MTH_NM]))

Rev Adj %:=DIVIDE([Revenue YTD]-[Revenue Prior Year],[Revenue Prior Year])

I had a brain freeze on the comment above; please don't spend time on it. It actually does work, I don't know what I was thinking. You can delete it. I think I had changed the date WHERE clause, and that is why it wasn't working. I fixed it, and I can now slice 12/1/2015 and it works fine.

Rob, thank you for this. I was looking for a solution for long hours until I found your post, and simply put, you are the best! 🙂 A simple formula which does exactly what I need. Brilliant.

Since my YTD periods always commence with the first of the year, I added a true/false column to my date table, calculated as InYTD = IF ( FORMAT ( [Date], "MMDD" ) <= FORMAT ( [FactTable AsOf Date], "MMDD" ), TRUE, FALSE ). The AsOf date is the max date of the input fact table used. That lets me add one more quick selection filter of Dates[InYTD] = TRUE (or FALSE) when building other measures, and it only gets recalculated when the tables are refreshed.

Thank you for your reply! Taking into account that I have the complete Calendar table and the latest Excel 2016 build (and that's why this is very strange), I think I have only one option: the second one from your list.
