Christmas Day (Trading) Act 2004
Parliament of the United Kingdom

Long title: An Act to prohibit the opening of large shops on Christmas Day and to restrict the loading or unloading of goods at such shops on Christmas Day.
Statute book chapter: 2004 c 26
Territorial extent: England and Wales[2]
Royal Assent: 28 October 2004
Commencement: 9 December 2004[3]
The Christmas Day (Trading) Act 2004 (c 26) is an Act of the Parliament of the United Kingdom. It prevents shops larger than 280 m² (about 3,000 sq ft) from opening on Christmas Day in England and Wales. Shops smaller than that limit are unaffected.
The Act was introduced to the House of Commons by Kevan Jones, MP for North Durham, as a Private Member's Bill on 7 January 2004.
The aim of the Act was to keep Christmas Day a "special" day on which all major retailers would be closed. Although it was traditional for major retailers to close on 25 December, some, such as Woolworths, began to open some stores in the late 1990s. Both religious groups and shopworkers' unions opposed Christmas opening, putting pressure on the Government to legislate against the practice.
In 2006, the Scottish Parliament debated a similar law that would apply to shops in Scotland. A key difference was the proposal for the Scottish law to apply on New Year's Day as well.
Christmas
[Image: A depiction of the Nativity with a Christmas tree backdrop.]
Also called: Noel, Nativity, Yule, Xmas
Observed by: Christians; many non-Christians[1]
Type: Christian, cultural
Significance: Traditional birthday of Jesus
Date: December 25 (alternatively, January 6, 7 or 19)[2][3][4] (see below)
Observances: Church services, gift giving, family and other social gatherings, symbolic decorating
Related to: Christmastide, Christmas Eve, Advent, Annunciation, Epiphany, Baptism of the Lord, Yule
Christmas or Christmas Day (Old English: Crīstesmæsse, literally "Christ's mass") is an annual commemoration of the birth of Jesus Christ,[5][6] celebrated generally on December 25[2][3][4] as a religious and cultural holiday by billions of people around the world. A feast central to the Christian liturgical year, it closes the Advent season and initiates the twelve days of Christmastide.[7] Christmas is a civil holiday in many of the world's nations,[8][9][10] is celebrated by an increasing number of non-Christians,[1][11][12] and is an integral part of the Christmas and holiday season.
The precise year of Jesus' birth, which some historians place between 7 and 2 BC, is unknown.[13][14] By the early-to-mid 4th century, Western Christianity had placed Christmas on December 25, a date later adopted in the East.[15][16] The date may initially have been chosen to fall exactly nine months after the Annunciation, the date on which Christians believe Jesus was conceived[17] (it is also the date of the southern solstice, i.e., the Roman winter solstice); a solar connection is possible because Christians consider Jesus to be the "Sun of righteousness" prophesied in Malachi 4:2.[17][18][19][20][21]
The original date of the celebration in Eastern Christianity was January 6, in connection with Epiphany, and that is still the date of the celebration for the Armenian Apostolic Church and in Armenia, where it is a public holiday. As of 2012, there is a difference of 13 days between the modern Gregorian calendar and the older Julian calendar. Those who continue to use the Julian calendar or its equivalents thus celebrate December 25 and January 6 on what for the majority of the world is January 7 and January 19. For this reason, Ethiopia, Russia, Ukraine, Serbia, the Republic of Macedonia, and the Republic of Moldova celebrate Christmas on what in the Gregorian calendar is January 7; all the Greek Orthodox Churches celebrate Christmas on December 25.
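The growing gap between the two calendars follows from simple arithmetic: the Gregorian calendar drops three Julian leap days every 400 years (in century years not divisible by 400). A minimal Python sketch of that rule (illustrative only, not part of the original article; the function name is our own):

```python
def julian_gregorian_offset(julian_year: int) -> int:
    """Days the Gregorian calendar runs ahead of the Julian calendar,
    for dates from March of the given Julian year onward."""
    century = julian_year // 100
    # The Gregorian calendar drops leap days in century years not divisible
    # by 400; the constant 2 anchors the count to the 10-day gap of the
    # 1582 Gregorian reform.
    return century - century // 4 - 2

assert julian_gregorian_offset(1582) == 10  # gap at the Gregorian reform
assert julian_gregorian_offset(2012) == 13  # Julian Dec 25 -> Gregorian Jan 7
assert julian_gregorian_offset(2100) == 14  # from 2100 the gap widens a day
```

On this rule, Julian December 25 currently falls on Gregorian January 7 and Julian January 6 on Gregorian January 19, matching the dates given above; after 2100 each moves one day later.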
The popular celebratory customs associated in various countries with Christmas have a mix of pre-Christian, Christian and secular themes and origins.[22] Popular modern customs of the holiday include gift giving, Christmas music and caroling, an exchange of Christmas cards, church celebrations, a special meal, and the display of various Christmas decorations, including Christmas trees, Christmas lights, nativity scenes, garlands, wreaths, mistletoe, and holly. In addition, several closely related and often interchangeable figures, known as Santa Claus, Father Christmas, Saint Nicholas and Christkind, are associated with bringing gifts to children during the Christmas season and have their own body of traditions and lore.[23] Because gift-giving and many other aspects of the Christmas festival involve heightened economic activity among both Christians and non-Christians, the holiday has become a significant event and a key sales period for retailers and businesses. The economic impact of Christmas is a factor that has grown steadily over the past few centuries in many regions of the world.
The word "Christmas" originated as a compound meaning "Christ's mass". It is derived from the Middle English Cristemasse, which is from Old English Crīstesmæsse, a phrase first recorded in 1038.[6] Crīst (genitive Crīstes) is from Greek Khrīstos (Χριστός), a translation of Hebrew Māšîaḥ (מָשִׁיחַ), "Messiah"; and mæsse is from Latin missa, the celebration of the Eucharist. The form "Christenmas" was also historically used, but is now considered archaic and dialectal;[24] it derives from Middle English Cristenmasse, literally "Christian mass".[25] "Xmas" is an abbreviation of Christmas found particularly in print, based on the initial letter chi (Χ) in Greek Khrīstos (Χριστός), "Christ", though numerous style guides discourage its use;[26] it has precedent in Middle English Χρ̄es masse (where "Χρ̄" is an abbreviation for Χριστός).[25]
In addition to "Christmas", the holiday has been known by various other names throughout its history. The Anglo-Saxons referred to the feast as midwinter, "midwinter",[27][28] or, more rarely, as Nātiuiteð (from Latin nātīvitās below).[27][29] "Nativity", meaning "birth", is from Latin nātīvitās.[30] In Old English, Gēola ("Yule") referred to the period corresponding to January and December;[31] the cognate Old Norse Jól was later the name of a pagan Scandinavian holiday which merged with Christmas around 1000.[27] "Noel" (or "Nowell") entered English in the late 14th century and is from the Old French noël or naël, itself ultimately from the Latin nātālis (diēs), "(day) of birth".[32]
Christmas Day is celebrated as a major festival and public holiday in countries around the world, including many whose populations are mostly non-Christian. In some non-Christian countries, periods of former colonial rule introduced the celebration (e.g. Hong Kong); in others, Christian minorities or foreign cultural influences have led populations to observe the holiday. Countries such as Japan, where Christmas is popular despite there being only a small number of Christians, have adopted many of the secular aspects of Christmas, such as gift-giving, decorations and Christmas trees.
Countries in which Christmas is not a formal public holiday include China (except Hong Kong and Macao), Japan, Saudi Arabia, Algeria, Thailand, Nepal, Iran, Turkey and North Korea. Christmas celebrations around the world can vary markedly in form, reflecting differing cultural and national traditions.
Among countries with a strong Christian tradition, a variety of Christmas celebrations have developed that incorporate regional and local cultures. For Christians, participating in a religious service plays an important part in the recognition of the season. Christmas, along with Easter, is the period of highest annual church attendance.
In Catholic countries, people hold religious processions or parades in the days preceding Christmas. In other countries, secular processions or parades featuring Santa Claus and other seasonal figures are often held. Family reunions and the exchange of gifts are a widespread feature of the season. Gift giving takes place on Christmas Day in most countries; others exchange gifts on December 6 (Saint Nicholas Day) or January 6 (Epiphany).
In the earliest centuries of Christianity, no particular day of the year was associated with the birth of Jesus. Various dates were proposed: May 28, April 18 or 19, March 25, January 2, November 17 or 20.[33][6] When celebration on a particular date began, January 6 prevailed at least in the East;[34] but, except among Armenians (the Armenian Apostolic Church and the Armenian Catholic Church), who continue to celebrate the birth on January 6, December 25 eventually won acceptance everywhere.[33]
Today, whether or not the birth date of Jesus is on December 25 is not considered to be an important issue in mainstream Christian denominations;[35][36][37] rather, the belief that God came into the world in the form of man to atone for the sins of humanity is considered to be the primary purpose in celebrating Christmas.[35][36][37]
In the early 4th century, the church calendar in Rome placed Christmas on December 25 and set other holidays on solar dates: "It is cosmic symbolism...which inspired the Church leadership in Rome to elect the southern solstice, December 25, as the birthday of Christ, and the northern solstice as that of John the Baptist, supplemented by the equinoxes as their respective dates of conception. While they were aware that pagans called this day the 'birthday' of Sol Invictus, this did not concern them and it did not play any role in their choice of date for Christmas," according to modern scholar S.E. Hijmans.[38]
Around the year 386, John Chrysostom delivered a sermon in Antioch in favour of adopting the December 25 celebration in the East as well. He argued that the conception of Jesus (Luke 1:26) was announced during the sixth month of Elisabeth's pregnancy with John the Baptist (Luke 1:10-13), which he dated from the duties Zacharias performed on the Day of Atonement during the seventh month of the Hebrew calendar, Ethanim or Tishri (Lev. 16:29, 1 Kings 8:2), which falls in late September to early October.[6] That shepherds watched their flocks by night in the fields in winter is supported by the phrase "frost by night" in Genesis 31:38-40, and a special group known as the shepherds of Migdal Eder (Gen. 35:19-21, Micah 4:8) watched flocks pastured near Bethlehem for Temple sacrifice by night year-round.[39][40]
In the early 18th century, some scholars proposed alternative explanations. Isaac Newton argued that the date of Christmas, celebrating the birth of him whom Christians consider to be the "Sun of righteousness" prophesied in Malachi 4:2,[18] was selected to correspond with the southern solstice, which the Romans called bruma, celebrated on December 25.[41] In 1743, German Protestant Paul Ernst Jablonski argued Christmas was placed on December 25 to correspond with the Roman solar holiday Dies Natalis Solis Invicti and was therefore a "paganization" that debased the true church.[21] It has been argued that, on the contrary, the Emperor Aurelian, who in 274 instituted the holiday of the Dies Natalis Solis Invicti, did so partly as an attempt to give a pagan significance to a date already important for Christians in Rome.[42] In 1889, Louis Duchesne proposed that the date of Christmas was calculated as nine months after the Annunciation, the traditional date of the conception of Jesus.[43][17]
Eastern Orthodox national churches, including those of Russia, Georgia, Ukraine, Macedonia, Montenegro, Serbia and the Greek Patriarchate of Jerusalem, mark feasts using the older Julian calendar. December 25 on the Julian calendar currently corresponds to January 7 on the internationally used Gregorian calendar. However, other Orthodox Christians, such as the churches of Bulgaria, Greece, Romania, Antioch, Alexandria, Albania, Finland and the Orthodox Church in America, among others, began using the Revised Julian calendar in the early 20th century, which corresponds exactly to the Gregorian calendar.[4]
Church or section | Date | Calendar | Gregorian date | Note
---|---|---|---|---
Armenian Patriarchate of Jerusalem | January 6 | Julian calendar | January 19 | Correspondence between Julian January 6 and Gregorian January 19 holds until 2100; in the following century the difference will be one day more.
Armenian Apostolic Church and Armenian Catholic Church | January 6 | Gregorian calendar | January 6 |
Eastern Orthodox: Russia, Georgia, Ukraine, Macedonia, Montenegro, Serbia and the Greek Patriarchate of Jerusalem | December 25 | Julian calendar | January 7 |
Other Eastern Orthodox churches, including those of Bulgaria, Greece, Romania, Antioch, Alexandria, Albania, Finland and the Orthodox Church in America | December 25 | Revised Julian calendar | December 25 | Revised Julian calendar usage began in the early 20th century.
Coptic Orthodox Church of Alexandria | Koiak 29 (corresponding to Julian December 25 or 26) | Coptic calendar | January 7 or 8 | Because the Coptic calendar's leap day is inserted in what the Julian calendar considers September, the following Koiak 29 falls one day later than usual in the Julian and Gregorian calendars.
Ethiopian Orthodox Tewahedo Church | Tahsas 29 or 28 (corresponding to Julian December 25) | Ethiopian calendar | January 7 | After the Ethiopian insertion of a leap day in what for the Julian calendar is September, Christmas is celebrated on Tahsas 28 in order to maintain the exact interval of nine 30-day months and five days of the child's gestation.[44] The Eritrean Orthodox Tewahedo Church uses the same calendar but, like the Coptic Church, does not make this adjustment.
Western churches | December 25 | Gregorian calendar | December 25 |
Christians celebrate the birth of Jesus to the Virgin Mary as a fulfillment of the Old Testament's Messianic prophecy.[45] The Bible contains two accounts which describe the events surrounding Jesus' birth. Depending on one's perspective, these accounts either differ from each other or tell two versions of the same story.[46][47][48][49] These biblical accounts are found in the Gospel of Matthew, namely Matthew 1:18, and the Gospel of Luke, specifically Luke 1:26 and 2:40. According to these accounts, Jesus was born to Mary, assisted by her husband Joseph, in the city of Bethlehem.
According to popular tradition, the birth took place in a stable, surrounded by farm animals. A manger (that is, a feeding trough) is mentioned in Luke 2:7, where it states Mary "wrapped him in swaddling clothes and laid him in a manger, because there was no room for them in the inn" (KJV); and "She wrapped him in cloths and placed him in a manger, because there was no guest room available for them" (NIV). Shepherds from the fields surrounding Bethlehem were told of the birth by an angel, and were the first to see the child.[50] Popular tradition also holds that three kings or wise men (named Melchior, Caspar, and Balthazar) visited the infant Jesus in the manger, though this does not strictly follow the Biblical account. The Gospel of Matthew instead describes a visit by an unspecified number of magi, or astrologers, sometime after Jesus was born while the family was living in a house (Matthew 2:11), who brought gifts of gold, frankincense, and myrrh to the young child Jesus. The visitors were said to be following a mysterious star, commonly known as the Star of Bethlehem, believing it to announce the birth of a king of the Jews.[51] The commemoration of this visit, the Feast of Epiphany celebrated on January 6, is the formal end of the Christmas season in some churches.
Christians celebrate Christmas in various ways. In addition to this day being one of the most important and popular for the attendance of church services, there are other devotions and popular traditions. In some Christian denominations, children re-enact the events of the Nativity with animals to portray the event with more realism, or sing carols that reference the event. Some Christians also display a small re-creation of the Nativity, known as a Nativity scene or crèche, in their homes, using figurines to portray the key characters of the event. Prior to Christmas Day, the Eastern Orthodox Church practices the 40-day Nativity Fast in anticipation of the birth of Jesus, while much of Western Christianity celebrates the four weeks of Advent. The final preparations for Christmas are made on Christmas Eve, and many families' main observance of Christmas actually falls on the evening of that day.
A long artistic tradition of painted depictions of the Nativity has developed. Nativity scenes are traditionally set in a stable with livestock and include Mary, Joseph, the infant Jesus in the manger, the three wise men, the shepherds and their sheep, the angels, and the Star of Bethlehem.[52]
The practice of putting up special decorations at Christmas has a long history. In the 15th century, it was recorded that in London it was the custom at Christmas for every house and all the parish churches to be "decked with holm, ivy, bays, and whatsoever the season of the year afforded to be green".[53] The heart-shaped leaves of ivy were said to symbolize the coming to earth of Jesus, while holly was seen as protection against pagans and witches, its thorns and red berries held to represent the Crown of Thorns worn by Jesus at the crucifixion and the blood he shed.[54][55]
Nativity scenes are known from 10th-century Rome. They were popularised by Saint Francis of Assisi from 1223, quickly spreading across Europe.[56] Different types of decoration developed across the Christian world, depending on local tradition and available resources. The first commercially produced decorations appeared in Germany in the 1860s, inspired by paper chains made by children.[57] In countries where Nativity scenes are very popular, people compete to create the most original or realistic ones. Within some families, the pieces used to make the scene are considered a valuable family heirloom.
The traditional colors of Christmas are green and red.[58] White, silver and gold are also popular. Red symbolizes the blood of Jesus, which was shed in his crucifixion, while green symbolizes eternal life, and in particular the evergreen tree, which does not lose its leaves in the winter.[55][58]
The Christmas tree is considered by some to be a Christianisation of pagan tradition and ritual surrounding the winter solstice, which included the use of evergreen boughs, and an adaptation of pagan tree worship.[59] According to the eighth-century biographer Æddi Stephanus, Saint Boniface (c. 675–754), a missionary in Germany, took an axe to an oak tree dedicated to Thor and pointed instead to a fir tree, which he stated was a more fitting object of reverence because it pointed to heaven and had a triangular shape, symbolic, he said, of the Trinity.[60] The English phrase "Christmas tree" is first recorded in 1835[61] and represents an importation from the German language. The modern Christmas tree tradition is believed to have begun in Germany in the 18th century,[59] though many argue that Martin Luther began the tradition in the 16th century.[62][63]
From Germany the custom was introduced to Britain, first via Queen Charlotte, wife of George III, and then more successfully by Prince Albert during the reign of Queen Victoria. By 1841 the Christmas tree had become widespread throughout Britain.[64] By the 1870s, people in the United States had adopted the custom of putting up a Christmas tree.[65] Christmas trees may be decorated with lights and ornaments.
Since the 19th century, the poinsettia, a plant native to Mexico, has been associated with Christmas. Other popular holiday plants include holly, mistletoe, red amaryllis, and Christmas cactus. Along with a Christmas tree, the interior of a home may be decorated with these plants, along with garlands and evergreen foliage. The display of Christmas villages has also become a tradition in many homes during this season. The outside of houses may be decorated with lights and sometimes with illuminated sleighs, snowmen, and other Christmas figures.
Other traditional decorations include bells, candles, candy canes, stockings, wreaths, and angels. The display of wreaths and of candles in each window is a more traditional, subdued form of Christmas display: a wreath's concentric arrangement of leaves, usually from an evergreen, is meant to prepare Christians for the Advent season, while candles in each window signify the Christian belief that Jesus Christ is the ultimate light of the world.[66]
Christmas lights and banners may be hung along streets, music played from speakers, and Christmas trees placed in prominent places.[67] It is common in many parts of the world for town squares and consumer shopping areas to sponsor and display decorations. Rolls of brightly colored paper with secular or religious Christmas motifs are manufactured for the purpose of wrapping gifts. In some countries, Christmas decorations are traditionally taken down on Twelfth Night, the evening of January 5.
The earliest extant specifically Christmas hymns appear in 4th century Rome. Latin hymns such as Veni redemptor gentium, written by Ambrose, Archbishop of Milan, were austere statements of the theological doctrine of the Incarnation in opposition to Arianism. Corde natus ex Parentis (Of the Father's love begotten) by the Spanish poet Prudentius (d. 413) is still sung in some churches today.[68]
In the 9th and 10th centuries, the Christmas "Sequence" or "Prose" was introduced in North European monasteries, developing under Bernard of Clairvaux into a sequence of rhymed stanzas. In the 12th century the Parisian monk Adam of St. Victor began to derive music from popular songs, introducing something closer to the traditional Christmas carol.
By the 13th century, in France, Germany, and particularly Italy, under the influence of Francis of Assisi, a strong tradition of popular Christmas songs in the native language developed.[69] Christmas carols in English first appear in a 1426 work of John Awdlay, a Shropshire chaplain, who lists twenty-five "caroles of Cristemas", probably sung by groups of wassailers who went from house to house.[70]
The songs we know specifically as carols were originally communal folk songs sung during celebrations such as harvest tide as well as Christmas. Only later did carols begin to be sung in church. Traditionally, carols have often been based on medieval chord patterns, and it is this that gives them their uniquely characteristic musical sound. Some carols, like "Personent hodie", "Good King Wenceslas", and "The Holly and the Ivy", can be traced directly back to the Middle Ages; they are among the oldest musical compositions still regularly sung. "Adeste Fideles" (O Come, All Ye Faithful) appears in its current form in the mid-18th century, although the words may have originated in the 13th century.
Singing of carols initially suffered a decline in popularity after the Protestant Reformation in northern Europe, although some Reformers, like Martin Luther, wrote carols and encouraged their use in worship. Carols largely survived in rural communities until the revival of interest in popular songs in the 19th century. The 18th-century English reformer Charles Wesley understood the importance of music to worship. In addition to setting many psalms to melodies that were influential in the Great Awakening in the United States, he wrote texts for at least three Christmas carols, the best known of which was originally entitled "Hark! How All the Welkin Rings", later renamed "Hark! the Herald Angels Sing".[71]
Felix Mendelssohn wrote a melody adapted to fit Wesley's words. In Austria in 1818, Josef Mohr and Franz Xaver Gruber made a major addition to the genre when they composed "Silent Night" for the St. Nicholas Church, Oberndorf. William B. Sandys' Christmas Carols Ancient and Modern (1833) contained the first appearance in print of many now-classic English carols, and contributed to the mid-Victorian revival of the festival.[72]
Completely secular Christmas seasonal songs emerged in the late 18th century. "Deck the Halls" dates from 1784, and the American "Jingle Bells" was copyrighted in 1857. In the 19th and 20th centuries, African American spirituals and songs about Christmas, based in their tradition of spirituals, became more widely known. An increasing number of seasonal holiday songs were commercially produced in the 20th century, including jazz and blues variations. In addition, there was a revival of interest in early music, from groups singing folk music, such as The Revels, to performers of early medieval and classical music.
A special Christmas family meal is traditionally an important part of the holiday's celebration, and the food served varies greatly from country to country. Some regions, such as Sicily, have special meals for Christmas Eve, when 12 kinds of fish are served. In England and countries influenced by its traditions, a standard Christmas meal includes turkey or goose, gravy, potatoes, vegetables, and sometimes bread and cider. Special desserts are also prepared, such as Christmas pudding, mince pies and fruit cake.[73][74]
In Poland and other parts of eastern Europe and Scandinavia, fish is often used for the traditional main course, though richer meats such as lamb are increasingly served. In Germany, France and Austria, goose and pork are favored. Beef, ham and chicken in various recipes are popular throughout the world. The Maltese traditionally serve Imbuljuta tal-Qastan,[75] a chocolate and chestnut beverage, after Midnight Mass and throughout the Christmas season. Slovaks prepare the traditional Christmas bread potica, the French bake bûche de Noël, Italians serve panettone, and elaborate tarts and cakes are made in many countries. The eating of sweets and chocolates has become popular worldwide, and sweeter Christmas delicacies include the German stollen, marzipan cake or candy, and Jamaican rum fruit cake. As one of the few fruits traditionally available to northern countries in winter, oranges have long been associated with special Christmas foods.
Christmas cards are illustrated messages of greeting exchanged between friends and family members during the weeks preceding Christmas Day. The traditional greeting reads "wishing you a Merry Christmas and a Happy New Year", much like that of the first commercial Christmas card, produced by Sir Henry Cole in London in 1843.[76] The custom of sending them remains popular among a wide cross-section of people, even with the emergence of the modern trend towards exchanging E-cards.
Christmas cards are purchased in considerable quantities, and feature artwork, commercially designed and relevant to the season. The content of the design might relate directly to the Christmas narrative with depictions of the Nativity of Jesus, or Christian symbols such as the Star of Bethlehem, or a white dove which can represent both the Holy Spirit and Peace on Earth. Other Christmas cards are more secular and can depict Christmas traditions, mythical figures such as Santa Claus, objects directly associated with Christmas such as candles, holly and baubles, or a variety of images associated with the season, such as Christmastide activities, snow scenes and the wildlife of the northern winter. There are even humorous cards and genres depicting nostalgic scenes of the past such as crinolined shoppers in idealized 19th century streetscapes.
Some prefer cards with a poem, prayer or Biblical verse; while others distance themselves from religion with an all-inclusive "Season's greetings".
A number of nations have issued commemorative stamps at Christmastide. Postal customers will often use these stamps to mail Christmas cards, and they are popular with philatelists. These stamps are regular postage stamps, unlike Christmas seals, and are valid for postage year-round. They usually go on sale some time between early October and early December, and are printed in considerable quantities.
In 1898 a Canadian stamp was issued to mark the inauguration of the Imperial Penny Postage rate. The stamp features a map of the globe and bears an inscription "XMAS 1898" at the bottom. In 1937, Austria issued two "Christmas greeting stamps" featuring a rose and the signs of the zodiac. In 1939, Brazil issued four semi-postal stamps with designs featuring the three kings and a star of Bethlehem, an angel and child, the Southern Cross and a child, and a mother and child.
Both the US Postal Service and the Royal Mail regularly issue Christmas-themed stamps each year.
The exchanging of gifts is one of the core aspects of the modern Christmas celebration, making the Christmas season the most profitable time of year for retailers and businesses throughout the world. Gift giving was common in the Roman celebration of Saturnalia, an ancient festival which took place in late December and may have influenced Christmas customs.[77] At Christmas, Christians exchange gifts on the basis of the tradition associating St. Nicholas with Christmas,[78] and of the gifts of gold, frankincense and myrrh given to the infant Jesus by the Biblical Magi.[79][80]
A number of figures are associated with Christmas and the seasonal giving of gifts. Among these are Father Christmas, also known as Santa Claus (derived from the Dutch for Saint Nicholas), Père Noël, and the Weihnachtsmann; Saint Nicholas or Sinterklaas; the Christkind; Kris Kringle; Joulupukki; Babbo Natale; Saint Basil; and Father Frost.
The best known of these figures today is red-dressed Santa Claus, of diverse origins. The name Santa Claus can be traced back to the Dutch Sinterklaas, which means simply Saint Nicholas. Nicholas was Bishop of Myra, in modern day Turkey, during the 4th century. Among other saintly attributes, he was noted for the care of children, generosity, and the giving of gifts. His feast on December 6 came to be celebrated in many countries with the giving of gifts.[81]
Saint Nicholas traditionally appeared in bishop's attire, accompanied by helpers, inquiring about the behaviour of children during the past year before deciding whether they deserved a gift or not. By the 13th century, Saint Nicholas was well known in the Netherlands, and the practice of gift-giving in his name spread to other parts of central and southern Europe. At the Reformation in 16th–17th century Europe, many Protestants changed the gift bringer to the Christ Child or Christkindl, corrupted in English to Kris Kringle, and the date of giving gifts changed from December 6 to Christmas Eve.[81]
The modern popular image of Santa Claus, however, was created in the United States, and in particular in New York. The transformation was accomplished with the aid of notable contributors including Washington Irving and the German-American cartoonist Thomas Nast (1840–1902). Following the American Revolutionary War, some of the inhabitants of New York City sought out symbols of the city's non-English past. New York had originally been established as the Dutch colonial town of New Amsterdam and the Dutch Sinterklaas tradition was reinvented as Saint Nicholas.[82]
In 1809, the New-York Historical Society convened and retroactively named Sancte Claus the patron saint of Nieuw Amsterdam, the Dutch name for New York City.[83] At his first American appearance in 1810, Santa Claus was drawn in bishop's robes. However, as new artists took over, Santa Claus developed more secular attire.[84] Nast drew a new image of "Santa Claus" annually, beginning in 1863. By the 1880s, Nast's Santa had evolved into the robed, fur-clad form we now recognize, perhaps based on the English figure of Father Christmas. The image was standardized by advertisers in the 1920s.[85]
Father Christmas, a jolly, well-nourished, bearded man who typified the spirit of good cheer at Christmas, predates the Santa Claus character. He is first recorded in early 17th-century England, but was associated with holiday merrymaking and drunkenness rather than the bringing of gifts.[61] In Victorian Britain, his image was remade to match that of Santa. The French Père Noël evolved along similar lines, eventually adopting the Santa image. In Italy, Babbo Natale acts as Santa Claus, while La Befana is the bringer of gifts and arrives on the eve of the Epiphany. It is said that La Befana set out to bring the baby Jesus gifts, but got lost along the way; now, she brings gifts to all children. In some cultures Santa Claus is accompanied by Knecht Ruprecht, or Black Peter. In other versions, elves make the toys, and his wife is referred to as Mrs. Claus.
There has been some opposition to the narrative of the American evolution of Saint Nicholas into the modern Santa. It has been claimed that the Saint Nicholas Society was not founded until 1835, almost half a century after the end of the American War of Independence.[86] Moreover, a study of the "children's books, periodicals and journals" of New Amsterdam by Charles Jones revealed no references to Saint Nicholas or Sinterklaas.[87] However, not all scholars agree with Jones's findings, which he reiterated in a book-length study in 1978;[88] Howard G. Hageman, of New Brunswick Theological Seminary, maintains that the tradition of celebrating Sinterklaas in New York was alive and well from the early settlement of the Hudson Valley on.[89]
Current tradition in several Latin American countries (such as Venezuela and Colombia) holds that while Santa makes the toys, he then gives them to the Baby Jesus, who is the one who actually delivers them to the children's homes, a reconciliation between traditional religious beliefs and the iconography of Santa Claus imported from the United States.
In South Tyrol (Italy), Austria, Czech Republic, Southern Germany, Hungary, Liechtenstein, Slovakia and Switzerland, the Christkind (Ježíšek in Czech, Jézuska in Hungarian and Ježiško in Slovak) brings the presents. Greek children get their presents from Saint Basil on New Year's Eve, the eve of that saint's liturgical feast.[90] The German St. Nikolaus is not identical with the Weihnachtsmann (who is the German version of Santa Claus/Father Christmas). St. Nikolaus wears a bishop's dress and still brings small gifts (usually candies, nuts and fruits) on December 6 and is accompanied by Knecht Ruprecht. Although many parents around the world routinely teach their children about Santa Claus and other gift bringers, some have come to reject this practice, considering it deceptive.[91]
The earliest evidence of the celebration on December 25 of a Christian liturgical feast of the birth of Jesus is from the Chronography of 354 AD. This was in Rome, while in Eastern Christianity the birth of Jesus was already celebrated in connection with the Epiphany on January 6.[92][93] The December 25 celebration was imported into the East later: in Antioch by John Chrysostom towards the end of the 4th century,[93] probably in 388, and in Alexandria only in the following century.[94] Even in the West, the January 6 celebration of the nativity of Jesus seems to have continued until after 380.[95]
Many popular customs associated with Christmas developed independently of the commemoration of Jesus' birth, with certain elements having origins in pre-Christian festivals that were celebrated around the winter solstice by pagan populations who were later converted to Christianity. These elements, including the Yule log from Yule and gift giving from Saturnalia,[77] became syncretized into Christmas over the centuries. The prevailing atmosphere of Christmas has also continually evolved since the holiday's inception, ranging from a sometimes raucous, drunken, carnival-like state in the Middle Ages,[96] to a tamer family-oriented and children-centered theme introduced in a 19th-century reformation.[97][98] Additionally, the celebration of Christmas was banned on more than one occasion within Protestant Christendom due to concerns that it was too pagan or unbiblical.[99][100]
Dies Natalis Solis Invicti means "the birthday of the unconquered sun".
Some early Christian writers connected the sun to the birth of Jesus, which Christians believe was prophesied in Malachi 4:2 as the "Sun of Righteousness".[6] "O, how wonderfully acted Providence that on that day on which that Sun was born...Christ should be born", Cyprian wrote.[6] In the 4th century, John Chrysostom commented on the connection: "But Our Lord, too, is born in the month of December... the eighth before the calends of January [25 December]..., But they call it the 'Birthday of the Unconquered'. Who indeed is so unconquered as Our Lord...? Or, if they say that it is the birthday of the Sun, He is the Sun of Justice."[6]
The Dies Natalis Solis Invicti is mentioned in only a single ancient source, the Chronography of 354, and Sol scholar Steven Hijmans states that there is no evidence that its celebration preceded that of Christmas:[38] "[W]hile the winter solstice on or around December 25 was well established in the Roman imperial calendar, there is no evidence that a religious celebration of Sol on that day antedated the celebration of Christmas, and none that indicates that Aurelian had a hand in its institution."[38]
A winter festival was the most popular festival of the year in many cultures. Reasons included the fact that less agricultural work needed to be done during the winter, as well as the expectation of better weather as spring approached.[102] Modern Christmas customs include: gift-giving and merrymaking from the Roman Saturnalia; greenery, lights, and charity from the Roman New Year; and Yule logs and various foods from Germanic feasts.[103]
Pagan Scandinavia celebrated a winter festival called Yule, held in the late December to early January period. As Northern Europe was the last part of Europe to be Christianized, its pagan traditions had a major influence on Christmas, especially Koleda,[104] which was incorporated into the Christmas carol. Scandinavians still call Christmas Jul. In English, the word Yule is synonymous with Christmas,[105] a usage first recorded in 900.
The New Testament Gospel of Luke may indirectly give the date as December for the birth of Jesus, with the sixth month of Elizabeth's pregnancy with John the Baptist cited by John Chrysostom (c. 386) as a date for the Annunciation.[6][17][40][106] Tertullian (d. 220) did not mention Christmas as a major feast day in the Church of Roman Africa.[6] In Chronographai, a reference work published in 221, Sextus Julius Africanus suggested that Jesus was conceived on the spring equinox.[107][108] The equinox was March 25 on the Roman calendar, so this implied a birth in December.[109]
In 245, the theologian Origen of Alexandria stated that "only sinners (like Pharaoh and Herod)" celebrated their birthdays.[110] In 303, the Christian writer Arnobius ridiculed the idea of celebrating the birthdays of gods, a passage cited as evidence that Arnobius was unaware of any nativity celebration.[111] Since Christmas does not celebrate Christ's birth "as God" but "as man", however, this is not evidence against Christmas being a feast at this time.[6] The fact that the Donatists of North Africa celebrated Christmas may indicate that the feast was established by the time that church was created in 311.
The earliest known reference to the date of the nativity as December 25 is found in the Chronography of 354, an illuminated manuscript compiled in Rome.[112] In the East, early Christians celebrated the birth of Christ as part of Epiphany (January 6), although this festival emphasized celebration of the baptism of Jesus.[113]
Christmas was promoted in the Christian East as part of the revival of Catholicism following the death of the pro-Arian Emperor Valens at the Battle of Adrianople in 378. The feast was introduced to Constantinople in 379, and to Antioch in about 380. The feast disappeared after Gregory of Nazianzus resigned as bishop in 381, although it was reintroduced by John Chrysostom in about 400.[6]
In the Early Middle Ages, Christmas Day was overshadowed by Epiphany, which in western Christianity focused on the visit of the magi. But the medieval calendar was dominated by Christmas-related holidays. The forty days before Christmas became the "forty days of St. Martin" (which began on November 11, the feast of St. Martin of Tours), now known as Advent.[96] In Italy, former Saturnalian traditions were attached to Advent.[96] Around the 12th century, these traditions transferred again to the Twelve Days of Christmas (December 25 – January 5); a time that appears in the liturgical calendars as Christmastide or Twelve Holy Days.[96]
The prominence of Christmas Day increased gradually after Charlemagne was crowned Emperor on Christmas Day in 800. King Edmund the Martyr was anointed on Christmas in 855 and King William I of England was crowned on Christmas Day 1066.
By the High Middle Ages, the holiday had become so prominent that chroniclers routinely noted where various magnates celebrated Christmas. King Richard II of England hosted a Christmas feast in 1377 at which twenty-eight oxen and three hundred sheep were eaten.[96] The Yule boar was a common feature of medieval Christmas feasts. Caroling also became popular; a carol was originally performed by a group of dancers who sang, composed of a lead singer and a ring of dancers that provided the chorus. Various writers of the time condemned caroling as lewd, indicating that the unruly traditions of Saturnalia and Yule may have continued in this form.[96] "Misrule"—drunkenness, promiscuity, gambling—was also an important aspect of the festival. In England, gifts were exchanged on New Year's Day, and there was special Christmas ale.[96]
Christmas during the Middle Ages was a public festival that incorporated ivy, holly, and other evergreens.[114] Christmas gift-giving during the Middle Ages was usually between people with legal relationships, such as tenant and landlord.[114] The annual indulgence in eating, dancing, singing, sporting, and card playing escalated in England, and by the 17th century the Christmas season featured lavish dinners, elaborate masques and pageants. In 1607, King James I insisted that a play be acted on Christmas night and that the court indulge in games.[115] It was during the Reformation in 16th–17th century Europe that many Protestants changed the gift bringer to the Christ Child or Christkindl, and the date of giving gifts changed from December 6 to Christmas Eve.[81]
Following the Protestant Reformation, groups such as the Puritans strongly condemned the celebration of Christmas, considering it a Catholic invention and the "trappings of popery" or the "rags of the Beast."[99] The Catholic Church responded by promoting the festival in a more religiously oriented form. King Charles I of England directed his noblemen and gentry to return to their landed estates in midwinter to keep up their old style Christmas generosity.[115] Following the Parliamentarian victory over Charles I during the English Civil War, England's Puritan rulers banned Christmas in 1647.[99]
Protests followed, as pro-Christmas rioting broke out in several cities; for weeks Canterbury was controlled by the rioters, who decorated doorways with holly and shouted royalist slogans.[99] The book The Vindication of Christmas (London, 1652) argued against the Puritans and made note of Old English Christmas traditions: dinner, roast apples on the fire, card playing, dances with "plow-boys" and "maidservants", and carol singing.[116] The Restoration of King Charles II in 1660 ended the ban, but many clergymen still disapproved of Christmas celebration. In Scotland, the Presbyterian Church of Scotland also discouraged the observance of Christmas, and though James VI commanded its celebration in 1618, attendance at church was scant.[117] The Parliament of Scotland officially abolished the observance of Christmas in 1640, claiming that the church had been "purged of all superstitious observation of days".[118] It was not until 1958 that Christmas again became a Scottish public holiday.[119]
In Colonial America, the Puritans of New England shared radical Protestant disapproval of Christmas. Celebration was outlawed in Boston from 1659 to 1681. The ban was revoked in 1681 by the English governor Sir Edmund Andros; however, it was not until the mid-19th century that celebrating Christmas became fashionable in the Boston region.[100]
At the same time, Christian residents of Virginia and New York observed the holiday freely. Pennsylvania German settlers, pre-eminently the Moravian settlers of Bethlehem, Nazareth and Lititz in Pennsylvania and the Wachovia settlements in North Carolina, were enthusiastic celebrators of Christmas. The Moravians in Bethlehem had the first Christmas trees in America, as well as the first Nativity scenes.[120] Christmas fell out of favor in the United States after the American Revolution, when it was considered an English custom.[121] George Washington attacked Hessian (German) mercenaries at the Battle of Trenton on December 26, 1776; Christmas was much more popular in Germany than in America at this time.
In the early 19th century, writers imagined Tudor Christmas as a time of heartfelt celebration. In 1843, Charles Dickens wrote the novella A Christmas Carol, which helped revive the 'spirit' of Christmas and seasonal merriment.[97][98] Its instant popularity played a major role in portraying Christmas as a holiday emphasizing family, goodwill, and compassion.[122]
Dickens sought to construct Christmas as a family-centered festival of generosity, in contrast to the community-based and church-centered observances whose celebration had dwindled during the late 18th and early 19th centuries.[123] Superimposing his secular vision of the holiday, Dickens influenced many aspects of Christmas that are celebrated today in Western culture, such as family gatherings, seasonal food and drink, dancing, games, and a festive generosity of spirit.[124] A prominent phrase from the tale, 'Merry Christmas', was popularized following the appearance of the story.[125] This coincided with the appearance of the Oxford Movement and the growth of Anglo-Catholicism, which led to a revival of traditional rituals and religious observances.[126]
The term Scrooge became a synonym for miser, and 'Bah! Humbug!' a dismissal of the festive spirit.[127] In 1843, the first commercial Christmas card was produced by Sir Henry Cole.[128] The revival of the Christmas carol began with William B. Sandys' Christmas Carols Ancient and Modern (1833), which contained the first appearance in print of 'The First Noel', 'I Saw Three Ships', 'Hark the Herald Angels Sing' and 'God Rest Ye Merry, Gentlemen', popularized in Dickens' A Christmas Carol.
In Britain, the Christmas tree was introduced in the early 19th century, following the personal union with the Kingdom of Hanover, by Charlotte of Mecklenburg-Strelitz, queen consort of King George III. In 1832 the future Queen Victoria wrote about her delight at having a Christmas tree, hung with lights and ornaments, with presents placed round it.[129] After her marriage to her German cousin Prince Albert, the custom became still more widespread throughout Britain by 1841.[64]
An image of the British royal family with their Christmas tree at Windsor Castle created a sensation when it was published in the Illustrated London News in 1848. A modified version of this image was published in the United States in 1850.[65][130] By the 1870s, putting up a Christmas tree had become common in America.[65]
In America, interest in Christmas had been revived in the 1820s by several short stories by Washington Irving which appear in his The Sketch Book of Geoffrey Crayon and "Old Christmas". Irving's stories depicted the harmonious, warm-hearted English Christmas festivities he had experienced while staying at Aston Hall, Birmingham, England, traditions that had largely been abandoned,[131] and he used the tract The Vindication of Christmas (1652), with its account of Old English Christmas traditions, which he had transcribed into his journal, as a format for his stories.[115]
In 1822, Clement Clarke Moore wrote the poem A Visit From St. Nicholas (popularly known by its first line, Twas the Night Before Christmas).[132] The poem helped popularize the tradition of exchanging gifts, and seasonal Christmas shopping began to assume economic importance.[133] This also started the cultural conflict between the holiday's spiritual significance and its commercialism that some see as corrupting the holiday. In her 1850 book The First Christmas in New England, Harriet Beecher Stowe includes a character who complains that the true meaning of Christmas was lost in a shopping spree.[134]
While the celebration of Christmas was not yet customary in some regions in the U.S., Henry Wadsworth Longfellow detected "a transition state about Christmas here in New England" in 1856. "The old puritan feeling prevents it from being a cheerful, hearty holiday; though every year makes it more so".[135] In Reading, Pennsylvania, a newspaper remarked in 1861, "Even our presbyterian friends who have hitherto steadfastly ignored Christmas — threw open their church doors and assembled in force to celebrate the anniversary of the Savior's birth".[135]
The First Congregational Church of Rockford, Illinois, 'although of genuine Puritan stock', was 'preparing for a grand Christmas jubilee', a news correspondent reported in 1864.[135] By 1860, fourteen states including several from New England had adopted Christmas as a legal holiday.[136] In 1870, Christmas was formally declared a United States Federal holiday, signed into law by President Ulysses S. Grant.[136] Subsequently, in 1875, Louis Prang introduced the Christmas card to Americans. He has been called the "father of the American Christmas card".[137]
Throughout the holiday's history, Christmas has been the subject of controversy and attacks from various sources. The first documented Christmas controversy was Puritan-led and began during the English Interregnum, when England was ruled by a Puritan Parliament.[138] Puritans sought to remove the remaining pagan elements of Christmas. During this brief period, the Puritan-led English Parliament banned the celebration of Christmas entirely, considering it "a popish festival with no biblical justification" and a time of wasteful and immoral behavior.[139] In Colonial America, the Puritans outlawed celebration of Christmas in 1659.[140]
Christians and defenders of religious freedom have claimed that attacks on Christmas continue in the present day (dubbed a "war on Christmas").[141][142] One controversy is the renaming of Christmas trees as Holiday trees.[141] In the United States there has been a tendency to replace the greeting Merry Christmas with Happy Holidays.[143] Groups such as the American Civil Liberties Union have initiated court cases to bar the display of images and other material referring to Christmas from public property, including schools.[144] Such groups argue that government-funded displays of Christmas imagery and traditions violate the First Amendment to the United States Constitution, which prohibits the establishment by Congress of a national religion.[145] In 1984, the U.S. Supreme Court ruled in Lynch v. Donnelly that a Christmas display (which included a Nativity scene) owned and displayed by the city of Pawtucket, Rhode Island, did not violate the First Amendment.[146]
In November 2009, the Federal appeals court in Philadelphia endorsed a school district's ban on the singing of Christmas carols.[147] In the private sphere, too, it has been alleged that specific mention of the term "Christmas" or its religious aspects is increasingly censored, avoided, or discouraged by a number of advertisers and retailers. In response, the American Family Association and other groups have organized boycotts of individual retailers.[148]
In the United Kingdom there have been some minor controversies, one of the most famous being the temporary promotion of the Christmas period as Winterval by Birmingham City Council in 1998.[149] Critics attacked the use of the word Winterval as political correctness gone mad, accusing council officials of trying to take the Christ out of Christmas.[149] The council responded to the criticism by stating that Christmas-related words and symbols were prominent in its publicity material.[149] There were also protests in November 2009 when the city council of Dundee promoted its celebrations as the Winter Night Light festival, initially with no specific Christmas references.[150]
Christmas is typically the largest annual economic stimulus for many nations around the world. Sales increase dramatically in almost all retail areas, and shops introduce new products as people purchase gifts, decorations, and supplies. In the U.S., the "Christmas shopping season" starts as early as October.[151][152] In Canada, merchants begin advertising campaigns just before Halloween (October 31), and step up their marketing following Remembrance Day on November 11. In the UK and Ireland, the Christmas shopping season starts in mid-November, around the time when high street Christmas lights are turned on.[153][154]

In the United States, it has been calculated that a quarter of all personal spending takes place during the Christmas/holiday shopping season.[155] Figures from the U.S. Census Bureau reveal that expenditure in department stores nationwide rose from $20.8 billion in November 2004 to $31.9 billion in December 2004, an increase of about 53 percent. In other sectors, the pre-Christmas increase in spending was even greater, with a November–December buying surge of 100 percent in bookstores and 170 percent in jewelry stores. In the same year, employment in American retail stores rose from 1.6 million to 1.8 million in the two months leading up to Christmas.[156] Industries completely dependent on Christmas include Christmas cards, of which 1.9 billion are sent in the United States each year, and live Christmas trees, of which 20.8 million were cut in the U.S. in 2002.[157] In the UK in 2010, up to £8 billion was expected to be spent online at Christmas, approximately a quarter of total retail festive sales.[154]
In most Western nations, Christmas Day is the least active day of the year for business and commerce; almost all retail, commercial and institutional businesses are closed, and almost all industries cease activity (more than on any other day of the year), whether or not the law requires it. In England and Wales, the Christmas Day (Trading) Act 2004 prevents all large shops from trading on Christmas Day; Scotland is currently planning similar legislation. Film studios release many high-budget movies during the holiday season, including Christmas films, fantasy movies and high-tone dramas with high production values.
One economist's analysis calculates that, despite increased overall spending, Christmas is a deadweight loss under orthodox microeconomic theory because of the effect of gift-giving. The loss is calculated as the difference between what the gift giver spent on the item and what the gift receiver would have paid for it. It is estimated that in 2001, Christmas resulted in a $4 billion deadweight loss in the U.S. alone.[158][159] Because of complicating factors, this analysis is sometimes used to discuss possible flaws in current microeconomic theory. Other deadweight losses include the effects of Christmas on the environment and the fact that material gifts are often perceived as white elephants, imposing costs for upkeep and storage and contributing to clutter.[160]
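As a worked illustration of that calculation (the gifts, prices, and valuations below are invented for the example; they are not data from the cited study):

```python
# Hypothetical gifts: what the giver paid vs. the most the recipient
# would have paid for the same item. Invented numbers for illustration.
gifts = [
    ("sweater", 50.00, 35.00),
    ("book",    20.00, 22.00),
    ("gadget",  80.00, 55.00),
]

# Deadweight loss per gift: price paid minus the recipient's valuation.
# Gifts valued above their price contribute zero loss here; counting them
# as a gain instead is an alternative modeling choice.
loss = sum(max(paid - valued, 0.0) for _, paid, valued in gifts)
spent = sum(paid for _, paid, _ in gifts)

print(f"Spent ${spent:.2f}; deadweight loss ${loss:.2f} "
      f"({100 * loss / spent:.0f}% of spending)")
# -> Spent $150.00; deadweight loss $40.00 (27% of spending)
```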
The Right Honourable Stephen Harper PC MP |
|
---|---|
22nd Prime Minister of Canada | |
Incumbent | |
Assumed office February 6, 2006 |
|
Monarch | Elizabeth II |
Preceded by | Paul Martin |
Leader of the Opposition | |
In office March 20, 2004 – February 6, 2006 |
|
Monarch | Elizabeth II |
Prime Minister | Paul Martin |
Preceded by | Grant Hill (Acting) |
Succeeded by | Bill Graham (Acting) |
In office May 21, 2002 – January 8, 2004 |
|
Monarch | Elizabeth II |
Prime Minister | Jean Chrétien Paul Martin |
Preceded by | John Reynolds (Acting) |
Succeeded by | Grant Hill (Acting) |
Member of the House of Commons of Canada |
|
Incumbent | |
Assumed office June 28, 2002 |
|
Preceded by | Preston Manning |
Constituency | Calgary Southwest |
In office October 25, 1993 – June 2, 1997 |
|
Preceded by | James Hawkes |
Succeeded by | Rob Anders |
Constituency | Calgary West |
Personal details | |
Born | April 30, 1959 (age 53), Toronto, Ontario |
Political party | Conservative Party (2003–present) |
Other political affiliations |
Liberal Party (Before 1985) Progressive Conservative Party (1985–1986) Reform Party (1987–1997) Canadian Alliance (2002–2003) |
Spouse(s) | Laureen Teskey (m. 1993-present) |
Children | Benjamin, Rachel |
Residence | 24 Sussex Drive, Ottawa, Ontario (Official) Calgary, Alberta (Private) |
Alma mater | University of Calgary |
Profession | Economist[1] |
Religion | Christian and Missionary Alliance |
Language | English |
Signature | |
Website | Official website |
Stephen Joseph Harper, PC MP (born April 30, 1959) is the 22nd and current Prime Minister of Canada and leader of the Conservative Party. Harper became prime minister when his party formed a minority government after the 2006 federal election. He is the first prime minister from the newly reconstituted Conservative Party, following a merger of the Progressive Conservative and Canadian Alliance parties.
Harper has been the Member of Parliament (MP) for the riding of Calgary Southwest in Alberta since 2002. Earlier, from 1993 to 1997, he was the MP for Calgary West. He was one of the founding members of the Reform Party, but did not seek re-election in 1997, instead joining, and shortly thereafter leading, the National Citizens Coalition.[2] In 2002, he succeeded Stockwell Day as leader of the Canadian Alliance (the successor to the Reform Party) and returned to parliament as Leader of the Opposition. In 2003, he reached an agreement with Progressive Conservative leader Peter MacKay for the merger of their two parties to form the Conservative Party of Canada. He was elected as the party's first non-interim leader in March 2004.
Harper's Conservative Party won a stronger minority in the October 2008 federal election, with a small increase in its share of the popular vote and increased representation in the Canadian House of Commons, holding 143 of 308 seats. The 40th Canadian Parliament was dissolved in March 2011 after the opposition parties passed a no-confidence vote finding the Cabinet in contempt of Parliament.[3]
In the May 2011 federal election, Harper's Conservative Party won a majority government, the first since the 2000 federal election. The party won 166 seats, an increase of 23 from the October 2008 election.
Harper was born in Toronto, the first of three sons of Margaret (née Johnston) and Joseph Harris Harper, an accountant at Imperial Oil.[4] He attended Northlea Public School and, later, John G. Althouse Middle School and Richview Collegiate Institute, both in Central Etobicoke. He graduated in 1978, and was a member of Richview Collegiate's team on Reach for the Top, a television quiz show for Canadian high school students.[5] Harper then enrolled at the University of Toronto but dropped out after two months.[6] He then moved to Edmonton, Alberta, where he found work in the mail room at Imperial Oil.[6] Later, he advanced to work on the company's computer systems. He took up post-secondary studies again at the University of Calgary, where he completed a bachelor's degree in economics. He later returned there to earn a master's degree in economics, completed in 1993. Harper has kept strong links to the University of Calgary, where he has often lectured. He is the first prime minister since Joe Clark without a law degree.
Harper became involved in politics as a member of his high school's Young Liberals Club.[7] He later changed his political allegiance because he disagreed with the National Energy Program (NEP) of Pierre Trudeau's Liberal government.[8] He became chief aide to Progressive Conservative MP Jim Hawkes in 1985, but later became disillusioned with both the party and the government of Brian Mulroney, especially the administration's fiscal policy[7] and its inability to fully revoke the NEP until 1986. He left the PC Party that same year.[9]
He was then recommended by the University of Calgary's economist Bob Mansell to Preston Manning, the founder and leader of the Reform Party of Canada. Manning invited him to participate in the party, and Harper gave a speech at Reform's 1987 founding convention in Winnipeg. He became the Reform Party's Chief Policy Officer, and he played a major role in drafting the 1988 election platform. He is credited with creating Reform's campaign slogan, "The West wants in!"[10]
Harper ran for the Canadian House of Commons in the 1988 federal election, appearing on the ballot as Steve Harper in Calgary West. He lost by a wide margin to Hawkes, his former employer. The Reform Party did not win any seats in this election, although party candidate Deborah Grey was elected as the party's first MP in a by-election shortly thereafter. Harper became Grey's executive assistant, and was her chief adviser and speechwriter until 1993.[11] He remained prominent in the Reform Party's national organization in his role as policy chief, encouraging the party to expand beyond its Western base and arguing that strictly regional parties were at risk of being taken over by radical elements.[12] He delivered a speech at the Reform Party's 1991 national convention, in which he condemned extremist views.[13]
Harper's relationship with Manning became strained in 1992, due to conflicting strategies over the Charlottetown Accord. Harper opposed the Accord on ideological grounds, while Manning was initially more open to compromise. Harper also criticized Manning's decision to hire Rick Anderson as an adviser, believing that Anderson was not sufficiently committed to the Reform Party's principles.[14] He resigned as policy chief in October 1992.
Harper stood for office again in the 1993 federal election, and defeated Jim Hawkes amid a significant Reform breakthrough in Western Canada. His campaign likely benefited from a $50,000 print and television campaign organized by the National Citizens Coalition against Hawkes, although the NCC did not endorse Harper directly.[15]
Harper emerged as a prominent member of the Reform Party of Canada caucus. He was active on constitutional issues during his first term in Parliament, and played a prominent role in drafting the Reform Party's strategy for the 1995 Quebec referendum. A long-standing opponent of centralized federalism, he stood with Preston Manning in Montreal to introduce a twenty-point plan to "decentralize and modernize" Canada in the event of a "no" victory.[16] Harper later argued that the "no" side's narrow plurality was a worst-case scenario, in that no one had won a mandate for change.[17]
Harper has expressed socially conservative views on some issues.[18] In 1994, he opposed plans by federal Justice Minister Allan Rock to introduce spousal benefits for same-sex couples. Citing the recent failure of a similar initiative in Ontario, he was quoted as saying, "What I hope they learn is not to get into it. There are more important social and economic issues, not to mention the unity question."[19] Harper also spoke against the possibility of the Canadian Human Rights Commission or the Supreme Court changing federal policy in these and other matters.[20]
At the Reform Party's 1994 policy convention, Harper was part of a small minority of delegates who voted against restricting the definition of marriage to "the union of one man and one woman".[21] He actually opposed both same-sex marriage and mandated benefits for same-sex couples, but argued that political parties should refrain from taking official positions on these and other "issues of conscience".[22]
Harper was the only Reform MP to support the creation of the Canadian Firearms Registry at second reading in 1995, although he later voted against it at third reading stage. He said at the time that he initially voted for the registry because of a poll showing that most of his constituents supported it, and added that he changed his vote when a second poll showed the opposite result. Some accused him of manipulating the second poll to achieve the result he wanted.[23] It was reported in April 1995 that some Progressive Conservatives opposed to Jean Charest's leadership wanted to remove both Charest and Manning, and unite the Reform and Progressive Conservative parties under Harper's leadership.[24]
Despite his prominent position in the party, Harper's relationship with the Reform Party leadership was frequently strained. In early 1994, he criticized a party decision to establish a personal expense account for Preston Manning at a time when other Reform MPs had been asked to forego parliamentary perquisites.[25] He was formally rebuked by the Reform executive council despite winning support from some MPs. His relationship with Manning grew increasingly fractious in the mid-1990s, and he pointedly declined to express any opinion on Manning's leadership during a 1996 interview.[26] This friction was indicative of a fundamental divide between the two men: Harper was strongly committed to conservative principles and opposed Manning's inclinations toward populism, which Harper saw as leading to compromise on core ideological matters.[27][not in citation given]
These tensions culminated in late 1996 when Harper announced that he would not be a candidate in the next federal election. He resigned his parliamentary seat on January 14, 1997, the same day that he was appointed as a vice-president of the National Citizens Coalition (NCC), a conservative think-tank and advocacy group.[28] He was promoted to NCC president later in the year.
In April 1997, Harper suggested that the Reform Party was drifting toward social conservatism and ignoring the principles of economic conservatism.[29] The Liberal Party lost seats but managed to retain a narrow majority government in the 1997 federal election, while Reform made only modest gains.
Soon after leaving Parliament, Harper and Tom Flanagan co-authored an opinion piece entitled "Our Benign Dictatorship", which argued that the Liberal Party only retained power through a dysfunctional political system and a divided opposition. Harper and Flanagan argued that national conservative governments between 1917 and 1993 were founded on temporary alliances between Western populists and Quebec nationalists, and were unable to govern because of their fundamental contradictions. The authors called for an alliance of Canada's conservative parties, and suggested that meaningful political change might require electoral reforms such as proportional representation. "Our Benign Dictatorship" also commended Conrad Black's purchase of the Southam newspaper chain, arguing that his stewardship would provide for a "pluralistic" editorial view to counter the "monolithically liberal and feminist" approach of the previous management.[30]
Harper remained active in constitutional issues. He was a prominent opponent of the Calgary Declaration on national unity in late 1997, describing it as an "appeasement strategy" against Quebec nationalism. He called for federalist politicians to reject this strategy, and approach future constitutional talks from the position that "Quebec separatists are the problem and they need to be fixed".[31] In late 1999, Harper called for the federal government to establish clear rules for any future Quebec referendum on sovereignty.[32] Some have identified Harper's views as an influence on the Chrétien government's Clarity Act.[33]
As National Citizens Coalition (NCC) leader, Harper launched an ultimately unsuccessful legal battle against federal election laws restricting third-party advertising.[34] He led the NCC in several campaigns against the Canadian Wheat Board,[35] and supported Finance Minister Paul Martin's 2000 tax cuts as a positive first step toward tax reform.[36]
In 1997, Harper delivered a controversial speech on Canadian identity to the Council for National Policy, a conservative American think tank. He made comments such as "Canada is a Northern European welfare state in the worst sense of the term, and very proud of it", "if you're like all Americans, you know almost nothing except for your own country. Which makes you probably knowledgeable about one more country than most Canadians", and "the NDP [New Democratic Party] is kind of proof that the Devil lives and interferes in the affairs of men."[37] These statements were made public and criticized during the 2006 election. Harper argued that the speech was intended as humour, and not as serious analysis.[38]
Harper considered campaigning for the Progressive Conservative Party leadership in 1998, after Jean Charest left federal politics. Among those encouraging his candidacy were senior aides to Ontario Premier Mike Harris, including Tony Clement and Tom Long.[39] He eventually decided against running, arguing that it would "burn bridges to those Reformers with whom I worked for many years" and prevent an alliance of right-wing parties from taking shape.[40] Harper was skeptical about the Reform Party's United Alternative initiative in 1999, arguing that it would serve to consolidate Manning's hold on the party leadership.[41] He also expressed concern that the UA would dilute Reform's ideological focus.[42]
When the United Alternative created the Canadian Alliance in 2000 as a successor party to Reform, Harper predicted that Stockwell Day would defeat Preston Manning for the new party's leadership. He expressed reservations about Day's abilities, however, and accused Day of "[making] adherence to his social views a litmus test to determine whether you're in the party or not".[43] Harper endorsed Tom Long for the leadership, arguing that Long was best suited to take support from the Progressive Conservative Party.[44] When Day placed first on the first ballot, Harper said that the Canadian Alliance was shifting "more towards being a party of the religious right".[45]
After the death of Pierre Trudeau in 2000, Harper wrote an editorial criticizing Trudeau's policies as they affected Western Canada. He wrote that Trudeau "embraced the fashionable causes of his time, with variable enthusiasm and differing results", but "took a pass" on the issues that "truly defined his century".[46] Harper subsequently accused Trudeau of promoting "unabashed socialism", and argued that Canadian governments between 1972 and 2002 had restricted economic growth through "state corporatism".[47]
After the Canadian Alliance's poor showing in the 2000 election, Harper joined with other Western conservatives in co-authoring a document called the "Alberta Agenda". The letter called on Alberta to reform publicly funded health care, replace the Canada Pension Plan with a provincial plan, and replace the Royal Canadian Mounted Police with a provincial police force. It became known as the "firewall letter", because it called on the provincial government to "build firewalls around Alberta" in order to stop the federal government from redistributing its wealth to less affluent regions.[48] Alberta Premier Ralph Klein agreed with some of the letter's recommendations, but distanced himself from the "firewall" comments.[49]
Harper also wrote an editorial in late 2000 arguing that Alberta and the rest of Canada were "embark[ing] on divergent and potentially hostile paths to defining their country". He said that Alberta had chosen the "best of Canada's heritage—a combination of American enterprise and individualism with the British traditions of order and co-operation" while Canada "appears content to become a second-tier socialistic country [...] led by a second-world strongman appropriately suited for the task". He also called for a "stronger and much more autonomous Alberta", while rejecting calls for separatism.[50] In the 2001 Alberta provincial election, Harper led the NCC in a "Vote Anything but Liberal" campaign.[51] Some articles from this period described him as a possible successor to Klein.[52]
Harper and the NCC endorsed a private school tax credit proposed by Ontario's Progressive Conservative government in 2001, arguing that it would "save about $7,000 for each student who does not attend a union-run public school". Education Minister Janet Ecker criticized this, saying that her government's intent was not to save money at the expense of public education.[53]
Day's leadership of the Canadian Alliance became increasingly troubled throughout the summer of 2001, as several party MPs called for his resignation. In June, the National Post newspaper reported that former Reform MP Ian McClelland was organizing a possible leadership challenge on Harper's behalf.[54] Harper announced his resignation from the NCC presidency in August 2001, to prepare a campaign.[55]
Stockwell Day called a new Canadian Alliance leadership race for 2002, and soon declared himself a candidate. Harper emerged as Day's main rival, and declared his own candidacy on December 3, 2001. He eventually won the support of at least 28 Alliance MPs,[56] including Scott Reid, James Rajotte[57] and Keith Martin.[58] During the campaign, Harper reprised his earlier warnings against an alliance with Quebec nationalists, and called for his party to become the federalist option in Quebec.[59] He argued that "the French language is not imperilled in Quebec", and accordingly opposed "special status" for the province in the Canadian Constitution.[60] He also endorsed greater provincial autonomy on Medicare, and said that he would not co-operate with the Progressive Conservatives as long as they were led by Joe Clark.[61] On social issues, Harper argued for "parental rights" to use corporal punishment on children, and supported raising the age of sexual consent.[62] He described his potential support base as "similar to what George Bush tapped".[63]
The tone of the leadership contest turned hostile in February 2002. Harper described Day's governance of the party as "amateurish",[64] while his campaign team argued that Day was attempting to win re-election by building a narrow support base among different groups in the religious right.[65] The Day campaign accused Harper of "attacking ethnic and religious minorities".[66] In early March, the two candidates had an especially fractious debate on CBC Newsworld.[67] The leadership vote was held on March 20, 2002. Harper was elected on the first ballot with 55% support, against 37% for Day. Two other candidates split the remainder.
After winning the party leadership, Harper announced his intention to run for Parliament in a by-election in Calgary Southwest, recently vacated by Preston Manning. Ezra Levant had already been chosen as the riding's Alliance candidate and initially declared that he would not stand aside for Harper; he subsequently reconsidered.[68] The Liberals did not field a candidate, following a parliamentary tradition of allowing opposition leaders to enter the House of Commons unopposed. The Progressive Conservative candidate, Jim Prentice, also chose to withdraw.[69] Harper was elected without difficulty over New Democrat Bill Phipps, a former United Church of Canada moderator. Harper told a reporter during the campaign that he "despise[d]" Phipps, and declined to debate him.[70]
Harper officially became Leader of the Opposition in May 2002. Later in the same month, he said that the Atlantic Provinces were trapped in "a culture of defeat" which had to be overcome, the result of policies designed by Liberal and Progressive Conservative governments. Many Atlantic politicians condemned the remark as patronizing and insensitive. The Legislature of Nova Scotia unanimously approved a motion condemning Harper's comments,[71] which were also criticized by New Brunswick Premier Bernard Lord, federal Progressive Conservative leader Joe Clark and others. Harper refused to apologize, and said that much of Canada was trapped by the same "can't-do" attitude.[72]
His first 18 months as opposition leader were largely devoted to consolidating the fractured elements of the Canadian Alliance and encouraging a union of the Canadian Alliance and the federal Progressive Conservatives[citation needed]. The aim of this union was to present a single right-of-centre national party in the next federal election. In undertaking the merger talks, PC leader Peter MacKay reversed his previous agreement with leadership opponent David Orchard not to merge with the Alliance. After reaching an agreement with MacKay in October 2003, the Canadian Alliance and the Progressive Conservative Party of Canada officially merged in December, with the new party being named the "Conservative Party of Canada".[73]
In March 2003 Harper and Stockwell Day co-wrote a letter to The Wall Street Journal in which they condemned the Canadian government's unwillingness to participate in the 2003 invasion of Iraq.[74]
On January 12, 2004, Harper announced his resignation as Leader of the Opposition, in order to run for the leadership of the Conservative Party of Canada. Harper was elected the first leader of the Conservative Party, with a first ballot majority against Belinda Stronach and Tony Clement on March 20, 2004. Harper's victory included strong showings in Ontario, Quebec, and Atlantic Canada.
Harper led the Conservatives into the 2004 federal election. Initially, new Prime Minister Paul Martin held a large lead in the polls, but this eroded due to infighting, Adscam, and other scandals surrounding his government. The Liberals attempted to counter the slide with an early election call, which would give the Conservatives less time to consolidate their merger.[citation needed]
Martin's weak performance in the leaders' debate, along with an unpopular provincial budget by Liberal Premier Dalton McGuinty in Ontario, moved the Conservatives into the lead for a time. However, comments by Conservative MPs and leaked press releases slandering the then prime minister, as well as controversial TV attack ads suggesting that the Conservatives would make Canada more like the United States, caused Harper's party to lose some momentum.[citation needed]
Harper made an effort to appeal to voters in Quebec, a province where the Reform/Alliance side of the merged party had not done well. He was featured in several of the Tories' French-language campaign ads.[citation needed]
The Liberals were re-elected to power with a minority government, with the Conservatives coming in second place. The Conservatives managed to make inroads into the Liberals' Ontario stronghold, primarily in the province's socially conservative central region. However, they were shut out of Quebec, marking the first time that a centre-right party did not win any seats in that province. Harper, after some personal deliberation, decided to stay on as the party leader. Many credited him with bringing the Progressive Conservative Party and Canadian Alliance together in a short time to fight a close election.[citation needed]
Two months after the federal election, Stephen Harper privately met with Bloc Québécois leader Gilles Duceppe and New Democratic Party leader Jack Layton in a Montreal hotel.[75] On September 9, 2004, the three signed a letter addressed to then-Governor General Adrienne Clarkson, stating,
We respectfully point out that the opposition parties, who together constitute a majority in the House, have been in close consultation. We believe that, should a request for dissolution arise this should give you cause, as constitutional practice has determined, to consult the opposition leaders and consider all of your options before exercising your constitutional authority.[76][77]
On the same day the letter was written, the three party leaders held a joint press conference at which they expressed their intent to co-operate on changing parliamentary rules, and to request that the Governor General consult with them before deciding to call an election.[78] At the news conference, Harper said "It is the Parliament that's supposed to run the country, not just the largest party and the single leader of that party. That's a criticism I've had and that we've had and that most Canadians have had for a long, long time now so this is an opportunity to start to change that." However, at the time, Harper and the two other opposition leaders denied trying to form a coalition government.[75] Harper said, "This is not a coalition, but this is a co-operative effort."[78]
One month later, on October 4, Mike Duffy, now a Conservative senator (appointed by Harper), said "It is possible that you could change prime minister without having an election," and that some Conservatives wanted Harper to temporarily become prime minister without holding an election. The next day Layton walked out on talks with Harper and Duceppe, accusing them of trying to replace Paul Martin with Harper as prime minister. Both Bloc and Conservative officials denied Layton's accusations.[75]
On March 26, 2011, responding to Harper's allegations that the Liberals might form a coalition with the Bloc and the NDP, Duceppe stated that Harper had himself tried to form a coalition government with the Bloc and NDP in 2004.[79]
The Conservative Party's first policy convention was held from March 17–19, 2005, in Montreal. Harper had been rumoured to be shifting his ideology closer to that of a Blue Tory, and many thought he wanted to move the party's policies closer to the centre. Any opposition to abortion or bilingualism was dropped from the Conservative platform. Harper received an 84% endorsement from delegates in the leadership review.
Despite this move to the centre, the party began a concerted drive against same-sex marriage. Harper was criticized by a group of law professors for arguing that the government could override provincial court rulings on same-sex marriage without using the "notwithstanding clause", a provision of Canada's Charter of Rights and Freedoms. The platform also argued, in general, for lower taxes, an elected Senate, a tougher stance on crime, and closer relations with the United States.[citation needed]
Following the April 2005 release of Jean Brault's damaging testimony at the Gomery Commission, implicating the Liberals in the scandal, opinion polls placed the Conservatives ahead of the Liberals. The Conservatives had earlier abstained from the vote on the 2005 budget to avoid forcing an election. With the collapse in Liberal support and a controversial NDP amendment to the budget, the party exerted significant pressure on Harper to bring down the government. In May, Harper announced that the government had lost the "moral authority to govern". Shortly thereafter, the Conservatives and Bloc Québécois united to defeat the government on a vote that some considered to be either a confidence motion or else a motion requiring an immediate test of the confidence of the House. The Martin government did not accept this interpretation and argued that the vote had been on a procedural motion, although they also indicated that they would bring forward their revised budget for a confidence vote the following week. Ultimately, the effort to bring down the government failed following the decision of Conservative MP Belinda Stronach to cross the floor to the Liberal Party. The vote on the NDP amendment to the budget tied, and with the Speaker of the House voting to continue debate, the Liberals stayed in power. At the time, some considered the matter to be a constitutional crisis.[80][81]
Harper was also criticized for supporting his caucus colleague MP Gurmant Grewal.[82] Grewal had produced tapes of conversations with Tim Murphy, Paul Martin's chief of staff, in which Grewal claimed he had been offered a cabinet position in exchange for his defection.
The Liberals' support dropped after the first report from the Gomery Commission was issued. On November 24, 2005, Harper introduced a motion of non-confidence in the Liberal government, telling the House of Commons "that this government has lost the confidence of the House of Commons and needs to be removed." As the Liberals had lost NDP support in the House by refusing to accept an NDP plan to prevent health care privatization, the no-confidence motion passed by a vote of 171–133. It was the first time that a Canadian government had been toppled by a straight motion of non-confidence proposed by the opposition. As a result, Parliament was dissolved and a general election was scheduled for January 23, 2006.
On February 27, 2008, allegations surfaced that two Conservative Party officials had offered terminally ill Independent MP Chuck Cadman a million-dollar life insurance policy in exchange for his vote to bring down the Liberal government in a May 2005 budget vote.[83] Had the story been proven true, the actions might have been grounds for criminal charges, since under the Criminal Code of Canada it is illegal to bribe an MP.[84]
When asked by Vancouver journalist Tom Zytaruk about the alleged life insurance offer, then-opposition leader Harper stated on an audio tape, "I don't know the details. I know there were discussions",[85] and went on to say, "The offer to Chuck was that it was only to replace financial considerations he might lose due to an election".[85] Harper also stated that he had told the Conservative party representatives that they were unlikely to succeed: "I told them they were wasting their time. I said Chuck had made up his mind."[85][86] In February 2008, the Royal Canadian Mounted Police (RCMP) investigated allegations that Section 119 of the Criminal Code, covering bribery and corruption, had been violated.[87][88] The RCMP concluded their investigation, stating that there was no evidence to support charges.[89]
Harper denied any wrongdoing and subsequently filed a civil libel suit against the Liberal Party of Canada. Since libel laws do not apply to statements made in the House of Commons, the basis of the lawsuit was that statements made by Liberal party members outside the House, and in articles that appeared on the Liberal party website, accused Harper of having committed a criminal act.[86][90]
The audio expert hired by Harper to prove that the tape containing the evidence had been doctored reported that the latter part of the tape had been recorded over, but that it was unaltered at the point where Harper's voice says "I don't know the details, I know that, um, there were discussions, um, but this is not for publication?" and goes on to say he "didn't know the details" when asked if he knew anything about the alleged offer to Cadman.[91]
The Conservatives began the campaign period with a policy-per-day strategy, in contrast to the Liberal plan of holding off major announcements until after the Christmas holidays, so Harper dominated media coverage for the first weeks of the election. Though his party showed only modest movement in the polls, Harper's personal numbers, which had always significantly trailed those of his party, began to rise. In response, the Liberals launched negative ads targeting Harper, similar to their attacks in the 2004 election. These tactics were not sufficient to erode the Conservatives' advantage, although they did manage to close what had been a ten-point lead in public opinion. As Harper's personal numbers rose, polls found he was now considered not only more trustworthy, but a better choice for prime minister than Martin.[92]
Immediately prior to the Christmas break, in a faxed letter to NDP candidate Judy Wasylycia-Leis, RCMP Commissioner Giuliano Zaccardelli announced that the RCMP had opened a criminal investigation into her complaint that Liberal Finance Minister Ralph Goodale's office appeared to have leaked advance word of an important announcement on the taxation of income trusts, leading to insider trading. On December 27, 2005, the RCMP confirmed that information in a press release. At the conclusion of the investigation, Serge Nadeau, a top Finance Department bureaucrat, was charged with criminal breach of trust. No charges were laid against then Finance Minister Ralph Goodale.[93]
The election gave Harper's Conservatives the largest number of seats in the House, although not enough for a majority government, and shortly after midnight on January 24, Martin conceded defeat. Later that day, Martin informed Governor General Michaëlle Jean that he would resign as Prime Minister, and at 6:45 p.m. Jean asked Harper to form a government. Harper was sworn in as Canada's 22nd Prime Minister on February 6, 2006. In his first address to Parliament as head of government, Harper opened by paying tribute to the Queen and her "lifelong dedication to duty and self-sacrifice," referring to her specifically as Canada's head of state.[94] He also said before the Canada-UK Chamber of Commerce that Canada and the United Kingdom were joined by "the golden circle of the Crown, which links us all together with the majestic past that takes us back to the Tudors, the Plantagenets, the Magna Carta, habeas corpus, petition of rights, and English common law."[95] Journalist Graham Fraser said in the Toronto Star that Harper's speech was "one of the most monarchist speeches a Canadian prime minister has given since John Diefenbaker."[96] An analysis by Michael D. Behiels suggests a political realignment may be underway based on the continuance of Harper's government.[97]
On October 14, 2008, after a five-week campaign, the Conservative Party won the federal election and increased its number of seats in Parliament to 143, up from 127 at the dissolution of the previous Parliament, although its popular vote dropped slightly, by 167,494 votes. Because of the lowest voter turnout in Canadian electoral history, this represented only 22% of eligible Canadian voters, the lowest level of support for any winning party in Canadian history.[98] Meanwhile, the number of opposition Liberal MPs fell from 95 to 77. It takes 155 MPs to form a majority government in Canada's 308-seat Parliament.
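The 22% figure follows from multiplying vote share by turnout. Using the commonly reported 2008 numbers (roughly 37.7% of the popular vote on roughly 59% turnout; neither figure appears in the text above):

\[
0.377 \times 0.59 \approx 0.22,
\]

that is, about 22% of all eligible voters.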
On December 4, 2008, Harper asked Governor General Michaëlle Jean to prorogue Parliament in order to avoid a vote of confidence scheduled for the following Monday, becoming the first Canadian prime minister ever to do so.[99][100] Jean granted the request, and the prorogation lasted until January 26, 2009. The opposition coalition dissolved shortly afterwards, and the Conservatives won a Liberal-supported confidence vote on January 29, 2009.
On December 30, 2009, Harper announced that he would ask the Governor General to prorogue Parliament again, effective immediately and lasting until March 3, 2010, a period spanning the 2010 Winter Olympics. Harper stated that this was necessary for Canada's economic plan, and Jean granted the request. In an interview with CBC News, Prince Edward Island Liberal member of Parliament Wayne Easter accused the Prime Minister of "shutting democracy down".[101][102] Tom Flanagan, Harper's University of Calgary mentor and former chief of staff, also questioned Harper's reasoning for prorogation, stating that "I think the government's talking points haven't been entirely credible" and that the government's explanation of proroguing was "skirting the real issue—which is the harm the opposition parties are trying to do to the Canadian Forces" regarding the Canadian Afghan detainee issue.[103] The second prorogation in a year received some international criticism as undemocratic.[104] Demonstrations took place on January 23 in 64 Canadian cities and towns, and in five cities in other countries.[105] A Facebook protest group attracted over 20,000 members.[106]
A poll released by Angus Reid on January 7 found that 53% of Canadians opposed the prorogation, while 19% supported it. Thirty-eight percent of Canadians believed that Harper used the prorogation to curtail the Afghan detainee inquiry, while 23% agreed with Harper's explanation that the prorogation was necessary for economic reasons.[107]
Harper filled five vacancies in the Senate of Canada with new Conservative senators on January 29, 2010. The appointments filled vacancies in Quebec, Newfoundland and Labrador, and New Brunswick, as well as two in Ontario. The new senators were Pierre-Hugues Boisvenu of Quebec, Bob Runciman and Vim Kochhar of Ontario, Elizabeth Marshall of Newfoundland and Labrador, and Rose-May Poirier of New Brunswick. This changed the party standings in the Senate, which had been dominated by Liberals, to 51 Conservatives, 49 Liberals, and five others.[108]
Harper's Cabinet was defeated in a no-confidence vote on March 25, 2011, after being found in contempt of Parliament, thus triggering a general election.[109] This was the first occurrence in Commonwealth history of a government in the Westminster parliamentary tradition losing the confidence of the House of Commons on the grounds of contempt of Parliament. The no-confidence motion carried by a vote of 156 in favour and 145 against.[110]
On May 2, 2011, after a five-week campaign, Harper led the Conservatives to their third consecutive election victory, the first time a centre-right party had accomplished this in half a century. The Conservatives increased their standing in Parliament to 166 seats, up from 143 at the dissolution of the previous Parliament. This resulted in the first centre-right majority government since the Progressive Conservatives won what would be their last majority in 1988. The Conservative Party also received a greater number of total votes than in 2008. Aside from ending five years of minority governments, this election was notable for a number of firsts: it brought the New Democratic Party to official opposition status, relegated the Liberals to third place, elected Canada's first Green Party Member of Parliament, and saw the decline of the Bloc Québécois from 47 seats to 4.
Unlike his recent predecessors, Harper did not name one of his colleagues to the largely honorific post of Deputy Prime Minister. Various observers had expected him to name MacKay, the former leader of the Progressive Conservative Party and his deputy party leader, or Lawrence Cannon, as a Quebec lieutenant, to the post. Harper did, however, name an order of succession to act on his behalf in certain circumstances, starting with Cannon, then Jim Prentice, then the balance of his cabinet in order of precedence.[citation needed]
Harper sidestepped the political landmine of Quebec sovereignty for most of his first year as prime minister, much as all post-Charlottetown Accord prime ministers had done, but his hand was forced when the opposition Bloc Québécois prepared to introduce a motion in the House calling for recognition of Quebec as a "nation". On November 22, 2006, Harper introduced his own motion to recognize that "the Québécois form a nation within a united Canada."[111] Five days later, Harper's motion passed by a margin of 266–16, with all federalist parties, as well as the Bloc Québécois, formally behind it.[112]
As of January 2010, the ruling Conservatives had raised the federal deficit back to $36 billion. Some pundits claimed that the Conservatives had raised Canada's deficit to the largest in the country's history.[113][114] At the same time, Canada had the lowest debt-to-GDP ratio among the G7 economies.[115] The Economist stated that Canada had come out of the recession stronger than any other rich G7 country.[116][117]
In 2004, Harper said "the Upper House remains a dumping ground for the favoured cronies of the prime minister."[118] During his term as prime minister from 2006 to 2008, Harper left Senate vacancies unfilled as senators retired, resulting in 16 vacancies by the October 2008 election.[119] The one exception to this policy was Michael Fortier: when Harper first took office, he directed the Governor General to appoint Fortier to both the Senate and the Cabinet, arguing that the government needed representation from the city of Montreal.[120] Although there is precedent for this action in Canadian history, the appointment drew criticism from opponents who claimed Harper was reneging on his push for an elected Senate. In 2008, Fortier gave up his Senate seat and sought election as a Member of Parliament (MP), but was defeated by a large margin by the incumbent Bloc Québécois MP.[121]
After the October 2008 election, Harper again named Senate reform as a priority.[119] By December 2008, he had recommended the appointment of 18 senators, and in 2009 he directed the Governor General to appoint an additional nine. Many of those appointed had close ties with the Conservative Party, including the party's campaign manager, Doug Finley. Critics accused Harper of hypocrisy (the Liberals coined the term "Harpocrisy"). Conservative Senator Bert Brown defended Harper's appointments and said "the only way [the Senate]'s ever been filled is by having people that are loyal to the prime minister who's appointing them."[118]
Ahead of the Canada 2011 Census, the government announced that the long-form questionnaire (which collects detailed demographic information) would no longer be mandatory. According to Minister of Industry Tony Clement, the change was made because of privacy-related complaints and after consulting with Statistics Canada.[122] However, Canada's privacy commissioner reported receiving only three complaints between 1995 and 2010, according to a report in the Toronto Sun.[123]
Munir Sheikh, Canada's Chief Statistician appointed on Harper's advice,[124] resigned on July 21, 2010, in protest of the government's change in policy.[125] Ivan Fellegi, the former Chief Statistician of Canada, criticized the government's decision, saying that those who are most vulnerable (such as the poor, new immigrants, and aboriginals) are least likely to respond to a voluntary form, which weakens information about their demographic.[126]
The move was opposed by a number of governmental and non-governmental organizations,[127] including the Federation of Canadian Municipalities; the City of Toronto;[128] the Canadian Jewish Congress; the Evangelical Fellowship of Canada;[129] the Canadian Conference of Catholic Bishops;[130] the Canadian Medical Association;[131] the Statistical Society of Canada; the American Statistical Association;[132] and the Registered Nurses' Association of Ontario. The provincial governments of Ontario, Quebec, New Brunswick, Prince Edward Island, and Manitoba also opposed the change.[134] The Fraser Institute supported it.[133]
During his term, Harper has dealt with many foreign policy issues relating to the United States, the War on Terror, the Arab-Israeli conflict, free trade, China, and Africa.
In 2009, Harper visited China. During the visit, Chinese Premier Wen Jiabao publicly scolded Harper for not visiting earlier, pointing out that "this is the first meeting between the Chinese premier and a Canadian prime minister in almost five years";[135] Harper responded that "it's almost been five years since we had yourself or President Hu in our country".[135] In 2008, former prime minister Jean Chrétien had criticized Harper for missing the opening ceremonies of the 2008 Summer Olympics in Beijing;[136] in response, Dmitri Soudas, a spokesperson for Harper, called the remarks hypocritical, pointing out that Chrétien had "attended one of six Olympic opening ceremonies during his 13 years as Prime Minister".[136]
On September 11, 2007, Harper visited Australia and addressed its Parliament.[137]
Michael Ignatieff criticized Harper for cutting foreign aid to Africa by $700 million, falling short of the UN Millennium Development Goals, and cutting eight African countries from the list of priority aid recipients.[138]
On March 11 and 12, 2006, Harper made a surprise trip to Afghanistan, where Canadian Forces personnel had been deployed as part of the NATO-led International Security Assistance Force since late 2001. He visited troops in theatre as a show of support for their efforts and as a demonstration of the government's commitment to reconstruction and stability in the region. Harper's choice of a first foreign visit was closely guarded from the press until his arrival in Afghanistan, citing security concerns, and was seen as marking a significant change in the relationship between the government and the military. Harper returned to Afghanistan on May 22, 2007, in a surprise two-day visit that included a stop at the forward operating base at Ma'Sum Ghar, 25 kilometres (16 mi) south of Kandahar, making him the first prime minister to visit the front lines of a combat operation.[139]
At the outset of the 2006 Israel-Lebanon conflict, Harper defended Israel's "right to defend itself" and described its military campaign in Lebanon as a "measured" response, arguing that Hezbollah's release of kidnapped IDF soldiers would be the key to ending the conflict.[140] Speaking of the situation in both Lebanon and Gaza on July 18, Harper said he wanted "not just a ceasefire, but a resolution" but such a thing would not happen until Hezbollah and Hamas recognize Israel's right to exist. Harper blamed Hezbollah for all the civilian deaths. He asserted that Hezbollah's objective is to destroy Israel through violence.[141]
The media noted that Harper did not allow reporters opportunities to ask him questions about his position. Some Canadians, including many Arab and Lebanese Canadians, criticized Harper's description of Israel's response.[142]
In December 2008, the Conference of Presidents of Major American Jewish Organizations recognized Harper's support for Israel with its inaugural International Leadership Award, pointing out Harper's decision to boycott the Durban II anti-racism conference, and his government's "support for Israel and [its] efforts at the U.N. against incitement and ... the delegitimization [of Israel]".[143]
In March 2009, Harper spoke at a Parliament Hill ceremony organized by Chabad-Lubavitch to honor the Jewish victims of the 2008 Mumbai attacks, which included an attack on the Nariman House. He expressed condolences over the murders of Rabbi Gavriel Holtzberg and his wife Rivka at Chabad's Mumbai center. Harper described the killings as "affronts to the values that unite all civilized people", and described the quick installation of a new rabbi at the Chabad center in Mumbai as a signal that the Jewish people will "never bow to violence and hatred".[144]
In 2010, Canada lost a bid for a seat on the UN Security Council. While initially blaming the loss on his rival Ignatieff, Harper later said that it was due to his pro-Israeli stance. Harper then said that he would take a pro-Israeli stance, no matter what the political cost to Canada.[145][146][147] Ignatieff criticized Harper's stance as a "mistake", saying Canada would be better able to defend Israel through the Security Council than from the sidelines and pointed out that it is the Security Council that will determine if sanctions are imposed on Iran.[146] Ignatieff also accused Harper of steering the discussion away from implementing the two-state solution, and instead rendering all discussion into a competition "about who is Israel's best friend".[138]
On June 7, 2007, the Conservative government announced it had finalized free trade negotiations with the European Free Trade Association (EFTA). Under this agreement, Canada increased its trade ties with Iceland, Norway, Switzerland and Liechtenstein. In 2006, the value of trade between these partners was $10.7 billion. Canada had originally begun negotiations with the EFTA on October 9, 1998, but talks broke down due to a disagreement over subsidies to shipyards in Atlantic Canada.[148]
Shortly after being congratulated by George W. Bush for his victory, Harper rebuked U.S. Ambassador David Wilkins for criticizing the Conservatives' plans to assert Canada's sovereignty over the Arctic Ocean waters with armed forces.[149] Harper's first meeting as Prime Minister with the U.S. President occurred at the end of March 2006.
The government received American news coverage during the Democratic Party's 2008 presidential primaries after the details of a conversation between Barack Obama's economic advisor Austan Goolsbee and Canadian diplomat Georges Rioux were revealed. Goolsbee had reportedly reassured the Canadians that Obama's comments about potentially renegotiating the North American Free Trade Agreement (NAFTA) were more political rhetoric than actual policy. The accuracy of these reports was disputed by both the Obama campaign and the Canadian government. The news came at a key moment ahead of the Ohio and Texas primaries, where many Democratic voters perceived the benefits of the NAFTA agreement as dubious; the resulting appearance that Obama was not being completely forthright was attacked by his opponent Hillary Clinton.[150] ABC News reported that Harper's chief of staff, Ian Brodie, was responsible for the details reaching the media.[151] Harper denied that Brodie was responsible for the leak, and launched an investigation to find the source. The Opposition, as well as Democratic strategist Bob Shrum,[152] criticized the government over the issue, saying it was trying to help the Republicans by helping Hillary Clinton win the Democratic nomination instead of Obama. They also alleged the leak would hurt relations with the United States if Obama were ever to become president.[153] Obama was elected president in November. In February, Obama made his first foreign visit as president to Ottawa, during which he affirmed support for free trade with Canada and complimented Canada on its involvement in Afghanistan.[154]
Harper has insisted on his right to choose who asks questions at press conferences,[155] which has caused the national media to lodge complaints.[156] In 2007, Harper was awarded the Canadian Association of Journalists (CAJ) "Code of Silence Award" for his "white-knuckled death grip on public information". "If journalists can't get basic information from the federal government, Canadians can't hold the government accountable. The Prime Minister's Office has repeatedly demonstrated contempt for the public's right to know," [CAJ President] Welch said. "Harper pledged to run a government that was open, transparent and accountable, but his track record to date has been abysmal."[157] Some have alleged that the Prime Minister's Office also "often informs the media about Harper's trips at such short notice that it's impossible for Ottawa journalists to attend the events".[158] Harper's director of communications has denied this, saying that "this prime minister has been more accessible, gives greater media scrums and provides deeper content than any prime minister has in the last 10 to 12 years". Some suggest that the Conservatives' then-recent electoral success could be credited to their control of the campaign message, a practice they continued in government.[159]
The CAJ again criticized Harper's control over the media in an open letter in June 2010. The CAJ wrote "Politicians should not get to decide what information is released. This information belongs to Canadians, the taxpayers who paid for its production. Its release should be based on public interest, not political expediency. This breeds contempt and suspicion of government. How can people know the maternal-health initiative has been well thought out or that the monitoring of aboriginal bands has been done properly if all Canadians hear is: 'Trust us'?"[160]
Harper chose two jurists to be appointed as justices of the Supreme Court of Canada by the governor general: Marshall Rothstein in 2006 and Thomas Cromwell in 2008.
In keeping with Harper's election promise to change the appointment process, Rothstein's appointment involved a review by a parliamentary committee following his nomination by the Prime Minister. Rothstein had already been short-listed, along with two other candidates, by a committee convened by Paul Martin's previous Liberal government, and he was Harper's choice. Harper then had Rothstein appear before an ad hoc, non-partisan committee of 12 Members of Parliament. The committee was not, however, empowered to block the appointment, as some members of Harper's Conservative Party had called for.[162]
On September 5, 2008, Harper nominated Justice Cromwell of the Nova Scotia Court of Appeal to fill the Supreme Court seat left vacant by the departure of Justice Michel Bastarache. By and large, Cromwell's nomination was well received, with many lauding the selection;[163][164] however, some dissent was noted. First, Harper bypassed Parliament's Supreme Court selection panel, which was supposed to produce a list of three candidates for him to choose from.[163] Second, Newfoundland Justice Minister Jerome Kennedy criticized the appointment, citing the Newfoundland government's belief that constitutional convention stipulated that a Newfoundlander should have been named to the Court in the rotation of Atlantic Canadian Supreme Court representation.[165]
Harper received the Woodrow Wilson Award for public service on October 6, 2006. The ceremony was held at the Telus Convention Centre in Calgary, the same venue where he had made his election victory speech.[166]
Time magazine named him as Canada's Newsmaker of the Year in 2006. Stephen Handelman wrote "that the prime minister who was once dismissed as a doctrinaire backroom tactician with no experience in government has emerged as a warrior in power".[167]
On June 27, 2008, Harper was awarded the Presidential Gold Medallion for Humanitarianism by B'nai B'rith International. He is the first Canadian to be awarded this medal.[168]
On July 11, 2011, Harper was honoured by Alberta's Blood Tribe. He was made an honorary chief of the tribe during a ceremony recognizing him for making an official apology, issued in 2008, on behalf of the Government of Canada for residential school abuse. The chief of the tribe explained that he believes the apology officially started the healing and rebuilding of relations between the federal government and native councils. Lester B. Pearson, John Diefenbaker, and Jean Chrétien are the only other Prime Ministers of Canada to have been awarded the same honorary title.[169]
Harper married Laureen Teskey in 1993. Laureen was formerly married to New Zealander Neil Fenton from 1985 to 1988.[170] They have two children: Benjamin and Rachel. He is the third Prime Minister, after Pierre Trudeau and John Turner, to send his children to Rockcliffe Park Public School, in Ottawa. He is a member of the evangelical Christian and Missionary Alliance and attends church at the East Gate Alliance Church in Ottawa.[171] According to party literature, he is learning Spanish.[172]
An avid follower of ice hockey, he has been a fan of the Toronto Maple Leafs since his childhood in the Leaside and Etobicoke communities in Toronto. He is working on a book on the history of hockey, which he hopes to publish in 2012,[173] and occasionally writes articles on the subject.[174] Harper appeared on The Sports Network (TSN) during the broadcast of the Canada–Russia final of the 2007 World Junior Ice Hockey Championships. He was interviewed and expressed his views on the state of hockey, and his preference for an overtime period in lieu of a shoot-out.[175] In February 2010, Harper interviewed former National Hockey League greats Wayne Gretzky and Gordie Howe for a Saskatoon Kinsmen Club charity event.[176]
Harper taped a cameo appearance in an episode of the television show Corner Gas which aired March 12, 2007.[177] He reportedly owns a large vinyl record collection and is a fan of The Beatles and AC/DC.[178] In October 2009, he joined Yo-Yo Ma on stage in a National Arts Centre gala and performed "With a Little Help from My Friends". He was also accompanied by Herringbone, an Ottawa band with whom he regularly practises.[179] He received a standing ovation after providing the piano accompaniment and lead vocals for the song.[180]
In October 2010, Harper taped a cameo appearance in an episode of the television show Murdoch Mysteries, which aired July 20, 2011, during the show's fourth season.[181][182]
Harper is 6 feet 2 inches (188 cm) tall.[183] He is the first Prime Minister to employ a personal stylist, Michelle Muntean, whose duties range from co-ordinating his clothing to preparing his hair and makeup for speeches and television appearances. While formerly on public payroll, she has been paid for by the Conservative Party since "some time [in] 2007".[184]
Canadian federal election, 2011: Calgary Southwest

Party | Candidate | Votes | % | ±pp | Expenditures
---|---|---|---|---|---
Conservative | Stephen Harper | 42,998 | 75.12 | +2.22 |
New Democratic | Holly Heffernan | 6,823 | 11.92 | +4.12 |
Liberal | Marlene Lamontagne | 4,121 | 7.20 | -2.11 |
Green | Kelly Christie | 2,991 | 5.23 | -3.72 |
Independent | Larry R. Heather | 303 | 0.53 | +0.05 |
Total valid votes | | 57,236 | 100.00 | |
Total rejected ballots | | 177 | 0.31 | |
Turnout | | 57,413 | 60.95 | |
Eligible voters | | 94,192 | | |
Canadian federal election, 2008: Calgary Southwest

Party | Candidate | Votes | % | Expenditures
---|---|---|---|---
Conservative | (x) Stephen Harper | 38,548 | 72.7 |
Liberal | Marlene Lamontagne | 4,919 | 9.29 |
Green | Kelly Christie | 4,732 | 8.95 |
New Democratic Party | Holly Heffernan | 4,102 | 7.74 |
Libertarian | Dennis Young | 265 | 0.50 |
Christian Heritage | Larry R. Heather | 256 | 0.48 |
Total valid votes | | 52,832 | 100.00 |
Total rejected ballots | | 164 | |
Turnout | | 52,996 | |
Canadian federal election, 2006: Calgary Southwest

Party | Candidate | Votes | % | Expenditures
---|---|---|---|---
Conservative | (x) Stephen Harper | 41,549 | 72.36 |
Liberal | Mike Swanson | 6,553 | 11.41 |
New Democratic Party | Holly Heffernan | 4,628 | 8.06 |
Green | Kim Warnke | 4,407 | 7.68 |
Christian Heritage | Larry R. Heather | 279 | 0.49 |
Total valid votes | | 57,416 | 100.00 |
Total rejected ballots | | 120 | |
Turnout | | 57,536 | |

Sources: Official Results, Elections Canada and Financial Returns, Elections Canada.
Canadian federal election, 2004: Calgary Southwest

Party | Candidate | Votes | % | Expenditures
---|---|---|---|---
Conservative | (x) Stephen Harper | 35,297 | 68.36 | $62,952.76
Liberal | Avalon Roberts | 9,501 | 18.40 | $43,846.23
Green | Darcy Kraus | 3,210 | 6.22 | $534.96
New Democratic Party | Daria Fox | 2,884 | 5.59 | $3,648.70
Marijuana | Mark de Pelham | 516 | 1.00 | $0.00
Christian Heritage | Larry R. Heather | 229 | 0.44 | $985.59
Total valid votes | | 51,637 | 100.00 |
Total rejected ballots | | 149 | |
Turnout | | 51,786 | 64.49 |
Electors on the lists | | 80,296 | |

Percentage change figures are factored for redistribution. Conservative Party percentages are contrasted with the combined Canadian Alliance and Progressive Conservative percentages from 2000. Sources: Official Results, Elections Canada and Financial Returns, Elections Canada.
Canadian federal by-election, May 13, 2002: Calgary Southwest

Party | Candidate | Votes | % | Expenditures
---|---|---|---|---
Alliance | Stephen Harper | 13,200 | 71.66 | $58,959.16
New Democratic | Bill Phipps | 3,813 | 20.70 | $34,789.77
Green | James S. Kohut | 660 | 3.58 | $2,750.80
Independent | Gordon Barrett | 428 | 2.32 | $3,329.34
Christian Heritage | Ron Gray | 320 | 1.74 | $27,772.78
Total valid votes | | 18,421 | 100.00 |
Total rejected ballots | | 98 | |
Turnout | | 18,519 | 23.05 |
Electors on the lists | | 80,360 | |
Canadian federal election, 1993: Calgary West

Party | Candidate | Votes | %
---|---|---|---
Reform | Stephen Harper | 30,209 | 52.25
Liberal | Karen Gainer | 15,314 | 26.49
Progressive Conservative | (x) James Hawkes | 9,090 | 15.72
New Democratic | Rudy Rogers | 1,194 | 2.06
National | Kathleen McNeil | 1,068 | 1.85
Natural Law | Frank Haika | 483 | 0.84
Green | Don Francis | 347 | 0.60
Christian Heritage | Larry R. Heather | 116 | 0.20
Total valid votes | | 57,821 | 100.00
Total rejected ballots | | 133 |
Turnout | | 57,954 | 66.29
Electors on the lists | | 87,421 |

Source: Thirty-fifth General Election, 1993: Official Voting Results, published by the Chief Electoral Officer of Canada. Financial figures taken from official contributions and expenses provided by Elections Canada.
Canadian federal election, 1988: Calgary West

Party | Candidate | Votes | %
---|---|---|---
Progressive Conservative | (x) James Hawkes | 32,025 | 58.52
Reform | Steve Harper | 9,074 | 16.58
Liberal | John Phillips | 6,880 | 12.57
New Democratic | Richard D. Vanderberg | 6,355 | 11.61
Libertarian | David Faren | 225 | 0.41
Confederation of Regions | Brent Morin | 170 | 0.31
Total valid votes | | 54,729 | 100.00
Total rejected ballots | | 117 |
Turnout | | 54,846 | 78.75
Electors on the lists | | 69,650 |
All electoral information is taken from Elections Canada. Italicized expenditures refer to submitted totals, and are presented when the final reviewed totals are not available.
First State |
---|---
Origin | Dordrecht, South Holland, Netherlands; Johannesburg, South Africa
Genres | Trance, progressive house
Occupations | DJ, producer, remixer
Years active | 2003–present
Labels | Magik Muzik, First State Music, Black Hole Recordings
Website | www.firststatemusic.com
Members | Sander van Dien, Shane Halcon
Past members | Ralph Barendse
First State is a project of Sander van Dien and Shane Halcon. First State has released numerous singles, remixes, albums and compilations, most of which have gained recognition within the worldwide trance scene. Ralphie B left the group in 2009, and Shane Halcon joined soon after.
In 2010, the record label First State Music was announced, together with First State Deep and First State House, as three new partner labels of Black Hole Recordings. First State Music is owned and managed by Sander van Dien and Shane Halcon.
British people |
---|---
Total population | British: 65,600,000; British diaspora: est. 140,000,000[1]
Regions with significant populations | United Kingdom: 62,436,000[2] (British citizens of any race or ethnicity)
Religion | Protestantism, Anglicanism, Roman Catholicism
Footnotes | 1. People who identify as being of full or partial British ancestry, born in that country. 2. British-born people who identify as being of British ancestry only.
British people (also referred to as the British, Britons, or informally as Brits or Britishers) are citizens or natives of the United Kingdom, of the Isle of Man, of any of the Channel Islands, or of any of the British overseas territories, and their descendants.[23][24][25] British nationality law governs modern British citizenship and nationality, which can be acquired, for instance, by birth in the UK or by descent from British nationals. When used in a historical context, the term British people refers to the ancient Britons, the indigenous inhabitants of Great Britain south of the Forth.[24]
Although early assertions of being British date from the Late Middle Ages, the creation of the Kingdom of Great Britain[26][27][28][29][30] in 1707 triggered a sense of British national identity.[31] The notion of Britishness was forged during the Napoleonic Wars between Britain and the First French Empire, and developed further during the Victorian era.[31][32] The complex history of the formation of the United Kingdom created a "particular sense of nationhood and belonging" in Great Britain;[31] Britishness became "superimposed on much older identities", those of English, Scots and Welsh cultures, whose distinctiveness still resists notions of a homogenised British identity.[33] Because of longstanding ethno-sectarian divisions, British identity in Northern Ireland is controversial, but it is held with strong conviction by unionists.[34]
Contemporary Britons are descended mainly from the varied ethnic stocks that settled in Great Britain before the eleventh century. Prehistoric, Celtic, Roman, Anglo-Saxon, and Norse influences were blended in Britain under the Normans, descendants of Scandinavian Vikings who had settled in northern France.[35] Conquest and union facilitated migration, cultural and linguistic exchange, and intermarriage between the peoples of England, Scotland and Wales during the Middle Ages, Early Modern period and beyond.[36][37] Since 1922, there has been immigration to the United Kingdom by people from what is now the Republic of Ireland, the Commonwealth, mainland Europe and elsewhere; they and their descendants are mostly British citizens, with some assuming a British, dual or hyphenated identity.[38]
The British are a diverse, multi-national[39][40] and multicultural society, with "strong regional accents, expressions and identities".[41][42] The social structure of Britain has changed radically since the nineteenth century, with the decline in religious observance, enlargement of the middle class, and increased ethnic diversity. The population of the United Kingdom stands at around 62.5 million,[2] with a British diaspora of around 140 million concentrated in Australia, Canada, New Zealand and the United States.[43]
Greek and Roman writers between the 1st century BC and the 1st century AD name the inhabitants of Great Britain and Ireland as the Priteni,[44] the origin of the Latin word Britanni. Parthenius, a 1st-century Ancient Greek grammarian, and the Etymologicum Genuinum, a 9th-century lexical encyclopaedia, describe Bretannus (the Latinised form of the Ancient Greek Βρεττανός) as the Celtic national forefather of the Britons.[45] It has been suggested that this name derives from a Gaulish description translated as "people of the forms", referring to the custom of tattooing or painting their bodies with blue woad.[46]
By 50 BC Greek geographers were using equivalents of Prettanikē as a collective name for the British Isles.[47][48] However, with the Roman conquest of Britain, the Latin term Britannia was used for the island of Great Britain and, later, for Roman-occupied Britain south of Caledonia.[49][50] Following the Roman departure from Britain, the island of Great Britain was left open to invasion by pagan, seafaring warriors such as the Saxons and Jutes, who gained control in areas around the south east.[51]
In this post-Roman period, as the Anglo-Saxons advanced, territory controlled by the Britons became confined to what would later be Wales, Cornwall and North West England.[52] However, the term Britannia persisted as the Latin name for the island. The Historia Brittonum claimed legendary origins as a prestigious genealogy for Brittonic kings, followed by the Historia Regum Britanniae which popularised this pseudo-history to support the claims of the Kings of England.[53]
During the Middle Ages, and particularly in the Tudor period, the term British was applied to the Welsh people. At this time, it was "the long held belief that the Welsh were descendants of the ancient Britons and that they spoke 'the British tongue'".[53] This notion was supported by texts such as the Historia Regum Britanniae, a pseudohistorical account of ancient British history, written in the mid-12th century by Geoffrey of Monmouth.[53] The Historia Regum Britanniae chronicled the lives of legendary kings of the Britons in a narrative spanning two thousand years, beginning with the Trojans founding the ancient British nation and continuing until the Anglo-Saxon invasion of Britain in the 7th century forced the Celtic Britons to the west coast, namely Wales and Cornwall.[53] This legendary Celtic history of Great Britain is known as the Matter of Britain. The Matter of Britain, a national myth, was retold or reinterpreted in works by Gerald of Wales, a Cambro-Norman chronicler who in the 12th and 13th centuries used the term British to refer to what were later known as the Welsh.[54]
Traditional accounts of the ancestral roots of the British have taught that they are descended from diverse populations: the Scots, Welsh, Cornish and Irish from the Celts,[55][56][57][58][59][60][61] and the English from the Anglo-Saxons, who invaded from northern Europe and drove the Celts to Great Britain's western and northern fringes;[52][62] each group is also thought to have a small portion of Viking heritage.[63] However, geneticist Stephen Oppenheimer suggests that DNA analysis attests that three quarters of Britons share a common ancestry with the hunter-gatherers who settled in Atlantic Europe during the Paleolithic era,[62][63][64] "after the melting of the ice caps but before the land broke away from the mainland and divided into islands".[63]
Despite the separation of the British Isles from continental Europe as a consequence of the last ice age, the genetic record indicates the British and Irish broadly share a closest common ancestry with the Basque people who live in the Basque Country by the Pyrenees.[62][63] Oppenheimer continues that the majority of the people of the British Isles share genetic commonalities with the Basques, ranging from highs of 90% in Wales to lows of 66% in East Anglia.
The difference between western Britain and the East of England is thought to have its origins in two divergent prehistoric routes of immigration — one up the Atlantic coast, the other from continental Europe.[63] Major immigrant settlement of the British Isles occurred during the Neolithic period,[63] interpreted by Bryan Sykes—professor of human genetics at the University of Oxford—as the arrival of the Celts from the Iberian Peninsula, and the origin of Britain's and Ireland's Celtic tribes.[65]
Oppenheimer's opinion is that "...by far the majority of male gene types in the British Isles derive from Iberia (modern Spain and Portugal), ranging from a low of 59% in Fakenham, Norfolk to highs of 96% in Llangefni, north Wales".[66] The National Museum Wales states that "it is possible that future genetic studies of ancient and modern human DNA may help to inform our understanding of the subject", but that "early studies have, so far, tended to produce implausible conclusions from very small numbers of people and using outdated assumptions about linguistics and archaeology."[67]
Between the 8th and 11th centuries, "three major cultural divisions" had emerged in Britain: the English, the Scottish and the Welsh.[68] The English had been unified under a single nation state in 937 by King Athelstan of Wessex after the Battle of Brunanburh.[69] Before then, the English (known then in Old English as the Anglecynn) were under the governance of independent Anglo-Saxon petty kingdoms, which gradually coalesced into a Heptarchy of seven powerful states, the most powerful of which were Mercia and Wessex. Scottish historian and archaeologist Neil Oliver said that the Battle of Brunanburh would "define the shape of Britain into the modern era"; it was a "showdown for two very different ethnic identities – a Norse Celtic alliance versus Anglo Saxon. It aimed to settle once and for all whether Britain would be controlled by a single imperial power or remain several separate independent kingdoms, a split in perceptions which is still very much with us today".[70] However, historian Simon Schama suggested that it was King Edward I of England who was solely "responsible for provoking the peoples of Britain into an awareness of their nationhood" in the 13th century.[71] Scottish national identity, "a complex amalgam" of Gael, Pict, Norsemen and Anglo-Norman, was not finally forged until the Wars of Scottish Independence against the Kingdom of England in the late 13th and early 14th centuries.[72][73]
Though Wales was conquered by England, and its legal system annexed into that of the Kingdom of England by the Laws in Wales Acts 1535–1542, the Welsh people endured as a nation distinct from that of the English people.[74] Later, with both an English Reformation and a Scottish Reformation, Edward VI of England under the council of Edward Seymour, 1st Duke of Somerset, advocated the Kingdom of Scotland joining England and Wales in a united Protestant Britain.[75] The Duke of Somerset supported the unification of the English, Welsh and Scottish people under the "indifferent old name of Britons" on the basis that their monarchies "both derived from a Pre-Roman British monarchy".[75]
Following the death of Queen Elizabeth I of England in 1603, the throne of England was inherited by James VI of Scotland, resulting in the Kingdom of England and the Kingdom of Scotland being united in a personal union under King James I of England and VI of Scotland, an event referred to as the Union of the Crowns.[76] King James advocated full political union between England and Scotland,[77] and on 20 October 1604 proclaimed his assumption of the style "King of Great Britain", though this title was rejected by both the Parliament of England and the Parliament of Scotland[78][79] and so had no basis in either English law or Scots law.
Despite centuries of military and religious conflict, the Kingdoms of England and Scotland had been "drawing increasingly together" since the Protestant Reformation of the 16th century and the Union of the Crowns in 1603.[81] A broadly shared language, island, monarch, religion and Bible (the Authorized King James Version) further contributed to a growing cultural alliance between the two sovereign realms and their peoples.[81][82] The Glorious Revolution of 1688 resulted in a pair of Acts of Parliament by the English and Scottish legislatures—the Bill of Rights 1689 and Claim of Right Act 1689 respectively—which ensured that the shared constitutional monarchy of England and Scotland was held only by Protestants. Despite this, and although popular with the monarchy and much of the aristocracy, attempts to unite the two states by Acts of Parliament in 1606, 1667, and 1689 were unsuccessful;[82] increased political management of Scottish affairs from England had led to "criticism" and strained Anglo-Scottish relations.[83][84]
While English maritime explorations during the Age of Discovery provided new-found imperial power and wealth for the English and Welsh at the end of the 17th century, Scotland suffered from a long-standing weak economy.[83] In response, the Scottish kingdom, in opposition to King William II of Scotland and III of England, commenced the Darien Scheme, an attempt to establish a Scottish imperial outlet—the colony of New Caledonia—on the isthmus of Panama.[83] However, through a combination of Scottish mismanagement and English sabotage,[83][85] this imperial venture ended in "catastrophic failure" with an estimated "25% of Scotland's total liquid capital" lost.[83]
The events of the Darien Scheme, coupled with the English Parliament's passing of the Act of Settlement 1701 (which asserted the right to choose the order of succession for the English, Scottish and Irish thrones), escalated political hostilities between England and Scotland and neutralised calls for a united British people. The Parliament of Scotland responded by passing the Act of Security 1704, allowing it to appoint a different monarch to succeed to the Scottish crown from that of England, if it so wished.[83] The English political perspective was that the appointment of a Jacobite monarchy in Scotland opened up the possibility of a Franco-Scottish military conquest of England during the Second Hundred Years' War and the War of the Spanish Succession.[83] The Alien Act 1705 was passed by the Parliament of England, providing that Scottish nationals in England were to be treated as aliens and that estates held by Scots would be treated as alien property,[86] whilst also restricting the import of Scottish products into England and its colonies (about half of Scotland's trade).[87] However, the act contained a provision that it would be suspended if the Parliament of Scotland entered into negotiations regarding the creation of a unified Parliament of Great Britain, which in turn would refund Scottish financial losses on the Darien Scheme.[85]
Despite opposition from much of the Scottish[83] and English populations,[88] a Treaty of Union was agreed in 1706 and then ratified by each parliament passing the Acts of Union 1707. With effect from 1 May 1707, this created a new sovereign state called Great Britain.[89][90][91] This kingdom "began as a hostile merger", but led to a "full partnership in the most powerful going concern in the world"; historian Simon Schama stated that "it was one of the most astonishing transformations in European history."[92]
After 1707, a British national identity began to develop. Though initially resisted, particularly by the English,[88] the peoples of Great Britain had by the 1750s begun to assume a "layered identity": to think of themselves as simultaneously British and also Scottish, English, or Welsh.[88]
The terms North Briton and South Briton were devised for the Scottish and the English respectively, with the former gaining some preference in Scotland, particularly among the economists and philosophers of the Scottish Enlightenment.[93][94] Indeed, it was the "Scots [who] played key roles in shaping the contours of British identity";[95] "their scepticism about the Union allowed the Scots the space and time in which to dominate the construction of Britishness in its early crucial years",[96] drawing upon the notion of a shared "spirit of liberty common to both Saxon and Celt ... against the usurpation of the Church of Rome".[97] James Thomson, a poet and playwright born to a Church of Scotland minister in the Scottish Lowlands in 1700, was interested in forging a common British culture and national identity in this way.[97] In collaboration with Thomas Arne, he wrote Alfred, an opera about Alfred the Great's victory against the Vikings, performed for Frederick, Prince of Wales in 1740 to commemorate the accession of King George I of Great Britain and the birthday of Princess Augusta.[98] "Rule, Britannia!" was the climactic piece of the opera and quickly became a "jingoistic" British patriotic song celebrating "Britain's supremacy offshore".[99] In an island country, a series of victories for the Royal Navy associated empire and naval warfare "inextricably with ideals of Britishness and Britain's place in the world".[100][101]
Britannia, the new national personification of Great Britain, was established in the 1750s as a representation of "nation and empire rather than any single national hero".[102] On Britannia and British identity, historian Peter Borsay wrote:
Up until 1797 Britannia was conventionally depicted holding a spear, but as a consequence of the increasingly prominent role of the Royal Navy in the war against the French, and of several spectacular victories, the spear was replaced by a trident... The navy had come to be seen...as the very bulwark of British liberty and the essence of what it was to be British.[103]
From the Union of 1707 through to the Battle of Waterloo in 1815, Great Britain was "involved in successive, very dangerous wars with Catholic France",[104] but which "all brought enough military and naval victories ... to flatter British pride".[105] As the Napoleonic Wars with the First French Empire advanced, "the English and Scottish learned to define themselves as similar primarily by virtue of not being French or Catholic".[106] In combination with sea power and empire, the notion of Britishness became more "closely bound up with Protestantism",[107] a cultural commonality through which the English, Scots and Welsh became "fused together, and remain[ed] so, despite their many cultural divergences".[108]
The proliferation of neo-classical monuments at the end of the 18th century and start of the 19th, such as The Kymin at Monmouth, were attempts to meld the concepts of Britishness with the Greco-Roman empires of classical antiquity.[103] The new and expanding British Empire provided "unprecedented opportunities for upward mobility and the accumulations of wealth", and so the "Scottish, Welsh and Irish populations were prepared to suppress nationalist issues on pragmatic grounds".[109] The British Empire was "crucial to the idea of a British identity and to the self-image of Britishness".[110] Indeed, the Scottish welcomed Britishness during the 19th century "for it offered a context within which they could hold on to their own identity whilst participating in, and benefiting from, the expansion of the [British] Empire".[111] Similarly, the "new emphasis of Britishness was broadly welcomed by the Welsh who considered themselves to be the lineal descendants of the ancient Britons – a word that was still used to refer exclusively to the Welsh".[111] For the English, however, by the Victorian era their enthusiastic adoption of Britishness meant that for them it "meant the same as 'Englishness'",[112][113] so much so that "Englishness and Britishness" and "'England' and 'Britain' were used interchangeably in a variety of contexts".[114] Britishness came to borrow heavily upon English political history because England had "always been the dominant component of the British Isles in terms of size, population and power"; Magna Carta, common law and hostility to continental Europe were English factors that influenced British sensibilities.[115][116]
The political union of the predominantly Catholic Kingdom of Ireland with Great Britain in 1800, coupled with the outbreak of peace with France in the early 19th century, challenged the previous century's concept of militant Protestant Britishness.[117][118] The new, expanded United Kingdom of Great Britain and Ireland meant that the state had to re-evaluate its position on the civil rights of Catholics, and extend its definition of Britishness to the Irish people.[118][119] Like the terms that had been invented around the time of the Acts of Union 1707, West Briton was introduced for the Irish after 1800. In 1832 Daniel O'Connell, an Irish politician who campaigned for Catholic Emancipation, stated in Britain's House of Commons:
The people of Ireland are ready to become a portion of the British Empire, provided they be made so in reality and not in name alone; they are ready to become a kind of West Briton if made so in benefits and justice; but if not, we are Irishmen again.[120]
Ireland, from 1801 to 1923, was marked by a succession of economic and political mismanagement and neglect, which marginalised the Irish,[119] and advanced Irish nationalism. In the forty years that followed the union, successive British governments grappled with the problems of governing a country which had, as Benjamin Disraeli put it in 1844, "a starving population, an absentee aristocracy, and an alien Church, and in addition the weakest executive in the world".[121] Although the vast majority of Unionists in Ireland proclaimed themselves "simultaneously Irish and British", even for them there was a strain upon the adoption of Britishness after the Great Famine.[122]
War continued to be a unifying factor for the people of Great Britain; British jingoism re-emerged during the Boer Wars in southern Africa.[123][124] The experience of military, political and economic power from the rise of the British Empire led to a very specific drive in artistic technique, taste and sensibility for Britishness.[125] In 1887, Frederic Harrison wrote:
Morally, we Britons plant the British flag on every peak and pass; and wherever the Union Jack floats there we place the cardinal British institutions—tea, tubs, sanitary appliances, lawn tennis, and churches.[114]
The Catholic Relief Act 1829 reflected a "marked change in attitudes" in Great Britain towards Catholics and Catholicism.[126] A "significant" example of this was the collaboration between Augustus Welby Pugin, an "ardent Roman Catholic" and son of a Frenchman, and Sir Charles Barry, "a confirmed Protestant", in redesigning the Palace of Westminster—"the building that most enshrines ... Britain's national and imperial pretensions".[126] Protestantism gave way to imperialism as the leading element of British national identity during the Victorian and Edwardian eras,[124] and as such, a series of Royal, imperial and national celebrations were introduced to the British people to assert imperial British culture and give themselves a sense of uniqueness, superiority and national consciousness.[118][124][127] Empire Day and jubilees of Queen Victoria of the United Kingdom were introduced to the British middle class,[124] but quickly "merged into a national 'tradition'".[128]
The First World War "reinforced the sense of Britishness" and patriotism in the early 20th century.[118][123] Through war service (including conscription in Great Britain), "the English, Welsh, Scots and Irish fought as British".[118] The aftermath of the war institutionalised British national commemoration through Remembrance Sunday and the Poppy Appeal.[118] The Second World War had a similar unifying effect upon the British people;[129] however, its outcome was to recondition Britishness on a basis of democratic values and its marked contrast to Europeanism.[129] Notions that the British "constituted an Island race, and that it stood for democracy were reinforced during the war and they were circulated in the country through Winston Churchill's speeches, history books and newspapers".[129]
At its international zenith, "Britishness joined peoples around the world in shared traditions and common loyalties that were strenuously maintained".[130] But following the two world wars, the British Empire experienced rapid decolonisation. The secession of the Irish Free State from the United Kingdom meant that Britishness had lost "its Irish dimension" in 1922,[129] and the shrinking empire, supplanted by independence movements, diminished the appeal of British identity in the Commonwealth of Nations during the mid-20th century.[131] Since the mass immigration to the United Kingdom from the Commonwealth and elsewhere in the world, and the British Nationality Act 1948, "the expression and experience of cultural life in Britain has become fragmented and reshaped by the influences of gender, ethnicity, class and region".[132] Furthermore, the effect of the United Kingdom's membership of the European Economic Community in 1973 eroded the concept of Britishness as distinct from continental Europe.[133][134] As such, since the 1970s "there has been a sense of crisis about what it has meant to be British",[135] exacerbated by growing demands for greater political autonomy for Northern Ireland, Scotland, and Wales.[136]
The late-20th century saw major changes to the politics of the United Kingdom with the establishment of devolved national administrations for Northern Ireland, Scotland, and Wales following pre-legislative referendums.[137] Calls for greater autonomy for the four countries of the United Kingdom had existed since their original union with each other, but gathered pace in the 1960s and 1970s.[136] Devolution has led to "increasingly assertive Scottish, Welsh and Irish national identities",[138] resulting in more diverse cultural expressions of Britishness,[139] or else its outright rejection; Gwynfor Evans, a Welsh nationalist politician active in the late-20th century, rebuffed Britishness as "a political synonym for Englishness which extends English culture over the Scots, Welsh and the Irish".[140]
In 2004, Sir Bernard Crick, a political theorist and democratic socialist tasked with developing the life in the United Kingdom test, said:
Gordon Brown, Prime Minister of the United Kingdom, initiated a debate on British identity in 2006.[143] Brown's speech to the Fabian Society's Britishness Conference proposed that British values demand a new constitutional settlement and symbols to represent a modern patriotism, including a new youth community service scheme and a British Day to celebrate.[143] One of the central issues identified at the Fabian Society conference was how the English identity fits within the framework of a devolved United Kingdom.[143] An expression of Her Majesty's Government's initiative to promote Britishness was the inaugural Veterans' Day, first held on 27 June 2006. As well as celebrating the achievements of armed forces veterans, Brown said in a speech at the first such event:
Scots and people from the rest of the UK share the purpose—that Britain has something to say to the rest of the world about the values of freedom, democracy and the dignity of the people that you stand up for. So at a time when people can talk about football and devolution and money, it is important that we also remember the values that we share in common.[144]
British people (people with British citizenship or of British descent) have a significant presence in a number of countries other than the United Kingdom, in particular in those with historic connections to the British Empire. After the Age of Discovery, the British were one of the earliest and largest communities to emigrate out of Europe, and the British Empire's expansion during the first half of the 19th century triggered an "extraordinary dispersion of the British people", resulting in particular concentrations "in Australasia and North America".[43]
The British Empire was "built on waves of migration overseas by British people",[146] who left the United Kingdom and "reached across the globe and permanently affected population structures in three continents".[43] As a result of the British colonisation of the Americas, what became the United States was "easily the greatest single destination of emigrant British", but in Australia the British experienced a birth rate higher than "anything seen before" resulting in the displacement of indigenous Australians.[43]
In colonies such as Southern Rhodesia, British East Africa and Cape Colony, permanently resident British communities were established and, whilst never more than a numerical minority, these Britons "exercised a dominant influence" upon the culture and politics of those lands.[146] In Australia, Canada and New Zealand, "people of British origin came to constitute the majority of the population", contributing to these states becoming integral to the Anglosphere.[146]
The United Kingdom Census 1861 estimated the size of the overseas British to be around 2.5 million, but concluded that most of these were "not conventional settlers" but rather "travellers, merchants, professionals, and military personnel".[43] By 1890, there were over 1.5 million further British-born people living in Australia, Canada, New Zealand and South Africa.[43] A 2006 publication from the Institute for Public Policy Research estimated 5.6 million Britons lived outside of the United Kingdom.[6][147]
From the beginning of Australia's colonial period until after the Second World War, people from the United Kingdom made up a large majority of people coming to Australia, meaning that many Australian-born people can trace their origins to Britain.[148] The colony of New South Wales, founded on 26 January 1788, was part of the eastern half of Australia claimed by the Kingdom of Great Britain in 1770, and initially settled by Britons through penal transportation. Together with another five largely self-governing Crown Colonies, the federation of Australia was achieved on 1 January 1901.
Its history of British dominance meant that Australia was "grounded in British culture and political traditions that had been transported to the Australian colonies in the nineteenth century and become part of colonial culture and politics".[149] Australia maintains the Westminster system of parliamentary government and Queen Elizabeth II as Queen of Australia. Until 1987, the national status of Australian citizens was formally described as "British Subject: Citizen of Australia". Britons continue to make up a substantial proportion of immigrants.[148]
The people of the British overseas territories are British by citizenship, via origin or naturalisation. Along with aspects of a common British identity, each of them has its own distinct identity, shaped by the particular circumstances of its political, economic, ethnic, social and cultural history. For instance, in the case of the Falkland Islanders, Lewis Clifton, the Speaker of the Legislative Council of the Falkland Islands, explains:
British cultural, economic, social, political and educational values create a unique British-like, Falkland Islands. Yet Islanders feel distinctly different from their fellow citizens who reside in the United Kingdom. This might have something to do with geographical isolation or with living on a smaller island—perhaps akin to those British people not feeling European.[150]
In contrast, for the majority of the Gibraltarian people, who live in Gibraltar, there is an "insistence on their Britishness" which "carries excessive loyalty" to Britain.[151] The sovereignty of Gibraltar has been a point of contention in Spain–United Kingdom relations, but an overwhelming number of Gibraltarians embrace Britishness with strong conviction, in direct opposition to Spanish territorial claims.[151][152][153]
Canada traces its statehood to the French, English and Scottish expeditions of North America from the late-15th century. France ceded nearly all of New France in 1763 after the Seven Years' War, and so after the United States Declaration of Independence in 1776, Quebec and Nova Scotia formed "the nucleus of the colonies that constituted Britain's remaining stake on the North American continent".[154] British North America attracted the United Empire Loyalists, British people who migrated out of what they considered the "rebellious" United States, increasing the size of British communities in what was to become Canada.[154]
In 1867 there was a union of three colonies of British North America, which together formed the Canadian Confederation, a federal dominion.[155][156][157] This began an accretion of additional provinces and territories and a process of increasing autonomy from the United Kingdom, highlighted by the Statute of Westminster 1931 and culminating in the Canada Act 1982, which severed the vestiges of legal dependence on the parliament of the United Kingdom. Nevertheless, it is recognised that there is a "continuing importance of Canada's long and close relationship with Britain";[158] large parts of Canada's modern population claim "British origins" and the cultural impact of the British upon Canada's institutions is profound.[159]
It was not until 1977 that the phrase "A Canadian citizen is a British subject" ceased to be used in Canadian passports.[160] The politics of Canada are strongly influenced by British political culture.[161][162] Although significant modifications have been made, Canada is governed by a democratic parliamentary framework comparable to the Westminster system, and retains Queen Elizabeth II as Queen of Canada and Head of State.[163][164] English is an official language used in Canada.[165]
The cultural legacy of the British in Chile is notable and has spread beyond the British Chilean community into society at large. One custom taken from the British is afternoon tea, called onces by Chileans. Another interesting, if peculiar, legacy is the widespread use of British first names by Chileans.
Chile currently has the largest population of descendants of Britons in Latin America. Over 700,000 Chileans may be of British (English, Scottish and Welsh) origin, amounting to 4.5% of Chile's population.[10]
As a long-term result of James Cook's voyage of 1768–71,[166] a significant number of New Zealanders are of British descent, and a sense of Britishness has contributed to their identity.[167] As late as the 1950s, it was common for British New Zealanders to refer to themselves as British, such as when Prime Minister Keith Holyoake described Sir Edmund Hillary's successful ascent of Mount Everest as putting "the British race and New Zealand on top of the world".[168] New Zealand passports described nationals as "British Subject: Citizen of New Zealand" until 1974, when this was changed to "New Zealand citizen".[169]
In an interview with the New Zealand Listener in 2006, Don Brash, the then Leader of the Opposition, said:
British immigrants fit in here very well. My own ancestry is all British. New Zealand values are British values, derived from centuries of struggle since Magna Carta. Those things make New Zealand the society it is.[170]
The politics of New Zealand are strongly influenced by British political culture. Although significant modifications have been made, New Zealand is governed by a democratic parliamentary framework comparable to the Westminster system, and retains Queen Elizabeth II of the United Kingdom as the head of the monarchy of New Zealand.[171] English is the dominant official language used in New Zealand.[172]
Plantations of Ireland introduced large numbers of English, Scottish and Welsh people to Ireland throughout the Middle Ages and early modern period. The resulting Protestant Ascendancy, the aristocratic class of the Lordship of Ireland, broadly identified themselves as Anglo-Irish.[173] In the sixteenth and seventeenth centuries, Protestant British settlers subjugated Catholic, Gaelic inhabitants in the north of Ireland during the Plantation of Ulster and the Williamite War in Ireland; it was "an explicit attempt to control Ireland strategically by introducing ethnic and religious elements loyal to the British interest in Ireland".[174]
The Ulster Scots people are an ethnic group of British origin in Ireland, broadly descended from Lowland Scots who settled in large numbers in the Province of Ulster during the planned process of colonisation which took place in the reign of King James I of England and VI of Scotland. Together with English and Welsh settlers, these Scots introduced Protestantism (particularly the Presbyterianism of the Church of Scotland) and the Ulster Scots and English languages to, mainly, northeastern Ireland. With the partition of Ireland and independence for what is now the Republic of Ireland, some of these people found themselves no longer living within the United Kingdom.
Northern Ireland itself was, for many years, the site of a violent and bitter ethno-sectarian conflict—The Troubles—between those claiming to represent Irish nationalism, who are predominantly Roman Catholic, and those claiming to represent British unionism, who are predominantly Protestant.[175] Unionists want Northern Ireland to remain part of the United Kingdom,[176] while nationalists desire a united Ireland.[177][178] Since the signing of the Good Friday Agreement in 1998, most of the paramilitary groups involved in the Troubles have ceased their armed campaigns, and constitutionally, the people of Northern Ireland have been recognised as "all persons born in Northern Ireland and having, at the time of their birth, at least one parent who is a British citizen, an Irish citizen or is otherwise entitled to reside in Northern Ireland without any restriction on their period of residence".[179] The Good Friday Agreement guarantees the "recognition of the birthright of all the people of Northern Ireland to identify themselves and be accepted as Irish or British, or both, as they may so choose".[179] Nevertheless, community divisions are still strong, and the unique situation of Britons in Northern Ireland has produced a strong and distinctive British identity which at the extreme is linked with Ulster loyalism and the Orange Institution, but more commonly is "civic in nature", tied with the Protestant work ethic of an "industrious, assertive, self-reliant" people.[174][180]
An English presence in North America began with the Roanoke Colony and Colony of Virginia in the late-16th century, but the first successful English settlement was established in 1607, on the James River at Jamestown. By the 1610s an estimated 1,300 English people had travelled to North America, the "first of many millions from the British Isles".[181] In 1620 the Pilgrims established the English imperial venture of Plymouth Colony, beginning "a remarkable acceleration of permanent emigration from England" with over 60% of trans-Atlantic English migrants settling in the New England Colonies.[181] During the 17th century an estimated 350,000 English and Welsh migrants arrived in North America, which in the century after the Acts of Union 1707 was surpassed in rate and number by Scottish and Irish migrants.[182]
The British policy of salutary neglect for its North American colonies intended to minimise trade restrictions as a way of ensuring they stayed loyal to British interests.[183] This permitted the development of the American Dream, a cultural spirit distinct from that of its European founders.[183] The Thirteen Colonies of British America began an armed rebellion against British rule in 1775 when they rejected the right of the Parliament of Great Britain to govern them without representation; they proclaimed their independence in 1776, and subsequently constituted the first thirteen states of the United States of America, which became a sovereign state in 1781 with the ratification of the Articles of Confederation. The 1783 Treaty of Paris represented Great Britain's formal acknowledgement of the United States' sovereignty at the end of the American Revolutionary War.[184]
Nevertheless, longstanding cultural and historical ties have, in more modern times, resulted in the Special Relationship, a term used to describe the exceptionally close political, diplomatic and military co-operation of United Kingdom – United States relations.[185] Linda Colley, a professor of history at Princeton University and specialist in Britishness, suggested that because of their colonial influence on the United States, the British find Americans a "mysterious and paradoxical people, physically distant but culturally close, engagingly similar yet irritatingly different".[186]
As a result of the expansion of the British Empire, British cultural influence can be observed in the language and culture of a geographically wide assortment of countries such as Canada, Australia, New Zealand, South Africa, India, Pakistan, the United States, and the British overseas territories. These states are sometimes collectively known as the Anglosphere.[187] As well as the British influence on its empire, the empire also influenced British culture, particularly British cuisine. Innovations and movements within the wider culture of Europe have also changed the United Kingdom; Humanism, Protestantism, and representative democracy developed from broader Western culture.
As a result of the history of the formation of the United Kingdom, the cultures of England, Scotland, Wales, and Northern Ireland are diverse and have varying degrees of overlap and distinctiveness.
Historically, British cuisine has meant "unfussy dishes made with quality local ingredients, matched with simple sauces to accentuate flavour, rather than disguise it".[189] It has been "vilified as unimaginative and heavy", and traditionally been limited in its international recognition to the full breakfast and the Christmas dinner.[190] This is despite British cuisine having absorbed the culinary influences of those who have settled in Britain, resulting in hybrid dishes such as the British Asian Chicken tikka masala, hailed by some as "Britain's true national dish".[191]
Celtic agriculture and animal breeding produced a wide variety of foodstuffs for Celts and Britons. The Anglo-Saxons developed meat and savoury herb stewing techniques before the practice became common in Europe. The Norman conquest of England introduced exotic spices into Britain in the Middle Ages.[190] The British Empire facilitated a knowledge of India's food tradition of "strong, penetrating spices and herbs".[190] Food rationing policies, imposed by the British government during wartime periods of the 20th century, are said to have been the stimulus for British cuisine's poor international reputation.[190]
British dishes include fish and chips, the Sunday roast, and bangers and mash. British cuisine has several national and regional varieties, including English, Scottish and Welsh cuisine, each of which has developed its own regional or local dishes, many of which are geographically indicated foods such as Cheshire cheese, the Yorkshire pudding, Arbroath Smokie, Cornish pasty and Welsh cakes.
The British are the second largest per capita tea consumers in the world, consuming an average of 2.1 kilograms (4.6 lb) per person each year.[192] British tea culture dates back to the 19th century, when India was part of the British Empire and British interests controlled tea production in the subcontinent.
There is no single British language, though English is by far the main language spoken by British citizens, being spoken monolingually by more than 70% of the UK population. English is therefore the de facto official language of the United Kingdom.[193] However, under the European Charter for Regional or Minority Languages, the Welsh, Scottish Gaelic, Cornish, Irish, Ulster Scots and Scots (or Lowland Scots) languages are officially recognised as regional or minority languages by the UK Government.[194] As indigenous languages which continue to be spoken as a first language by native inhabitants, Welsh and Scottish Gaelic have a different legal status from other minority languages. In some parts of the UK, some of these languages are commonly spoken as a first language; in wider areas, their use in a bilingual context is sometimes supported or promoted by central or local government policy. For naturalisation purposes, a competence standard of English, Scottish Gaelic or Welsh is required to pass the life in the United Kingdom test.[195] However, English is used routinely, and although considered culturally important, Scottish Gaelic and Welsh are seldom used and are effectively restricted in practice to remote rural areas.[196]
Throughout the United Kingdom there are strong and distinctive spoken expressions and regional accents of English,[42] which are seen to be symptomatic of a locality's culture and identity.[197] An awareness and knowledge of accents in the United Kingdom can "place, within a few miles, the locality in which a man or woman has grown up".[196]
British literature is "one of the leading literatures in the world".[198] The overwhelming part is written in the English language, but there are also literatures written in Scots, Scottish Gaelic and Welsh languages amongst others.
Although cinema, theatre, dance and live music are popular, the favourite pastime of the British is watching television.[201] Public broadcast television in the United Kingdom began in 1936, with the launch of the BBC Television Service (now BBC One). In the United Kingdom and the Crown dependencies, one must have a television licence to legally receive any broadcast television service, from any source. This includes the commercial channels, cable and satellite transmissions, and the Internet. Revenue generated from the television licence is used to provide radio, television and Internet content for the British Broadcasting Corporation, and Welsh language television programmes for S4C. The BBC, the common abbreviation of the British Broadcasting Corporation,[202] is the world's largest broadcaster.[203] Unlike other broadcasters in the UK, it is a public service based, quasi-autonomous, statutory corporation run by the BBC Trust. Free-to-air terrestrial television channels available on a national basis are BBC One, BBC Two, ITV, Channel 4 (S4C in Wales), and Five.
100 Greatest British Television Programmes was a list compiled by the British Film Institute in 2000, chosen by a poll of industry professionals, to determine what were the greatest British television programmes of any genre ever to have been screened.[204] Topping the list was Fawlty Towers, a British sitcom set in a fictional Torquay hotel starring John Cleese.[204]
"British musical tradition is essentially vocal",[205] dominated by the music of England and Germanic culture,[206] most greatly influenced by hymns and Anglican church music.[207] However, the specific, traditional music of Wales and music of Scotland is distinct, and of the Celtic musical tradition.[208] In the United Kingdom, more people attend live music performances than football matches.[209] British rock was born in the mid-20th century out of the influence of rock and roll and rhythm and blues from the United States. Major early exports were The Beatles, The Rolling Stones, The Who and The Kinks.[210] Together with other bands from the United Kingdom, these constituted the British Invasion, a popularisation of British pop and rock music in the United States. Into the 1970s and 1980s there was a diversification of British musical genres; Progressive rock, Glam rock, Heavy Metal, New Wave, and 2 Tone.[210] Britpop is a subgenre of alternative rock that emerged from the British independent music scene of the early 1990s and was characterised by bands reviving British guitar pop music of the 1960s and 1970s. Leading exponents of Britpop were Blur, Oasis and Pulp.[211] Also popularised in the United Kingdom during the 1990s were several domestically produced varieties of electronic dance music; Acid house, UK hard house, Jungle, UK garage which in turn have influenced Grime and British hip hop in the 2000s.[211] The BRIT Awards are the British Phonographic Industry's annual awards for both international and British popular music.
Historically, Christianity "has been the most influential and important religion in Britain", and it remains the declared faith of the majority of the British people.[212] The influence of Christianity on British culture has been "widespread, extending beyond the spheres of prayer and worship. Churches and cathedrals make a significant contribution to the architectural landscape of the nation's cities and towns" whilst "many schools and hospitals were founded by men and women who were strongly influenced by Christian motives".[212] Throughout the United Kingdom, Easter and Christmas, the "two most important events in the Christian calendar", are recognised as public holidays.[212] Christianity remains the major religion of the population of the United Kingdom in the 21st century, followed by Islam, Hinduism, Sikhism and then Judaism in terms of number of adherents. The 2007 Tearfund Survey revealed that 53% identified themselves as Christian, which was similar to the 2004 British Social Attitudes Survey[213][214] and to the United Kingdom Census 2001, in which 71.6% said that Christianity was their religion.[215] However, the Tearfund Survey showed only one in ten Britons attend church weekly.[216] Secularism was advanced in Britain during the Age of Enlightenment, and modern British organisations such as the British Humanist Association and the National Secular Society offer the opportunity for their members to "debate and explore the moral and philosophical issues in a non-religious setting".[212]
The Treaty of Union that led to the formation of the Kingdom of Great Britain ensured that there would be a Protestant succession as well as a link between church and state that still remains. The Church of England (Anglican) is legally recognised as the established church, and so retains representation in the Parliament of the United Kingdom through the Lords Spiritual, whilst the British monarch is both a member of the church and its Supreme Governor.[217][218] The Church of England also retains the right to draft legislative measures (relating to religious administration) through the General Synod, which can then be passed into law by Parliament. The Roman Catholic Church in England and Wales is the second largest Christian church, with around five million members, mainly in England.[219] There are also growing Orthodox, Evangelical and Pentecostal churches, with Pentecostal churches in England now third after the Church of England and the Roman Catholic Church in terms of church attendance.[220] Other large Christian groups include Methodists and Baptists.
The Church of Scotland, a presbyterian church known informally as The Kirk, is recognised as the national church of Scotland and is not subject to state control. The British monarch is an ordinary member and is required to swear an oath to "defend the security" of the church upon his or her accession. The Roman Catholic Church in Scotland is Scotland's second largest Christian church, with followers representing a sixth of the population of Scotland.[221] The Scottish Episcopal Church, which is part of the Anglican Communion, dates from the final establishment of Presbyterianism in Scotland in 1690, when it split from the Church of Scotland over matters of theology and ritual. Further splits in the Church of Scotland, especially in the 19th century, led to the creation of other Presbyterian churches in Scotland, including the Free Church of Scotland. In the 1920s, the Church in Wales became independent from the Church of England and was disestablished, but it remains in the Anglican Communion.[217] Methodism and other Protestant churches have had a major presence in Wales. The main religious groups in Northern Ireland are organised on an all-Ireland basis. Though Protestants collectively constitute the overall majority,[222] the Roman Catholic Church in Ireland is the largest single church. The Presbyterian Church in Ireland, closely linked to the Church of Scotland in terms of theology and history, is the second largest church, followed by the (Anglican) Church of Ireland, which was disestablished in the 19th century.
Sport is an important element of British culture, and is one of the most popular leisure activities of the British people. Within the United Kingdom, nearly half of all adults partake in one or more sporting activities each week.[223] Some of the major sports in the United Kingdom "were invented by the British",[224] including football, rugby and cricket, and the British "exported various other games", including tennis, badminton, boxing, golf, snooker and squash.[225]
In most sports, separate organisations, teams and clubs represent the individual countries of the United Kingdom at international level, though in some sports, such as rugby union, an all-Ireland team represents both Northern Ireland and the Republic of Ireland, and the British and Irish Lions represent the isles as a whole. The UK is represented by a single team at the Olympic Games; at the 2008 Summer Olympics, the Great Britain team won 47 medals: 19 gold (the most since the 1908 Summer Olympics), 13 silver and 15 bronze, ranking it fourth.[226] In total, sportsmen and women from the UK "hold over 50 world titles in a variety of sports, such as professional boxing, rowing, snooker, squash and motorcycle sports".[223]
A 2006 poll found that association football was the most popular sport in the UK.[227] In England, 320 football clubs are affiliated to The Football Association (FA) and more than 42,000 to regional or district associations. The FA, founded in 1863, and the Football League, founded in 1888, were both the first of their kind in the world.[228] In Scotland there are 78 full and associate clubs and nearly 6,000 registered clubs under the jurisdiction of the Scottish Football Association.[228] One Welsh club plays in England's Football League, one in the Premier League, and others play at non-league level, whilst the Welsh Football League contains 20 semi-professional clubs. In Northern Ireland, 12 semi-professional clubs play in the IFA Premiership, the second oldest league in the world.[228]
Recreational fishing, particularly angling, is one of the most popular participation activities in the United Kingdom, with an estimated 3–4 million anglers in the country.[224][229] The most widely practised form of angling in England and Wales is for coarse fish, while in Scotland angling is usually for salmon and trout.[224]
For centuries, artists and architects in Britain were overwhelmingly influenced by Western art history.[230] Amongst the first visual artists credited with developing a distinctly British aesthetic and artistic style is William Hogarth.[230] The experience of military, political and economic power from the rise of the British Empire led to a very specific drive in artistic technique, taste and sensibility in the United Kingdom.[125] Britons used their art "to illustrate their knowledge and command of the natural world", whilst the permanent settlers in British North America, Australasia, and South Africa "embarked upon a search for distinctive artistic expression appropriate to their sense of national identity".[125] The empire has been "at the centre, rather than in the margins, of the history of British art", and imperial British visual arts have been fundamental to the construction, celebration and expression of Britishness.[231]
British attitudes to modern art were "polarised" at the end of the 19th century.[232] Modernist movements were both cherished and vilified by artists and critics; Impressionism was initially regarded by "many conservative critics" as a "subversive foreign influence", but became "fully assimilated" into British art during the early 20th century.[232] Representational art was described by Herbert Read during the interwar period as "necessarily... revolutionary", and was studied and produced to such an extent that by the 1950s Classicism was effectively void in British visual art.[232] Post-modern, contemporary British art, particularly that of the Young British Artists, has been preoccupied with postcolonialism and "characterised by a fundamental concern with material culture ... perceived as a post-imperial cultural anxiety".[233]
The architecture of the United Kingdom is diverse; the most influential developments have usually taken place in England, but Ireland, Scotland and Wales have at various times played leading roles in architectural history.[234] Although there are prehistoric and classical structures in the British Isles, British architecture effectively begins with the first Anglo-Saxon Christian churches, built soon after Augustine of Canterbury arrived in Great Britain in 597.[234] Norman architecture was built on a vast scale from the 11th century onwards in the form of castles and churches, helping the Normans impose their authority upon their dominion.[234] English Gothic architecture, which flourished from around 1180 until around 1520, was initially imported from France but quickly developed its own unique qualities.[234] Secular medieval architecture throughout Britain has left a legacy of large stone castles, the "finest examples" of which line both sides of the Anglo-Scottish border and date from the Wars of Scottish Independence of the 14th century.[235] The invention of gunpowder and cannons made castles redundant, and the English Renaissance that followed facilitated the development of new artistic styles for domestic architecture: Tudor style, English Baroque, Queen Anne Style and Palladian.[235] Georgian and Neoclassical architecture advanced after the Scottish Enlightenment. Outside the United Kingdom, the influence of British architecture is particularly strong in South India,[236] the result of British rule in India in the 19th century. The Indian cities of Bangalore, Chennai and Mumbai each have courts, hotels and railway stations designed in the British architectural styles of Gothic Revival and neoclassicism.[236]
British culture is closely tied to its institutions and civics, and to a "subtle fusion of new and old values".[174][237] Constitutional monarchy and its notions of stable parliamentary government and political liberalism "have come to dominate British culture".[238] These views have been reinforced by Sir Bernard Crick, who said:[141]
To be British seems to us to mean that we respect the laws, the elected parliamentary and democratic political structures, traditional values of mutual tolerance, respect for equal rights and mutual concern; that we give our allegiance to the state (as commonly symbolized by the Crown) in return for its protection.
British political institutions include the Westminster system, the Commonwealth of Nations and Her Majesty's Most Honourable Privy Council.[239] Although the Privy Council is primarily a British institution, officials from other Commonwealth realms are also appointed to the body.[240] The most notable continuing instance is New Zealand, whose Prime Minister, senior politicians, Chief Justice and Court of Appeal judges are conventionally made Privy Counsellors,[241] as the prime ministers and chief justices of Canada and Australia formerly were.[242][243] Prime ministers of Commonwealth countries that retain the British monarch as their sovereign continue to be sworn as Privy Counsellors.[240]
Universal suffrage for adult males was granted in 1918, and was extended to adult women in 1928 following the Suffragette movement.[244] Politics in the United Kingdom is multi-party, with two dominant political parties: the Conservative Party and the Labour Party. The social structure of Britain, specifically social class, has "long been pre-eminent among the factors used to explain party allegiance", and still persists as "the dominant basis" of party political allegiance for British people.[245] The Conservative Party is descended from the historic Tory Party (founded in England in 1678) and is a centre-right conservative political party[246] that traditionally draws support from the middle classes.[247] The Labour Party grew out of the trade union movement and the socialist political parties of the 19th century, and continues to describe itself as a "democratic socialist party".[248] Labour states that it stands for the representation of the low-paid working class, who have traditionally been its members and voters.[248] The Liberal Democrats are a liberal political party, the third largest in the United Kingdom. The party is descended from the Liberal Party, a major ruling party of late-19th-century Britain up to the First World War, when it was supplanted by the Labour Party.[249] The Liberal Democrats have historically drawn support from wide and "differing social backgrounds".[249] There are over 300 other, smaller political parties registered with the Electoral Commission in the United Kingdom.[250][251]
According to the British Social Attitudes Survey, there are broadly two interpretations of British identity, with ethnic and civic dimensions:
The first group, which we term the ethnic dimension, contained the items about birthplace, ancestry, living in Britain, and sharing British customs and traditions. The second, or civic group, contained the items about feeling British, respecting laws and institutions, speaking English, and having British citizenship.[252]
Of the two perspectives on British identity, the civic definition has become "the dominant idea ... by far",[116] and in this capacity, Britishness is sometimes considered an institutional or overarching state identity.[115][116][141] This has been used to explain why first-, second- and third-generation immigrants are more likely to describe themselves as British rather than English, Scottish or Welsh: it is an "institutional, inclusive" identity that can be acquired through naturalisation and British nationality law;[255] the vast majority of people in the United Kingdom who are from an ethnic minority feel British.[256]
However, this attitude is more common in England than in Scotland or Wales; "white English people perceived themselves as English first and as British second, and most people from ethnic minority backgrounds perceived themselves as British, but none identified as English, a label they associated exclusively with white people". Conversely, in Scotland and Wales, both White British and ethnic minority people identified more strongly with Scotland and Wales than with Britain.[257]
Studies and surveys have "reported that the majority of the Scots and Welsh see themselves as both Scottish/Welsh and British though with some differences in emphasis".[255] The Commission for Racial Equality found that with respect to notions of nationality in Britain, "the most basic, objective and uncontroversial conception of the British people is one that includes the English, the Scots and the Welsh".[258] However, "English participants tended to think of themselves as indistinguishably English or British, while both Scottish and Welsh participants identified themselves much more readily as Scottish or Welsh than as British".[258]
Some people opted "to combine both identities" as "they felt Scottish or Welsh, but held a British passport and were therefore British", whereas others saw themselves as exclusively Scottish or exclusively Welsh and "felt quite divorced from the British, whom they saw as the English".[258] Commentators have described this latter phenomenon as "nationalism": a rejection of British identity because some Scots and Welsh interpret it as "cultural imperialism imposed" upon the United Kingdom by "English ruling elites",[259] or else a response to the historical misappropriation whereby the word "English" was equated with "British",[260] which has "brought about a desire among Scots, Welsh and Irish to learn more about their heritage and distinguish themselves from the broader British identity".[261]
British, brit'ish, adj. of Britain or the Commonwealth.
Briton, brit'ən, n. one of the early inhabitants of Britain: a native of Great Britain.