Examination Information

Please click the relevant tab below for additional information.

 

Current Exam Information

An Admission Ticket is required for entrance into the Texas Bar Examination, and you will be admitted only to the examination site shown on your Admission Ticket. Admission Tickets will be uploaded to eFast accounts approximately four (4) weeks prior to the first day of the exam. If you have not received an Admission Ticket 25 days prior to the examination for which you have applied, contact your Licensure Analyst at the Board of Law Examiners' office; failing to do so may leave you ineligible to sit for the examination.

 

Pass Lists from Previous Texas Bar Examinations

The links below will only confirm that a person passed a particular Texas Bar Examination and whether or not that person was eligible for admission to the State Bar of Texas when results were released.  To confirm whether or not an individual is a member of the State Bar of Texas, please refer to the State Bar of Texas website at http://www.texasbar.com.

 

 

July:     2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2004 2003 2002 2001 2000
February: 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2004 2003 2002 2001 2000

 

Statistics & Analysis

Psychometric Audit of the Texas Bar Examinations Administered in 2013

ANALYSIS OF JULY 2004 TEXAS BAR EXAM RESULTS BY GENDER AND RACIAL/ETHNIC GROUP

Stephen P. Klein, Ph.D. and Roger Bolus, Ph.D.
GANSK & Associates
 

(This analysis was updated in June 2006.  Click here for details.)

December 15, 2004


Texas Government Code Sec. 82.0291 directed the Texas Board of Law Examiners to “compile a report indicating the number of applicants who fail the July 2004 bar examination. The data shall be aggregated by gender, ethnicity, and race. The report shall also include an analysis of the identifiable causes of failure and recommendations, if any, to address the causes of failure.”

The analyses described below were conducted to respond to this legislation. As background for what follows, we begin by summarizing the major features of the Texas bar exam and how the scores on it are computed and pass/fail decisions are made. We then describe the procedures that were used to gather and process the data for our analyses. Finally, we provide information about the size and nature of the differences in bar exam scores and passing rates among gender and racial/ethnic groups as well as the results of our analyses of certain factors that are and are not related to these differences. The specific questions we address in our analyses are as follows:

1. Do men and women have comparable bar exam passing rates and test scores?   Findings

2. Do different racial/ethnic groups have comparable bar exam passing rates and scores?  Findings

3. Do the differences in bar exam passing rates and scores among racial/ethnic groups correspond to the differences in their admissions credentials and law school grades?  Findings

4. Were some bar exam preparation activities associated with higher scores?   Findings

5. As a group, do the students at some law schools generally score higher or lower on the bar exam than what would be expected on the basis of their mean LSAT scores?  Findings

Texas Bar Exam Components

The Texas Bar Exam is a two-and-a-half-day test. There is one day for the Texas essay section, one day for the Multistate Bar Exam, and a half day for the combination of the Multistate Performance Test and the Texas Procedure and Evidence test. The major features of these four components are as follows:

Multistate Bar Exam (MBE). The MBE is a six-hour, 200-question, multiple-choice test. MBE questions (or “items”) are prepared and scored by American College Testing (ACT) under the general direction of the National Conference of Bar Examiners. The MBE has an approximately equal number of items in each of the following six subjects: Constitutional Law, Contracts, Criminal Law, Evidence, Property, and Torts. An applicant’s MBE “raw” score is the number of questions answered correctly.

Roughly 30 percent of the MBE questions that are asked on one administration (such as July 2004) have been used previously. ACT uses the data on these repeated items to adjust MBE raw scores for possible differences in average question difficulty across administrations. As a result of this calibration process (which is called “equating” or “scaling”), a given MBE “scale” score indicates about the same level of proficiency regardless of the administration on which it was earned.

Multistate Performance Test (MPT). Texas administers one 90-minute MPT question or “task”, consisting of a legal analysis and writing assignment. This task is developed under the direction of the National Conference of Bar Examiners. There is a new task for each administration. Texas readers grade the answers on a 1 to 6 scale in half-point intervals.

An MPT task assesses certain practice oriented legal research, analysis, and writing skills. A task consists of a File that looks like a typical lawyer file (e.g., letters, memos, reports, and the like) and includes relevant and irrelevant materials and a Library with all the case law, statutes, and secondary materials needed to deal with various matters in a hypothetical case. Candidates use the File and Library to complete a realistic task, such as drafting a memo to a senior lawyer, a letter to a client or opposing counsel, or a brief to be filed with a court.

Texas Essay Test. The six-hour essay portion of the Texas exam consists of 12 questions in such areas as Business Associations, Wills, Real Property, and Family Law. Members of the Texas Board of Law Examiners, with the assistance of professional editors, draft the questions. Board members and experienced attorney graders then score the answers to each question on a 1 to 25-point scale. The maximum possible essay raw score is 12 x 25 = 300 points.

Texas Procedure and Evidence (P&E) Test. The 90-minute P&E test contains 20 short-answer civil questions and 20 short-answer criminal questions. The Texas Board of Law Examiners creates these questions, and Board members and their associates grade the responses to each question on a 0 to 5 scale. Texas divides the total P&E raw score on each section by 2 (and rounds the result to a whole number) so that the maximum possible total P&E raw score across the two sections is 100 points.

Total Scores and Pass/Fail Decisions

Texas converts total essay raw scores to the same scale of measurement as that used for the MBE. This is done to adjust the essay scores for possible differences in average question difficulty across administrations. Scaling involves assigning the highest total essay raw score the same value as the highest MBE scale score in Texas, the second highest total essay raw score the same value as the second highest MBE scale score, and so on until the lowest total essay raw score is assigned the same value as the lowest MBE scale score. The converted scores are called essay “scale” scores. This same procedure is used to convert MPT and P&E raw scores to scale scores.
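The rank-matching step described above can be sketched as follows. The data and the `scale_by_rank` helper are invented for illustration; a real administration would also need a convention for tied raw scores (here, ties simply share one scale value).

```python
# Hypothetical sketch of the rank-matching ("scaling") step: the k-th highest
# essay raw score receives the value of the k-th highest MBE scale score
# earned on the same administration.

def scale_by_rank(raw_scores, mbe_scale_scores):
    raw_sorted = sorted(raw_scores, reverse=True)
    mbe_sorted = sorted(mbe_scale_scores, reverse=True)
    # Pair ranks: highest raw score -> highest MBE scale value, and so on.
    rank_map = dict(zip(raw_sorted, mbe_sorted))
    return [rank_map[r] for r in raw_scores]

essay_raw = [210, 188, 245, 172]          # invented total essay raw scores
mbe_scale = [156.0, 128.0, 141.0, 133.5]  # invented MBE scale scores
print(scale_by_rank(essay_raw, mbe_scale))  # [141.0, 133.5, 156.0, 128.0]
```

The highest raw score (245) is assigned the highest MBE scale score (156.0), and so on down the rank order.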

Texas uses the following formula to compute each applicant’s total scale score so that the weights assigned to the MBE, Essay, MPT, and P&E tests are 40, 40, 10, and 10 percent, respectively:

Total scale = 2(MBE Scale) + 2(Essay Scale) + (MPT Scale)/2 + (P&E Scale)/2

Applicants with total scale scores of 675 or higher pass. All others fail. This pass/fail standard (which corresponds to a 135 on the MBE scale of measurement) is comparable to the standards used by most other states.
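As an arithmetic check, the published weighting formula and cut score can be expressed directly. The function name is ours; the coefficients and the 675 cut score come from the text above.

```python
# Sketch of the published weighting formula and the pass/fail cut score.
PASS_CUT = 675  # corresponds to 135 on the MBE scale, since 675 / 5 = 135

def total_scale(mbe, essay, mpt, pe):
    # Coefficients 2, 2, 1/2, 1/2 sum to 5, yielding the 40/40/10/10 weights.
    return 2 * mbe + 2 * essay + mpt / 2 + pe / 2

score = total_scale(mbe=140.0, essay=138.0, mpt=132.0, pe=136.0)  # invented scores
print(score, "PASS" if score >= PASS_CUT else "FAIL")  # 690.0 PASS
```

An applicant scoring exactly 135 on every scaled component would earn 2(135) + 2(135) + 135/2 + 135/2 = 675, landing precisely on the cut.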

Analysis Sample Data

The application form for the July 2004 exam contained a section in which candidates indicated their gender and racial/ethnic group. Almost all of those taking the exam also completed a questionnaire that was distributed during a break in the test session, although not everyone answered every question. A copy of this survey is attached to the end of this report.

The 2,003 applicants whose data are used in this report graduated from the nine law schools in Texas that are accredited by the American Bar Association. These schools provided us with the Law School Admission Test (LSAT) scores, undergraduate grade point averages (UGPAs), and law school grade point averages (LGPAs) for their graduates who took the July 2004 Texas bar exam. None of the data furnished by the law schools has been disclosed to the Texas Board of Law Examiners. Such data are considered the property of the individual law schools and will not be disclosed at any time. The Texas Board of Law Examiners provided us with these applicants’ bar exam scores and repeater status data. All data reported by the Texas Board of Law Examiners has been disclosed to the law schools pursuant to Texas Government Code Sec. 82.029 and cannot be further disclosed in accordance with that statute.

All data were furnished to us without revealing the identity of any candidate and have been linked through a common study ID number for each candidate. The confidentiality of these data was preserved by employing procedures that prevented us from having access to applicant names and prevented the Texas Board from having access to the data provided by the law schools.

Preliminary Analyses

It is well recognized that grading standards vary across law schools. A 3.00 LGPA at one school may correspond to a substantially higher or lower level of proficiency than a 3.00 at another school. However, several analyses require combining LGPAs across schools. Thus, to adjust for possible differences in grading standards among schools, we converted the LGPAs within a school to a score distribution that had the same mean and standard deviation as the distribution of the LSAT scores at that school. This conversion is used throughout this report.
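A minimal sketch of that within-school conversion, using invented numbers for one school (the `rescale` helper is ours, not the study's code):

```python
# Hedged sketch of the within-school adjustment: express a school's LGPAs on a
# scale with the same mean and standard deviation as that school's LSAT scores.
from statistics import mean, pstdev

def rescale(lgpas, lsats):
    z = [(g - mean(lgpas)) / pstdev(lgpas) for g in lgpas]  # standardize LGPAs
    m, s = mean(lsats), pstdev(lsats)
    return [m + s * zi for zi in z]  # re-express on the LSAT scale

lgpas = [2.8, 3.0, 3.2, 3.6]  # invented grades at one school
lsats = [148, 152, 155, 161]  # invented LSAT scores at the same school
print([round(x, 1) for x in rescale(lgpas, lsats)])  # [148.4, 151.6, 154.8, 161.2]
```

After the conversion, the school's LGPA distribution has the same mean and spread as its LSAT distribution, which is why the LGPAs in the tables below sit on the LSAT scale (e.g., values near 153) rather than the familiar 4.0 scale.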

Applicants indicated their gender and racial/ethnic group on their application form. The analysis sample contained applicants from 19 racial/ethnic groups, but there were only a few candidates in some of these groups. This led us to form the following five clusters for our analyses:

Asian = Asian, Chinese, Japanese, Korean, Pacific Islander, Polynesian, and Vietnamese
Black = African American, African, and Black
Hispanic = Hispanic, Mexican, Cuban, Puerto Rican, Latin, and Central/South American
White = Caucasian and White
Other = All others (includes 5 Native Americans and racial/ethnic group omitted)

Research Questions and Answers

1) Do men and women have comparable bar exam passing rates and test scores?

Yes. Men and women had virtually identical passing rates (74.73 percent and 74.85 percent, respectively). There also were nearly identical numbers of men and women in our nine-school sample (1005 and 998, respectively). Table 1a shows that men had a slightly higher mean LSAT score than women whereas the reverse was true on UGPA. Men and women had very similar mean LGPAs.

Table 1a
Mean UGPAs, LSAT Scores, and LGPAs By Gender

Score Men  Women  All Takers
UGPA 3.15 3.29 3.22
LSAT 153.8 153.0 153.4
LGPA 152.8 154.0 153.4

Standard deviations for all takers for UGPA, LSAT, and LGPA were 0.47, 7.34, and 10.41, respectively.

On the average, men scored slightly higher than women on the MBE whereas the reverse was true on other sections (see Table 1b). This finding is consistent with the results regarding gender effects that are presented in technical reports regularly published by the California Committee of Bar Examiners. The differences in mean scores between gender groups balanced out so that overall, men and women had virtually identical mean total scale scores.

Table 1b
Mean Bar Exam Scale Scores By Gender and For All Takers

Score  Men  Women  All Takers
MBE 142.6 140.2 141.4
Essay 141.4 143.0 142.2
MPT 140.0 143.0 141.5
P&E 142.0 143.1 142.6
Total Scale 708.5 708.9 708.7

Standard deviations for the MBE, Essay, MPT, and P&E scores were all 13.0. The Total scale score standard deviation was 54.5.

2) Do different racial/ethnic groups have comparable bar exam passing rates and scores?

No. Using the racial/ethnic group designations noted above, Whites and Asians had statistically significantly higher bar exam passing rates and mean bar exam test scores than their classmates. This held for first time takers and repeaters.

Table 2a shows the number of applicants by racial/ethnic group and repeater status. Table 2b shows their passing rates. These data indicate that 75 percent of the 2,003 applicants in the analysis sample passed. The passing rate for first timers (81 percent) was nearly double the rate for repeaters (42 percent). Together, Blacks and Hispanics comprised 18 percent of the first timers, but 36 percent of the repeaters.

Table 2a
Number of Takers By Racial/Ethnic Group and Repeater Status

Group  White  Asian  Hispanic  Black  Other  Total
First Timers 1290 75 162 138 35 1700
Repeaters 178 13 47 58 7 303
Total 1468 88 209 196 42 2003

Table 2b
Percent Passing By Racial/Ethnic Group And Repeater Status

Group White  Asian  Hispanic Black Other  Total
First Timers 85 80 69 53 74 81
Repeaters 49 46 30 28 29 42
Total 81 75 60 45 67 75


Table 3a (which uses the data on all 2,003 candidates in the analysis sample) shows each group’s mean UGPA, LSAT score, and LGPA.

Table 3b shows their mean bar exam scores. These data indicate that a group’s mean scale score was very similar across the four sections of the exam. The sole exception was that Asian applicants did especially well on the MPT, but this could easily be due to chance given the comparatively low score reliability of a single MPT task. Hispanics and Blacks did about as well on the MBE as they did on the written portions of the exam. Thus, overall, exam format had no effect on the differences in mean total scores between groups.

Table 3a
Mean UGPA, LSAT, and LGPA By Racial/Ethnic Group

Score  White  Asian  Hispanic  Black  Other
UGPA 3.28  3.27  3.06  2.94  3.18
LSAT 154.6 154.9 149.5 146.6 152.5
LGPA 154.9 154.2 149.3 146.8 151.6

Table 3b
Mean Texas Bar Examination Scores By Racial/Ethnic Group

Score  White  Asian  Hispanic  Black  Other
MBE 143.3  138.0 136.2 134.0 141.9
Essay 144.1 141.5 137.6 133.6 140.8
MPT 142.9 145.3 136.7 134.9 138.7
P&E 144.0 138.9 139.9 136.9 142.0
Total Scale 717.7 700.6 685.5 670.5 705.3

3) Do the differences in bar exam passing rates and scores among racial/ethnic groups correspond to the differences in their admissions credentials and law school grades?

Yes. We found that on the average, the applicants in different racial/ethnic groups performed as well on the bar exam as would be expected on the basis of their law school admission credentials and law school grades.

We examined this matter in two ways. First, we noted that the 8-point difference in mean LGPA between Whites and Blacks was equivalent to 0.78 standard deviation units. This was nearly identical to the difference (in standard deviation units) between these groups’ mean total scale scores. The size of the difference between Whites and Hispanics on LGPA also was very similar to the difference (in standard deviation units) between these groups in total scale scores. Asians were the only group that did not do quite as well on the bar exam as would be predicted on the basis of their LGPAs.
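Using the group means from Tables 3a and 3b and the standard deviations reported under Tables 1a and 1b, the comparison in standard deviation units can be reproduced as a back-of-envelope check (our arithmetic, not necessarily the authors' exact computation):

```python
# Back-of-envelope check using figures quoted in the report: group means from
# Tables 3a and 3b, standard deviations from the notes under Tables 1a and 1b.
lgpa_gap = 154.9 - 146.8   # White vs. Black mean LGPA (Table 3a)
lgpa_sd = 10.41            # LGPA standard deviation, all takers (Table 1a)
total_gap = 717.7 - 670.5  # White vs. Black mean total scale score (Table 3b)
total_sd = 54.5            # total scale score standard deviation (Table 1b)

# Both gaps come out on the order of 0.8 standard deviation units.
print(round(lgpa_gap / lgpa_sd, 2))    # 0.78, the figure cited in the text
print(round(total_gap / total_sd, 2))  # 0.87
```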

Our second (and more statistically sophisticated and precise) approach involved constructing two “multiple regression” equations to predict an applicant’s total bar exam scale score. One of these equations included the applicant’s UGPA, LSAT score, and LGPA. The other equation contained these same variables plus the applicant’s gender and racial/ethnic group. This analysis found that racial/ethnic group had almost no relationship with bar exam scores once the applicant’s admissions credentials and law school grades were controlled for.

Specifically, the first equation explained 37.2 percent of the variance in total bar exam scores whereas the second explained 37.8 percent, i.e., just 0.6 percentage points more. Thus, the addition of gender and racial/ethnic group to the equation did not have any practical effect on predictive accuracy. All the groups (including Asians) performed as expected. In short, minority and non-minority bar exam scores were very consistent with what would be expected given the differences in their admissions credentials and law school grades. The exam did not increase or decrease the differences between groups that were present before they sat down to take the test.
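The incremental-R² logic can be illustrated on invented data: when the outcome is generated from LSAT and LGPA alone, adding a group indicator to the regression raises R² only trivially. This sketch uses ordinary least squares via NumPy and does not reproduce the study's actual data or equations.

```python
# Illustration (invented data) of the incremental-R^2 comparison between a
# base regression and one that adds a group indicator.
import numpy as np

rng = np.random.default_rng(0)
n = 500
lsat = rng.normal(153, 7, n)
lgpa = lsat + rng.normal(0, 5, n)                       # grades track LSAT
group = rng.integers(0, 2, n).astype(float)             # arbitrary indicator
score = 2.5 * lsat + 1.5 * lgpa + rng.normal(0, 30, n)  # outcome ignores group

def r_squared(X, y):
    X = np.column_stack([np.ones(len(y)), X])           # add intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared(np.column_stack([lsat, lgpa]), score)
r2_full = r_squared(np.column_stack([lsat, lgpa, group]), score)
print(round(r2_base, 3), round(r2_full - r2_base, 4))   # increment is tiny
```

As in the report's analysis, the extra variable cannot lower in-sample R², so the question is only whether the increment is large enough to matter; here, as there, it is negligible.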

Our analyses within and across schools also indicated that there is a great deal of variance in bar exam scores that is not explained by UGPA, LSAT, LGPA, gender, and racial/ethnic group. A substantial portion of the differences in bar exam scores among candidates must therefore be due to other factors, such as how the applicants prepared for the exam.

4) Were some bar exam preparation activities associated with higher scores?

Yes. Almost all the applicants in our analysis sample reported having participated in one or more commercial bar review courses in the six months prior to taking the exam. To investigate whether some of these activities were more helpful than others, we constructed a regression equation that contained the applicant’s LSAT score, LGPA, and their response to each of the questions in the student survey (see attached copy of this questionnaire).

This analysis found that applicants tended to receive 4 to 10 more total scale score points if they did one or more of the following during the six months prior to taking the exam: attend lecture and discussion sessions, use Internet lessons, and use hard copy study guides and books. The percentages of candidates using these methods were 85, 28, and 95, respectively (many applicants used more than one strategy).

We were surprised that the use of hard copy study materials had a statistically significant effect because almost all the candidates used them. It is evident that those who did not use them were not well served. The effect of using Internet lessons was not as strong as the effects of using the other two methods.

About 21 percent of the Black and Hispanic applicants (and 15 percent of all of the other applicants) reported that they worked 20 or more hours per week during the five weeks prior to taking the July 2004 exam (excluding any paid leave time they may have received from their employer to study for the exam). On the average, the applicants who worked earned about 15 total scale score points less than their classmates with comparable LSAT scores and LGPAs. To put this statistically significant 15-point difference in total scale scores in perspective, it is comparable to the unique effect (i.e., after controlling on LSAT and LGPA) of being a repeater rather than a first time taker.

5) As a group, do the students at some law schools generally score higher or lower on the bar exam than what would be expected on the basis of their mean LSAT scores?

Generally No. As noted on the Texas Board of Law Examiners’ Website (www.ble.state.tx.us), there are large differences in bar exam passing rates among schools. We found that almost all of these differences can be explained by differences in the admissions scores of the students they graduate. For example, there is a nearly perfect relationship between a law school’s mean total bar exam scale score and its mean LSAT score (the correlation is .98 out of a possible 1.00). Many of a law school’s graduates do better or worse on the bar exam than what would be expected on the basis of their own LSAT scores, but these differences almost entirely balance out when the data are analyzed by school. Nevertheless, one school’s mean total bar exam scale score was 10 points higher than what would be expected on the basis of its mean LSAT score (the odds of a difference of this size occurring by chance are less than 5 in 100).

Conclusions and Recommendations

Men score slightly higher than women on the MBE, while the reverse is true on the rest of the exam, so that overall men and women have nearly identical total scores and passing rates on the Texas bar examination. Men and women candidates in Texas also have comparable admissions credentials.

Black and Hispanic candidates are not spread evenly across the nine Texas law schools. They are much more likely to attend some schools than others. There also are large differences in passing rates among schools. However, the large differences in passing rates among racial/ethnic groups are not explained by which law schools they attend, because almost every school does about as well on the bar exam as would be expected on the basis of the mean LSAT scores of its graduates. It is the differences in entering credentials, rather than the schools themselves, that drive the differences in bar scores among groups.

The differences in scores among racial/ethnic groups were quite similar across the different sections of the exam. With the possible exception of Asians who did especially well on the MPT, no section stood out as being unusually easy or difficult for a particular racial/ethnic group. In addition, total bar exam scores essentially mirror the differences in these groups’ admissions credentials and law school grades. Thus, the bar exam does not appear to widen or narrow the gap in scores that was present between the groups before they sat for the exam.

We also found that a significant portion of the differences in bar exam scores between applicants is not attributable to differences in their admissions credentials, law school grades, gender, or racial/ethnic group. A small but statistically significant piece of this remaining variance is related to whether the candidate worked 20 or more hours per week during the five weeks leading up to the exam. And, Black and Hispanic applicants were about 1.5 times more likely to be among those who worked during this period than were other applicants. A few other preparation factors also were related to scores, such as participation in lecture and discussion sessions presented by a commercial bar review course.

Given the findings above, we see no reason to make any changes in the nature of the exam itself. It appears to be well balanced and fair to all takers. Moreover, the results on it correspond closely to the law schools’ own evaluations of their graduates’ abilities (as reflected by the generally high correlations between law school grades and bar exam scores at each school). Nevertheless, the findings about preparation factors suggest that something might be done in this area to improve minority bar passage rates. This might involve providing funding (and perhaps scholarships to bar review courses) to students who did well in law school but may not have all the financial resources they need to prepare for the exam in the same way as their classmates.

July 2004 Texas Bar Examination Examinee Survey

Please put a checkmark (√) in the Yes or No box in response to each question below:

Question Yes No
1. Were you employed for an average of 20 hours/week or more during the five weeks preceding the July 2004 Bar Exam? (Do not count any paid leave time to study for the exam).    
2. Was English the primary language spoken in your home when you were growing up?    
3. Are you the first person in your family to receive a college degree?    
4. Are you the first person in your family to receive a graduate or professional degree?    
5. Did you take any of the following components of a commercial bar review course in the six months prior to the July 2004 bar exam?    
  a. Lecture and discussion sessions    
  b. Audio and/or video tapes or CDs    
  c. Lessons on the Internet    
  d. Hard copy study guides and books    
6. Did you participate in any law school programs designed to improve your test-taking or study skills during the following:    
  a. Pre-enrollment Summer session    
  b. 1st Year of law school    
  c. 2nd Year of law school    
  d. 3rd Year of law school    



Dr. Stephen Klein is the Senior Partner in the consulting firm of GANSK and Associates. In that capacity, he has done research and consulted on a wide range of issues for the National Conference of Bar Examiners, more than two dozen state boards of bar examiners, over a dozen law schools, and the Association of American Law Schools. He also has consulted for certification boards in accounting, acupuncture, actuarial science, dentistry, medicine, podiatry, psychology, and teaching. He has testified as an expert in state and federal courts and at legislative and administrative hearings regarding criminal justice, testing, educational, personnel, voting rights, and other matters. He served as the federal court's appointed technical advisor on a large class-action suit involving measurement issues and consulted for the National Academy of Sciences, the National Science Foundation, the Knight Commission on Intercollegiate Athletics, the Little Hoover Commission, and many other public and private agencies and organizations. Dr. Klein also is a Senior Research Scientist at the RAND Corporation in Santa Monica, California where he has led studies on educational, health, military manpower, and criminal justice issues.

Dr. Klein received his BS from Tufts University and his M.S. and Ph.D. in Industrial Psychology from Purdue University. Before coming to RAND in 1975, he was a Research Psychologist with the Educational Testing Service in Princeton and Associate Professor in Residence at UCLA where he chaired the Research Methods division in the Graduate School of Education. Dr. Klein has over 250 publications, he is on the editorial board of the Review of Educational Research, and he is a member of the American Statistical Association, the American Psychological Association, the American Educational Research Association, and the National Council on Measurement in Education.

Dr. Roger Bolus serves as Senior Partner of Research Solutions Group, a company providing technical and analytical services to research endeavors in the areas of education, healthcare, and large-scale testing. Roger also has an appointment as Director of Psychometrics at the Center For Neurovisceral Sciences in the Department of Medicine at UCLA. For the past 25 years, he has collaborated with Dr. Stephen Klein in providing statistical, psychometric, and data management consultation to state bar examination boards throughout the country. Current clients include the state bars of California, Illinois, Maryland, Ohio, Massachusetts, Texas, Delaware, and Nevada. In this work, Roger has developed expertise in the design, management, and analysis of large, complex databases related to legal testing. Dr. Bolus’ current interests are in adapting the Internet to the administration and scoring of open-ended responses in high-stakes testing. Dr. Bolus received his M.A. and Ph.D. from the University of California, Los Angeles School of Education with a specialization in Educational Testing, Measurement and Evaluation (1981). He is the author or co-author of over 30 articles and technical reports, and has spoken at several national conferences on the topics of testing and outcome assessment.

Texas Bar Exam Statistics since February 1997

Click on a link below.

 
July:     2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2004 2003 2002 2001 2000 1999 1998 1997
February: 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2004 2003 2002 2001 2000 1999 1998 1997

 


Selected Answers

General Recommendations

  • Carefully read the question and the “call of the question” (what the question asks you to do).

  • Pay attention to the facts presented without assuming additional facts.

  • Include more than a mere conclusion when asked to explain the answer fully.

  • Respond to the “call of the question” (what the question asks you to do) and stay on track.

  • Practice writing in complete sentences and composing paragraphs.

  • Organize your responses, and answer subparts, if any, in the order asked.

  • Strive for clarity and good communication in writing.

  • Avoid lengthy or unnecessary discussion of general or extraneous matters.

Also, you are encouraged to read and become familiar with the Texas statutes, code provisions, and rules pertaining to the Texas essay and procedure and evidence exam subjects. This is recommended regardless of whether you have access to commercially produced outlines or review materials.


Starting with the February 2009 exam, the Texas Board of Law Examiners began publishing selected examinee answers for essay questions 1 through 12 (in lieu of commenting on common problems or errors for these items).  As of July 2011, selected examinee answers were also published for the MPT.  These are made available only for the limited, personal use of Texas Bar Exam applicants. The publication of past exam questions and selected answers (or comments for the MPT and Civil and Criminal Procedure and Evidence exam segments) is not intended to indicate any specific legal issue or issues that will be tested on a future exam. Do not use them as a substitute for learning the subjects covered on the exam.

Overall, these selected essay and MPT answers help to demonstrate the general length and quality of responses that earned above average scores on the indicated administration of the essay portion of the bar examination. However, these are unrevised answers written by actual examinees under time constraints without access to law books. As such, these essays do not always correctly identify or respond to all issues raised by the question, and they may contain some extraneous or incorrect information. They do not, in all respects, accurately reflect Texas law or its application to the facts. These essays are not intended as “model answers” and should never be taken by anyone as legal advice.

The Texas Board of Law Examiners does not write the questions for the MPT and MBE. These are products of the National Conference of Bar Examiners (NCBE). Comments from NCBE’s former Director of Testing, Dr. Susan Case, about MBE preparation can be found here: MBE Studying Advice. Also, the NCBE offers its own helpful information at its website, www.ncbex.org, for:

  • MBE scoring,

  • On-line practice exams and downloadable study materials,

  • Past MPT and MBE exam questions and answers, and

  • MPT grading guidelines.

Note: Although the NCBE has in the past made some study aids available without charge, it has also charged fees for its most recent past exams and its on-line practice exams.

The following Texas Bar Exam items are available for viewing only with Adobe Acrobat Reader. By clicking on the link for viewing or downloading any or all past Texas Bar Exam questions or selected essay  or MPT answers (or comments on the Criminal or Civil Procedure and Evidence exams), you indicate that you have read the above information and you understand and agree that these are for Texas Bar Exam applicants’ personal use only and may not be redistributed or republished in any form, whether electronic, written or printed.

Please note that selected answers are not available for all questions below. If you open an answer and the PDF states "Not Available", there are no selected answers for that question.

To continue, select a link below:

 

 

* * * * Questions * * * *   |   * * * * Answers * * * *

Feb 2016
  Questions: 1-6, 7-12, Civil&Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

July 2015
  Questions: 1-6, 7-12, Civil, Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

Feb 2015
  Questions: 1-6, 7-12, Civil, Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

July 2014
  Questions: 1-6, 7-12, Civil, Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

Feb 2014
  Questions: 1-6, 7-12, Civil, Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

July 2013
  Questions: 1-6, 7-12, Civil, Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

Feb 2013

July 2012
  Questions: 1-6, 7-12, Civil, Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

Feb 2012
  Questions: 1-6, 7-12, Civil, Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

July 2011
Feb 2011
July 2010
Feb 2010
July 2009

Feb 2009
  Questions: 1-6, 7-12, Civil, Crim
  Answers: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, MPT, Civil, Crim

 

Past Exams

Get Acrobat Reader

The following Texas Bar Examinations are available for viewing only with Adobe Acrobat Reader.

 

AM Essays PM Essays Procedure & Evidence
     
February 2016 February 2016 February 2016 Civil & Criminal
     
July 2015 July 2015 July 2015 Civil
July 2015 Criminal
     
February 2015 February 2015 February 2015 Civil
February 2015 Criminal
     
July 2014 July 2014 July 2014 Civil
July 2014 Criminal
     
February 2014 February 2014 February 2014 Civil
February 2014 Criminal
     
July 2013 July 2013 July 2013 Civil
July 2013 Criminal
February 2013 February 2013 February 2013  Civil
February 2013  Criminal
     
July 2012 July 2012 July 2012  Civil
July 2012  Criminal
February 2012 February 2012 February 2012  Civil
February 2012  Criminal
     
July 2011 July 2011 July 2011  Civil
July 2011  Criminal
February 2011 February 2011 February 2011  Civil
February 2011  Criminal
     
July 2010 July 2010 July 2010  Civil
July 2010  Criminal
February 2010 February 2010 February 2010  Civil
February 2010  Criminal
     
July 2009 July 2009 July 2009  Civil
July 2009  Criminal
February 2009 February 2009 February 2009  Civil
February 2009  Criminal
     
July 2008 July 2008 July 2008  Civil
July 2008  Criminal
February 2008 February 2008 February 2008 Civil
February 2008 Criminal
     
July 2007 July 2007 July 2007 Civil
July 2007 Criminal
February 2007 February 2007 February 2007 Civil
February 2007 Criminal
     
July 2006 July 2006 July 2006 Civil
July 2006 Criminal
February 2006 February 2006 February 2006 Civil
February 2006 Criminal
     
July 2005 July 2005 July 2005 Civil
July 2005 Criminal
February 2005 February 2005 February 2005 Civil
February 2005 Criminal
     
July 2004 July 2004 July 2004 Civil
July 2004 Criminal
February 2004 February 2004 February 2004
     
July 2003 July 2003 July 2003
February 2003 February 2003 February 2003
     
July 2002 July 2002 July 2002
February 2002 February 2002 February 2002
     
July 2001 July 2001 July 2001
February 2001 February 2001 February 2001
     
July 2000 July 2000 July 2000
February 2000 February 2000 February 2000