Squashed 'third_party/protobuf/' changes from e35e248..48cb18e

48cb18e Updated change log for 3.6.1 release
9e1286b Updated version numbers to 3.6.1
e508fc0 Check whether the message to be encoded is the wrong type. (#4885) (#4949)
fc90fd6 Make assertEquals pass for message (#4947)
d85ffdc Merge pull request #4924 from xfxyjwf/3.6.x
8356d27 Add continuous test for ruby 2.3, 2.4 and 2.5 (#4829)
7fdebf2 Fix cpp_distcheck
6c82411 Fix php conformance test.
11e2eca protoc-artifacts: Update centos base from 6.6 to 6.9
6fc2bac Updated Docker setup to use GCC 4.8
b3e4e3a Remove js_embed binary. (#4709)
0b3b470 Fix cpp benchmark dependency on mac
ae8def6 Comment out unused command from release script.
9327dc7 Run autogen.sh in release script.
9209a41 Add protoc release script for Linux build.
4d0fbd1 Add cpp tests under release docker image.
474fd31 Update protoc build scripts.
b23429e Fix 32bit php tests
8758cc1 Fix php tests
bd1224e fix golang kokoro linux build
359889b fix python cpp kokoro build
c1e6c4d fix linux kokoro build in docker
22503a0 fix for API change in PHP 7.3 (#4898)
e529d16 Make ruby release configs consistent with protoc.
82019f9 Merge pull request #4900 from xfxyjwf/3.6.x
0e38f9e Fix bazel build of examples.
b5975c1 Add kokoro bazel configs for 3.6.x branch.
4a4a60b Merge pull request #4880 from nashimus/3.6.x
e184577 Merge pull request #4878 from acozzette/fix-msvc-initialization
1f7837a Additional support for building and deploying ppcle_64 artifacts
a9abc78 Fix initialization with Visual Studio
f7ada12 Build ruby gem on kokoro (#4819)
56d2753 Rename build_artifacts.cfg to release.cfg (#4818)
b907a03 Add files to build ruby artifact for mac on kokoro (#4814)
ad85c3b Added Kokoro protoc release build for OS X (#4770)
885be9c Work around MSVC issue with std::atomic initialization (#4777)
b0a8220 Added Kokoro Windows release build config for protoc (#4766)
ce04481 Use legacy name in php runtime (#4741)
d6353b4 Update php version to 3.6.0 (#4736)
ab8edf1 Merge pull request #4713 from acozzette/changelog
6ed0141 Merge pull request #4730 from acozzette/xcode
a3f31bf Update code to work for Xcode 10b1 (#4729)
9b77e9e Removed javanano from post_process_dist.sh
adf84b4 Updated the change log with changes for 3.6.0
7e199b9 Merge pull request #4706 from acozzette/cxx-11
39d2f70 Require C++11 and pass -std=c++11
a239ed2 Merge pull request #4702 from TeBoring/3.6.x
4f5eb10 Move methods out of class (#4697)
397ddf9 Add back GeneratedClassName to public (#4686)
d31864a Merge pull request #4696 from acozzette/csharp-fix
6c12ff5 Merge pull request #4695 from TeBoring/3.6.x
9d64d74 Removed duplicate using statement from ReflectionUtil.cs
5bf35ec Fix php memory leak test (#4692)
edaaea0 Merge pull request #4687 from acozzette/js-map-parsing-fix
d1af029 Fixed JS parsing of unspecified map keys
6f723a6 Always add -std=c++11 for mac (#4684)
4885b80 Merge pull request #4675 from TeBoring/3.6.x
dadc954 Fix array constructor in c extension for compatibility (#4667)
2774e54 PHP namespaces for nested messages and enums (#4536)
8b336f8 Implement array constructor in php c extension.
c9b404d PHP array constructors for protobuf messages (#4530)
7eba624 Add missing ruby/tests/test_ruby_package.proto
c19fcee Allows the json marshaller to be passed json marshal options (#4252)
5289ee0 Adopt ruby_package in ruby generated code. (#4627)
d8483a9 Adopt php_metadata_namespace in php code generator (#4622)
449e532 Merge pull request #4673 from acozzette/memory-leak-fix
daf039b Make sure to delete temporary maps used by FileDescriptorTables
15cde29 Merge pull request #4625 from liujisi/3.6.x
45eb28b Update version number to 3.6.0
b61dd9d Add file option php_metadata_namespace and ruby_package (#4609)
2213c1c Merge pull request #4538 from Mizux/patch-2
7377d81 Throw error if user wants to access message properties (#4603)
5f7334f Avoid direct check of class name (#4601)
5f9232b use brew install instead of easy_install in OSX (#4537)
25625b9 Merge pull request #4590 from PetterS/undefined_fix
d14cacd Fix error in Clang UndefinedBehaviorSanitizer
513b35d Add VS2017 optional component dependency details to the C# readme (#4128)
4a09836 Fix python ext build on kokoro (#4527)
92898e9 Merge pull request #4586 from chronoxor/master
3474155 Fix MinGW build failure
b0403a7 Merge pull request #4583 from chronoxor/master
f80a886 Fix Cygwin build failure
05c2d01 Fix RepeatedField#delete_if (#4292)
0869b1a Add space between class name and concat message (#4577)
306f4e3 Merge pull request #4581 from Yeolar/3rd_rpc_raster
7d97808 [objectivec] Fix memory leak of exceptions raised by RaiseException() (#4556)
35e4430 Add third-party RPC implementation: raster, a network framework that supports pbrpc via the 'service' keyword.
fc922d1 Merge pull request #4568 from hectim/master
7f2c3ce Merge pull request #4550 from Mizux/master
1b219a1 Fix to allow AOT compilers to play nicely with reflection
9ba2fd3 typo
a21f225 Merge pull request #4553 from pherl/ruby
9972e16 Set ext.no_native = true for non mac platform
2bd7f51 fix duplicate mkdir in update_file_lists.sh
c3b152c CMake: Update CXX Standard management
3ad8efc Add .proto files to extract_includes.bat
8417871 Move to Xcode 9.3 which also means a High Sierra image.
b59da6d Remove the iOS Test App.
c36eeed Merge pull request #4520 from BSBandme/fix_kokoro_benchmark_build
4ca46ed Write messages to backing field in generated C# cloning code (#4440)
0dc4d75 Merge pull request #4504 from xfxyjwf/lite
a29e2bf Update Makefile.am for Java lite files.
8f35073 Fix benchmark build
9497a65 Merge pull request #4517 from rcane/feature/vc2017_build_fix
a9d9326 Merge pull request #4510 from BSBandme/fix_kokoro_benchmark_build
7d6d5f9 Fixed a Visual Studio 2017 build error. (#4488)
0921345 fix java benchmark, fix dashboard build
7d55040 Cleanup + documentation for Java Lite runtime.
320d56c Merge pull request #4478 from BSBandme/proto2_to_proto3_tools
e3af023 fix python
76d76ae fix conflicts
c703061 Add gogo benchmark
5fb73f8 Merge pull request #4415 from BSBandme/experiement
805174e Add script for run and upload the benchmark result to bq
d7d863e fix json_decode call parameters (#4381)
13e627a Include the expected class in the exception; otherwise the error is harder to track down (#3371)
451e044 Add __init__.py files to compiler and util subpackages (#4117)
3b6d027 For windows, all python versions should use /MT (#4468)
dce8229 Add the files added in #4485.
ce24b8a Update Xcode settings
da3ce67 Deliberately call simple code to avoid Unity linker pruning
a72da27 Merge pull request #4475 from chenchuanyin/patch-1
e7e6c6b Merge pull request #4283 from ObsidianMinor/csharp/better-test-runners
f72ac9f Updated csharp/README.md to reflect testing changes
d3e8a54 Fix problem: cmake build failed in c++11 by clang
c931743 Merge branch (#4466)
579f81e js message support for jstype string on integers (#4332)
40d6eca Merge pull request #4467 from xfxyjwf/error
8f88a50 Improve error message when googletest is missing.
7f92711 Merge pull request #4411 from pravin-dsilva/protobuf-ppc64le
4274e6a Merge pull request #4447 from ejona86/cleaner-protoc-artifacts
c934d1d Merge pull request #4452 from xfxyjwf/doc
aab0f89 Merge pull request #4464 from thomasvl/includes3
bd941d5 Don't generate imports for the WKTs unless generating the WKTs.
e998b8f Add compile test sources to test include order.
bca797d Trim imports for bundled generated protos.
014e76e Update instructions about getting protobuf source.
cdbfdd8 protoc-artifacts: Use ENTRYPOINT to enable devtoolset-1.1
67c1cd4 protoc-artifacts: Avoid storing temporary files and use fewer layers
5f052ae protoc-artifacts: Avoid checking out protobuf code
7bf47a6 Merge pull request #4433 from xfxyjwf/license
c8ed695 Merge pull request #4445 from xfxyjwf/badge
eac0e92 Add kokoro build status badges.
59c4328 Merge pull request #4442 from xfxyjwf/clean
107cd70 Delete unused directories.
d3d7cdb Merge pull request #4439 from acozzette/remove-atomicops-stub
612b670 Updated .gitignore to exclude downloaded gmock/ directory
4875dfe Removed atomicops.h since it is no longer used
2537bea Merge pull request #3794 from jskeet/reflection
9c05c35 Address review comments
7b19d20 Add extra C# file to Makefile.am
8e23d4e Work around an "old runtime" issue with reflection
aa59eaa Introduce a compatibility shim to support .NET 3.5 delegate creation
8ba420f Change C# reflection to avoid using expression trees
63bba9b Merge pull request #4432 from xfxyjwf/rmnanokokoro
5050269 Merge pull request #4434 from xfxyjwf/buildstatus
9f5bded Remove broken build status icons.
9e080f7 Cleanup LICENSE file.
68de0cf Delete javanano kokoro build configs.
3c5442a Include googletest as a submodule (#3993)
1156ee7 source code info for interpreted options; fix source code info for extension range options (#4342)
d34e319 Merge pull request #4431 from xfxyjwf/rmnano
379b7ff Fixes MSVC compiler warning C4800 "Forcing value to bool 'true' or 'false'" (#4350)
ac67376 Merge pull request #4395 from stone4774/fixbug_enum2json2
d5a0024 Remove javanano.
416d418 Merge pull request #4424 from egorpugin/patch-1
dc68d98 Fix missing LIBPROTOC_EXPORT.
2c963d3 Merge pull request #4413 from pmuetschard/msvc
eff52b1 Merge pull request #4422 from acozzette/ruby-conformance
cf7af01 Merge pull request #4421 from acozzette/fix-bazel-build
f91cf05 Updated Ruby conformance test failure list
5bed368 Added missing .inc files to BUILD
a7a746f Merge pull request #4346 from BSBandme/performance_result
745ef89 Add performance.md and add instruction for linking tcmalloc
227363b Merge pull request #4412 from acozzette/remove-old-files
a6957f2 Don't assume Windows builds use MSVC.
0c5fcde Removed some unused C++ source files
36ba04b Add support for power ppc64le
c99f527 Merge branch 'master' into fixbug_enum2json2
d053271 Use the first enum value instead of 0 in DefaultValueObjectWriter::FindEnumDefault
ed4321d Merge pull request #4387 from acozzette/down-integrate
e436ee0 Merge pull request #4361 from BSBandme/go_benchmark
88a4884 Merge pull request #4345 from jskeet/list-json-null
11d26ce Removed unused variables in repeated_scalar_container.cc
fcde518 Try using a new version of Visual Studio on AppVeyor
8b3a72f Removed unused code pertaining to shared_ptr
ec40953 Updated conformance failure lists
ec57f51 Added map_lite_test.proto to fix LiteTest
c5fcce5 Added pyext/thread_unsafe_shared_ptr.h
3fa5dad Removed unrecognized option from no_package.proto
00b1c14 Added new test source files to Makefile.am
616fe05 Removed use of some type traits
d6323c8 Change to deal with all messages in one loop
d6a17aa Merge pull request #4397 from pherl/catlog
4880027 Cat the test-suite.log on errors for presubmits
773d8c3 Fix bug: whether always_print_enums_as_ints is true or false, it always prints the default value of enums as strings
ab95b1b Merge pull request #4371 from Rasrack/gnuc_minor
aae10ed Merge pull request #4370 from felixjendrusch/objc-output-stream-write-check
813848d Merge pull request #4222 from JohnHBrock/increasing_max_csharp_message_size
d5f5725 Merge pull request #4310 from KindDragon/patch-1
837c94b Include no_package.proto in Python test
67952fa Deleted scoped_ptr.h
b1216d9 Updated checked-in generated code
64a5c80 Fixed up proto3_lite_unittest.cc
501c13f Rewrite go_benchmark
afe96b6 Merge branch 'master' into down-integrate
0400cca Integrated internal changes from Google
89b5333 Merge pull request #4378 from acozzette/node-buffer
4f11025 Merge pull request #4167 from mike9005/patch-1
6d2c6a0 some fix
5487d8c Merge pull request #4380 from mateuszmatejczyk/patch-1
4787519 Merge pull request #1333 from cgull/pkg-config-issue
294b575 Output *_pb2_grpc.py when use_grpc_plugin=True
0e34c86 Fix spelling error of __GNUC_MINOR__
f8005a5 Revert "Removed mention of Buffer in byteSourceToUint8Array"
8e44a86 Merge pull request #4347 from xfxyjwf/pluginpb
6ebfa14 Merge pull request #4375 from jo2y/protoc-default
950f5e4 Replace //:protoc and similar default macro arguments with @com_google_protobuf prefixed versions. This allows them to work in 3rd party repositories.
6dd563a Sync upb change (#4373)
1da9ffe Check return value on write of raw pointer
38508e9 Add test for failing write of raw pointer to output stream
4e3c413 Add go benchmark
a48d58d Convert descriptortype to type for upb_msgval_sizeof (#4357)
0f4ad85 For encoding upb needs descriptor type instead of type. (#4354)
55d0758 Merge pull request #4355 from acozzette/typo
5004d09 PHP: fixed typo in message.c
fd595fc Revert "Move `compiler/plugin.pb.cc` to libprotobuf with the other WKT sources."
5140bae Add conformance test for null value in list JSON
822b924 Allow list values to be null when parsing
9dc0a4d Merge pull request #4183 from pcc/win-libcxx
325ecff Merge pull request #4333 from jmillikin/update-file-lists-needs-bash
32db4d3 Merge pull request #4334 from jmillikin/blacklist-internal-proto-srcs
3aaed96 Merge pull request #4195 from alexey-malov/IgnoreUnknownEnumsInJson
bb40c0c Merge pull request #4291 from google/3.5.x
350b135 Blacklist all WELL_KNOWN_PROTOS from Bazel C++ code generation.
724f0be Move `compiler/plugin.pb.cc` to libprotobuf with the other WKT sources.
2e4c52a `update_file_lists.sh` depends on Bash features, thus needs a Bash shebang.
a6037c5 Merge pull request #4324 from abdul-sami/master
ac021a0 Merge pull request #1 from abdul-sami/abdul-sami-patch-1
fe33c5f Added instruction for existing ZLIB configuration
177108a Merge pull request #4323 from dtapuska/master
af3813c Rename a shadowed variable.
0475607 Merge pull request #3186 from gkelly/remove-unused-variable
22c477d Support using MSVC intrinsics in Log2FloorNonZero
02452db Make the JsonParseOptions::ignore_unknown_fields option treat unrecognized string values in enum fields as default values.
e34ec60 Only check filenames when end with .py in _CalledFromGeneratedFile() (#4262)
96b535c Merge pull request #4302 from BSBandme/down_integ_benchmark
6cd4ec4 Sync internal benchmark changes
4da7706 Remove stray indent on normal imports.
b8e4783 Merge pull request #4288 from nico/nofall
07f0231 Fix up the docs to mention the WKTs generated files also.
c66dd6c Remove use of GOOGLE_FALLTHROUGH_INTENDED from protobuf.
8b2ba9e Remove Google.Protobuf.Test/Program.cs from Makefile.am
a731ecb Adjusted appveyor batch
e823897 Updated NUnit packages, removed NUnitLite, added packages for dotnet and Visual Studio, changed dotnet command in buildall to dotnet test, and deleted Program.cs (because it's no longer required).
82e0231 Merge pull request #4259 from Mizux/master
5dad7cc Merge pull request #4257 from davido/support_java9
9717d6f Merge pull request #4265 from BSBandme/upgrade_benchmark_submodule
1ec9beb Use NEW behaviour for project VERSION variables.
8dd0f4e Even with MSVC enable zlib support as default behaviour.
f7a0584 Add CMake ALIAS targets
3bc0282 Add VERSION property to CMake library targets
7f14915 upgrade submodule
4e2bd9e Merge pull request #4266 from brunokim/patch-1
0d397a1 Fix link markup in third party list.
6456e5d Merge pull request #4239 from mrpi/master
85b488f Bazel: Support building with Java 9
864df89 Remove 64MB memory limit when deserializing messages in C#
cf016a4 Work around strange error with atomic and swift under Xcode 8.3.3.
d570d48 Don't assume c-strings are 4 byte aligned.
d83837d Fix to use "nil" instead of "NULL" for objc objects.
81aeed0 Work around the static analyzer false report.
953adb1 Add casts to removed undefined behaviors around shifts.
b718551 Merge pull request #4249 from nlochschmidt/patch-1
204a641 Move kokoro macOS builds to Xcode 9.1.
14e8852 Fix -fpermissive: '<::' cannot begin a template-argument list
69b1fdc Propose kotlinx.serialization as 3rd party lib
25a90e8 Merge pull request #3825 from ras0219-msft/patch-1
aaf41c6 Add Vcpkg to C++ installation instructions for Windows
1681fe6 Merge pull request #4196 from mathstuf/cmake-private-target-sources
f438ebd Merge pull request #4240 from davido/generate_warning_free_java_code
da6b07a Merge pull request #4209 from acozzette/using-statements
86fdad0 Java: Generate warning free code
39a789e Removed using statements from common.h
91317c2 Merge pull request #4236 from pherl/3.5.x
2a8e102 Bumping number to fix ruby 2.1 on mac
c236896 Merge pull request #4229 from leighmcculloch/patch-1
4a3b42e Remove broken link to code.google.com/p/protorpc
ed14fe4 Merge pull request #3934 from xfxyjwf/builtsources
0c52335 Update .NET SDK to 2.0.3
51293f3 Fix more memory leak for php c extension (#4211)
94f3be0 Merge pull request #4226 from themattchan/patch-1
3e1587f Add an explicit import of stdatomic.h.
6fd2ae7 Bring back import of OSAtomic.
72337d6 Add Haskell implementations
1d02e45 Merge pull request #4224 from davido/drop_java_6_support
019ceea Drop java 6 support
c337d9f Remove the use of BUILT_SOURCES
9ab859f Create std::string in Arena memory
80e016e Merge pull request #4205 from xuwei-k/patch-2
a721bf6 Migrate away from deprecated OSAtomic APIs. (#4184)
1c3b20b fix typo in FieldMaskTree.java comment
9d0a44c cmake: privately add sources to targets
cbdeb6a Merge pull request #4185 from pherl/ruby2.5
53d907f Update rake file to build of 2.1.6.
3ba21cd Add support for libc++ on Windows.
979c2cd Merge pull request #4182 from pherl/ruby2.5
d9f0f0b Support ruby2.5
47b7d2c Add DiscardUnknownFields support for C#
2a6eaeb Fix scope resolution for MessageExts in Ruby
9f80df0 Merge pull request #4158 from BSBandme/FixBenchmarks
473a810 Update py_benchmark.py
fa60e55 Fix java benchmark to use parser, fix cpp benchmark new arena to use Reset, format some files
b77aa80 Merge pull request #4148 from datacompboy/patch-2
d4afdba Merge pull request #4147 from datacompboy/patch-1
bab843b Merge pull request #4132 from BSBandme/JavaCaliperCounter
195253c Add counter to Java benchmark
4adc5a4 Merge pull request #4065 from BSBandme/python_benchmark_real
091eeb1 Update time_test.cc
473c5cf Fix ValidateDateTime: check day instead month
3823897 Well known types are not initialized properly. (#4139)
2fc69b1 Add python benchmark
5aa8f98 Merge pull request #4146 from pherl/fix_protoc
366192f Bump protoc-artifact version for a patch rebuild
a3868af Merge pull request #4131 from pherl/merge
eca1d2a Merge pull request #4116 from amandeepgautam/master
0c0d481 whitelisting aix platform as it has sched_yield
4588e6e Force a copy when saving the NSData that came from another.
43caa38 Merge pull request #4014 from BSBandme/JavaCaliper
ec826c5 Merge remote-tracking branch 'origin/3.5.x' into master
383a494 Merge remote-tracking branch 'origin/3.5.x' into master
39f577c Merge pull request #4124 from pherl/nullptr
5b1caea Merge pull request #4090 from pherl/nopassword
156161d Properly copy maps with string keys but pod values.
aca6c15 Fix some bug
4f3d865 remove nullptr
a147a21 Changed README
8529f2a Resolved issue #3510. Malformed error messages replaced with meaningful description
8fc40b5 Fix uploading binary wheel.
88e5573 Merge pull request #4089 from pherl/nocache
7ad8e7a Disable pip cache when testing uploaded packages
099d997 Merge pull request #4083 from matt-kwong/kokoro_jobs
0b2be3c Shard 64-bit Linux languages into different Kokoro jobs
4b2977b Merge pull request #4082 from matt-kwong/kokoro_jobs
ae49cfd Collect xml results for Kokoro
106ffc0 Merge pull request #4073 from pherl/changelog
d69f333 Merge pull request #4080 from pherl/arm64
6003a61 Make Kokoro job pull Dockerimage from Dockerhub
ad8a82e Add support for Windows ARM64 build
b5f09c1 Merge pull request #4077 from mkamilov/master
a5b743f Merge pull request #4030 from cyyber/master
7bf1e19 Update changelog
1e418e4 Merge pull request #4068 from wsw2016/fix_4032
ac1fdd1 line breaks adjusted
e68caa3 formatting issues
d106399 Merge pull request #4072 from google/jieluo
6c3c7f6 Merge pull request #4076 from pherl/stringback
b308580 Cherrypick for csharp, including: preserve UnknownFields; compare floating point values bitwise; add auto-generated header to C# generated files
19d0e99 Fix string::back() usage in googletest.cc
62616e1 Update changelog
96810de Merge branch '3.5.x' of github.com:google/protobuf into changelog
eff55ec Merge pull request #4074 from pherl/mapat
8ac050f Migrate Jenkins jobs to Kokoro
9ce29bd Merge pull request #4070 from pherl/3.5.x
501093d Replace C++11 only method std::map::at
7bd1606 Update changelog for 3.5.1
050fc9a Update version number to 3.5.1
ffa18ad Resolve issue 4032 and add a unit test
03fb099 Add support for Windows ARM64 build
860d693 Add Xcode 9.2 to the testing support
3a06fe1 Fix file permission for python package.
77d32bc Merge pull request #4053 from xfxyjwf/fixumask
c79ba5c Merge pull request #4034 from TeBoring/php-timestamp-bug
269884a Merge pull request #4040 from bazurbat/3.5.x
0fc85ac Fix file permission for python package.
4b02091 add cpp
396336e Merge pull request #4046 from acozzette/deprecated-valueof-issue-2054
8d6f13e Fix for php5.5
1237c3f Merge pull request #4045 from pherl/deprecate
d971cbf Merge pull request #4049 from pherl/py26
39159c8 Accept DatetimeInterface in fromDatetime
0e2089c Calling Keychecker before checking key in MessageMap
acadade Remove py2.6 support.
594ec22 Fix python descriptor test.
cd32aae Merge branch 'master' of https://github.com/google/protobuf into JavaCaliper
75523ec fix bugs
1a549d9 Avoid using php_date_get_date_ce() in case date extension is not available.
0350111 Generate an annotation to suppress deprecation warnings
426cf6f Add auto-generated header to C# generated files (#4038)
34843ed Fix bugs
a4f68d4 Merge pull request #4044 from xfxyjwf/issue3268
00ac5c1 Merge pull request #4041 from acozzette/fix-license-issue-1775
22e1cfd Add deprecation annotation for oneof case.
b3ac441 Update generated code.
bfd254e Add unknown field support for csharp (#3936)
50ef6a6 Avoid two consecutive underscores in macro name.
82724e2 Merge pull request #4042 from pherl/cpp_enum
8521624 Merge pull request #4043 from pherl/flush
0a7120a Merge pull request #4037 from xfxyjwf/issue2880
6b01e64 Explicitly propagate the status of Flush().
7ef21dd Use matching enum type for IsPOD.
ecab9c6 Added our standard license header to structurally_valid.cc and its test
edcf15e Create containing directory before generating well_known_types_embed.cc
5ce724b Merge pull request #4036 from xfxyjwf/issue3093
fffe8d3 Call php method via function name instead of calling directly.
0f9bfa8 Merge pull request #4016 from jquesnelle/string-access-ub
27e877f Merge pull request #2834 from aj-michael/master
6c27550 Clarify default value behavior in JSON conversion.
75eceb8 Update generated code.
8489612 Update comments for Timestamp JSON format.
f69a5db Merge pull request #4028 from TeBoring/3.5.x
88102ea Replace private timelib_update_ts with public date_timestamp_get
9f6acea Add PROTOBUF_ENABLE_TIMESTAMP to let user decide whether timestamp util can be used at install time.
5e732e3 Add caliper supported to java benchmark
b1386e7 Merge pull request #4026 from TeBoring/3.5.x
3b13c3f Add backslash to make class explicit in global namespace
fc5818b Merge branch '3.5.0.1' into 3.5.x
31c54d1 Regenerated code from previous C# codegen commit
3e5bd2f C# code generation changes to use bitwise comparisons for doubles
f3e9a65 Compare floating point values bitwise in C#
618f06f Merge pull request #4007 from graywolf/patch-1
a426833 Merge pull request #4000 from Kwizatz/master
cf7c15e Fix ruby gc_test in ruby 2.4 (#4011)
9b09a2a Merge pull request #4017 from acozzette/update-file-lists
0fcca8f Merge pull request #4018 from acozzette/android-portable-log2-floor
f5b0862 use const char* instead of const std::string& in normalize()
63a0afc Use the portable version of Log2Floor for Clang with older Android NDK versions
0e7b589 Add discard unknown API in ruby. (#3990)
609d752 Ran update_file_lists.sh to update Bazel and CMake file lists
b32c2a8 fix undefined behavior in C++03
24493ee Using binary one's complement to negate an unsigned int
c370f88 Recursively clear unknown fields in submessages. (#3982)
1b5b3b2 Merge pull request #4013 from laszlocsomor/io_win32
a3a1c93 io_win32_unittest: remove incorrect error check
eb3bd6e io_win32_unittest: fix condition in GetCwdAsUtf8
3f1b1a6 io_win32_unittest: use CWD as last tempdir
57a01c7 io_win32: add more encoding-related tests
65da9fd io_win32: support non-ASCII paths
953a025 io_win32_unittest: make //:win32_test run again
457f6a6 Add release log
ba60b85 Update php c extension version number to 3.5.0.1
212563d Fix memory leak in php7
3b7a5f4 Fix several more memory leak
7d34371 Fix memory leak when creating map field via array.
e0d3aa0 Fix memory leak when creating repeated field via array.
de44982 Remove duplicate typedef. (#3975)
0316ae8 --pre is not necessary
9021f62 Merge pull request #3988 from acozzette/down-integrate
173f304 Merge pull request #3926 from BSBandme/down_sync_benchmark
e372df5 Fixed failing JS tests
db7c043 Merge pull request #3968 from fahhem/patch-2
35119e3 Add a check_protobuf_required_bazel_version() for use in WORKSPACEs
1fd6c17 Fix bugs to pass tests
7bb8584 Updated conformance failure lists
b140cb3 Fix memory leak when creating map field via array.
716acc3 Remove Xcode directives on some configs.
1acacf4 Fix memory leak when creating repeated field via array.
1c062a6 Sync internal benchmark changes
5d647e1 Updated Makefile.am to add a new file to EXTRA_DIST
0ba8eea Merge branch 'master' into down-integrate
92a7e77 Integrated internal changes from Google
a711e3d Merge pull request #3979 from acozzette/3.5.x-merge
a27da09 Merge branch '3.5.x' into 3.5.x-merge
94bb1ee Remove duplicate typedef. (#3975)
74e7dec Provide discardUnknownFields API in php (#3976)
6de51ca Merge pull request #3824 from anuraaga/dev_rag
da89eb2 Merge pull request #3955 from linux-on-ibm-z/master
6d60995 Update csharp version number (#3958)
0289dd8 Merge pull request #2519 from rubynerd-forks/ruby-fix-repeated-message-type-field
74f64b6 Fix JsonTokenizer exception message
3e944ae Add a UTF-8 decoder that uses Unsafe to directly decode a byte buffer.
3c6fd3f Merge pull request #3960 from acozzette/libprotoc-export-fix
1b1d1ea Added back in LIBPROTOC_EXPORT which was removed by mistake
34e30e5 Merge pull request #3962 from jleni/fix_dotnet
582d6ac Upgrading dotnet to 1.0.4
f2127b6 Merge pull request #3416 from xiaoshuang-lu/PROTOBUF-3404
642e1ac Adding Release_CompareAndSwap 64-bit variant
8ff2284 [PROTOBUF-3404] add --with-zlib=PATH to configure.ac script
cf65a79 Update version for 3.5.0.post1
f466709 Merge pull request #3941 from google/anandolee-patch-2
45d99a1 Add _file_desc_by_toplevel_extension back
f08e4dd Merge pull request #3919 from jart/less-warnings
b819abf Merge pull request #3918 from OEP/fix-sdist
ac5371d Remove unhelpful build warnings
9935829 Include .cc and .h files in source distribution
baed06e Small code reorder to maybe make #3893 happy.
6700f41 Travis config cleanups and move ObjC to Xcode 9.1.
2b3aa1c Add Setter/Getter type verification. (#3880)
8537f1e Fix up warnings from Xcode 9.1 (#3887)
98836a5 Update version number for php c extension (#3896)
e99e5d0 Merge pull request #3895 from pherl/3.5.x
1ec45f8 Add protobuf-all in post release
857a021 Use fully qualified name for DescriptorPool in Any.php to avoid name (#3886)
0d46688 Update README.md: C extension works on PHP 7 (#3888)
696653d Merge pull request #3892 from sergiocampama/32bit
7daa320 Merge pull request #3878 from Yangqing/master
02129f0 Fixes 32bit tests.
cf68531 Merge pull request #3891 from thomasvl/travis_cleanups
7417755 Merge pull request #3883 from dmaclach/map_optimizations
8ae6844 codereview cleanup
6552c5a Merge pull request #3884 from dmaclach/unsafe
4ba3092 code review cleanup
af5ad24 Merge pull request #3882 from dmaclach/removeclass2
a839c67 Remove the allowed_failure for python_cpp as the bug was fixed.
2e17639 Remove the ruby tests from travis configs.
c46571b Update some comments about testing.
73e8c8a Instead of listing and then excluding osx builds, just don't list them.
949596e Simplify getter/setter method implementations
9d7f313 Reduce size of GPBDictionary by getting rid of class creation methods
37a6672 Remove unreferenced 'GPBMessageSignatureProtocol' class.
91ff83c Remove non-C# options from C#-only test protos
cba18ef Allow one to omit building libprotoc and protoc binaries
d3537c2 Merge pull request #3834 from sviterok/patch-1
0cd2ad1 Update protoc-artifacts
2720cdc Update README.md
4041600 Merge branch '3.5.x' of github.com:google/protobuf into 3.5.x
4493595 Update release date
2761122 Merge pull request #3868 from pherl/3.5.x
b9f891e Merge pull request #3875 from hchasestevens/add-hypothesis-protobuf-doc
188f180 All integer types should accept null in json. (#3869)
3c33143 Add hypothesis-protobuf library to the 3rd party doc.
8cf53f8 Minor fix-ups to C# tests from changes in earlier commits
b5cdf0e Regenerated test code for C#
aa77eab Move C#-only test protos to csharp/protos
9a9a66e Run C# codegen when testing it
9c197b7 Support win32 long path for cross compiled build
fe2b4af Merge pull request #3867 from jtattermusch/update_changelog
07df230 update changelog
966a9ed Merge pull request #3861 from jtattermusch/backport_3858
5f96191 ParseFrom<T> for array slice is missing
4a5e1bd check already performed by MergeFrom
435f611 allow message parsing from an array slice
ce0a532 Merge pull request #3858 from jtattermusch/parsing_from_slice
5eb717c Fix arm64 name
b879abc Support Arm64 (aarch64) protoc artifacts
30b6e54 ParseFrom<T> for array slice is missing
07542e7 check already performed by MergeFrom
0c874a6 allow message parsing from an array slice
0971efb Merge pull request #3854 from pherl/3.5.x
662e8b2 Provide util functions to figure out correct php class names. (#3850)
181e284 Fix Atomic32/AtomicWord on some platforms.
1144768 Merge pull request #3835 from pherl/3.5.x
8a3c5cc Fix java code example
ce2d528 Changelog for 3.5.0
c258fb3 Merge pull request #3822 from mehrdada/update-benchmark-submodule
2df4726 Fix php well known type conformance tests (#3828) (#3840)
bcda919 Fix php well known type conformance tests (#3828)
239dba5 Merge pull request #3839 from thomasvl/message_equality
1f57e54 When comparing message, require them to have the same descriptor.
1908012 Update generated descriptors.
97dd175 Update version number to 3.5.0
da3bfa6 Fix a typo in WKT's test suite
cbe2505 Fix merging with message-valued oneof
bb35f04 Update google/benchmark submodule to v1.2
6dd8224 Merge pull request #3817 from xuwei-k/joda-url
05b56d0 update joda-time javadoc url
e8c9ae1 Add parser settings WithXyz methods
a985451 Add JsonParser setting to ignore unknown field values
4526d8b Merge pull request #3722 from timou/cmake-windows-clean
23adfeb Reserve unknown in Ruby (#3763)
a08b03d Add missing files
9aaa8e1 Merge pull request #3804 from pherl/merge
2fc7aea Merge pull request #3791 from signalwerk/patch-1
cdc0d95 Merge remote-tracking branch 'origin/3.4.x' into master
ee8a091 Merge pull request #3787 from sergiocampama/coverage
b1f954e Improves coverage of GPBCodedInputStream
44daa59 Make clear that we set a new variable
a23669c Sort MSVC warning suppressions
cefa9d7 Merge pull request #3758 from spinorx/3.4.x
b189389 Merge pull request #3757 from spinorx/master
09e0dbc Merge pull request #3743 from Schtolc/master
2a14214 Merge pull request #3754 from toanju/gcc-fallthrough
07b9238 Merge pull request #3770 from pherl/3.5-integrate
2ee294d Fix Java 1.6 compile
9c407a1 Merge pull request #3751 from uykusuz/master
ca6187d Merge pull request #3578 from pherl/filedeprecation
dedf904 Merge pull request #3764 from zearen/patch-1
37f984f Merge pull request #3698 from hesmar/hesmar/fixProtocIncludeDirs
3d6cc0e Merge pull request #3641 from drivehappy/3.4.x_clang_cleanup_3
2e2614e Merge pull request #3706 from johanbrandhorst/patch-1
188755c Fix JS conformance tests
1c682e0 Fix bazel build
ecf2957 Update descriptor protos
30bfe36 Merge pull request #3736 from jleni/fix_rbpi
6b5912b Merge branch 'master' of github.com:google/protobuf
1a7a7fc Merge from google internal
dc9190f Merge pull request #3769 from pherl/io_win32
cc58be6 Fix unsigned underflow
7dbee32 Remove C++11 only usages in io_win32 tests.
c4f59dc Merge pull request #3760 from jmillikin-stripe/descriptor-memset-ub
aff1097 Fix undefined memory management found by Clang's sanitizers.
3130ce0 Fix iOS cc_library build for protobuf.
16792c6 Fix iOS cc_library build for protobuf.
37e112f fix implicit fallthrough in gcc 7
be13314 fixes issue #3750
f850188 Merge pull request #3744 from fmarier/json-escaping-namespace
5992e24 Move namespace closing brace inside the header guard block
a632f0d Merge pull request #3739 from pherl/merge3.4
1e58006 test for field reassignment
38fd94e CodedInputStream::SetTotalBytesLimit description fix
dd980cc Fix distcheck
de15e73 Merge remote-tracking branch 'origin/3.4.x' into master
08334f0 Converting to immutable hashable types
c4083bb Merge pull request #3735 from sgreenstein/patch-1
68ee916 Don't pass -lpthread and -lm on Windows
6032746 Reserve unknown fields in php (#3659)
d2a5f8b Update third_party.md
2a72840 Suppress VS2017 compiler/linker warnings
77f64bb Add well known types to php runtime. (#3697)
cd5f49d Fix ruby segment fault (#3708)
d6c32a8 Merge pull request #3714 from thomasvl/objc_increase_test_coverage
a274c67 Build out more complete code coverage in the tests.
9477123 Let Xcode 9 update project/scheme settings.
4207066 Merge pull request #3710 from thomasvl/xcode9
c4dce01 Merge pull request #3709 from thomasvl/unknown_field_merge_issue
b586e64 Add Xcode 9 support to the helper script.
3f2dcae ObjC: Fix merging of length delimited unknown fields.
210be26 Use constexpr more with VC++ 2017 (#3707)
fc7a6a2 Add GopherJS protobuf and gRPC links
bd798df Merge pull request #3690 from pherl/3.4.x
a38f876 Merge pull request #3691 from pherl/stringback
f7e2099 protobuf_generate: create include path only for proto files
d2738c0 Add spaces
5a501c6 Fix C++11 string accessors
6d0cf1b Remove ranged based for in io_win32.cc
fc5aa5d Merge pull request #3676 from hesmar/hesmar/fixProtobufGeneratePython
0e069e5 generate python code when calling PROTOBUF_GENERATE_PYTHON
ae55fd2 Enforce all error report for php tests. (#3670)
c204402 Merge pull request #3675 from hesmar/hesmar/cmakeAddDllExport
9829b8f protobuf_generate: add EXPORT_MACRO option
c627530 Merge pull request #3674 from pherl/shutdown
4fc7529 Merge pull request #3627 from zanker/zanker/add-submsg-hash-init
b091bfb Test Shutdown can be called multiple times.
633ef8b Update message.c
2b0ee3f Add $ before url_prefix_len to make it a variable. (#3668)
eade82c Merge pull request #3639 from zanker/zanker/fix-embedded-to_h
8eae3fe Update message.c
8771483 Allow initializing a chain of protos using only a hash
83264bd Fixed to_h with repeated messages to return hashes in Ruby
cf1b29d Merge pull request #2377 from mcos/chore/conformance-null-tests
fa5a69e Merge pull request #3624 from acozzette/down-integrate
655cc83 Merge pull request #3651 from pherl/3.4.x
2eb1bac Bumping minor version for ruby gems
5dd818c Merge pull request #3612 from TeBoring/php-bug
b04e5cb Merge pull request #3642 from pherl/3.4.x
dba647a Bump version for minor release
e3be1fe Clang warning cleanup for unused parameter.
13fd045 Integrated internal changes from Google
d1bc27c Merge pull request #3626 from xfxyjwf/fixgo
8136ccb Fix go example test.
c0d88ae Merge pull request #3635 from drivehappy/clang_cleanup
7f3ded6 Clang warning cleanup for unused parameter.
471b45e Merge pull request #3158 from yeswalrus/fix-policy-warning
3d78561 Merge pull request #3634 from TeBoring/ruby-bug
a459b22 Storing the frame on the map means we don't need the array
c1dd8e8 Move parse frame array to the Map object
8741da3 Revert "Fix js conformance tests. (#3604)" (#3633)
a425dd9 Rename ClassNamePrefix to ConstantNamePrefix
2bd55a9 Fix js conformance tests. (#3604)
f46a01d Exclude valid constant name from reserved name.
06aa8dc Merge pull request #3621 from jtattermusch/upport_3596
ed0a07e Merge pull request #3618 from hesmar/fix_protobuf_generate
5de0565 Google.Protobuf should target net45
444aecd fix protobuf_generate function
b1befb0 Merge pull request #3613 from xfxyjwf/bazel_examples
6945203 Exclude addressbook.proto from C# bootstrap test.
ddb9ef9 Change array to map for reserved names in c extension
49b31dc Update C# generated file for addressbook.proto
49b88af Update examples file list.
89069de Change array to associative array.
174c82d Add well-known timestamps to JSON for PHP (#3564)
74bf45f Add bazel support for examples.
e5d000c Add prefix to php reserved keywords.
2ad5c0a Merge pull request #2576 from cristicbz/py-strutil
6a4ffb2 Merge pull request #3596 from jtattermusch/csharp_target_net45
054054c Merge pull request #3590 from NanXiao/patch-1
7f8b91f Add native php support for Duration. (#3583)
35b852f Merge pull request #3594 from buchgr/well-known-protos
699c0eb bazel: Add proto_library rules for well known types. Fixes #2763
50a6475 Google.Protobuf should target net45
f4ff17b Update autogen.sh
6699f2c Merge pull request #3560 from tenderlove/thread-safe-map
f9b8169 Add TODO
dd69d5c Fix dist check
baae7ea Add @Deprecated annotation support for proto file.
b70e0fd Add php support for Timestamp. (#3575)
2807436 change the field number of php_generic_service to fix the conflict with (#3576)
f55c6ec Storing the frame on the map means we don't need the array
d6152dd Move parse frame array to the Map object
c7457ef Add any support in php runtime. (#3486)
21b2372 Merge pull request #3565 from pherl/fixdist
d8c6193 Add missing cmake files in dist
98a3734 Merge pull request #3503 from gburgessiv/master
859d94a Merge pull request #3544 from anandolee/master
92ea0d2 Merge pull request #3556 from matt-kwong/kokoro_mac_v3
78432ea Remove pre-installed software
07de70e Merge pull request #3555 from pherl/fixdist
364060b Merge pull request #3547 from matt-kwong/kokoro_mac_build
55fdbe5 Make distcheck aware of test proto files.
e46caba Remove pre-installed software
c44c3af Merge pull request #3548 from google/3.4.x
30681c7 Merge pull request #3546 from pherl/deathtest
610e433 Drop python2.6
028d6f1 Add Python 3.5 3.6
6609e52 Disable death tests on windows
e416f5d Merge pull request #3537 from TeBoring/php-bug
7273b3c Merge pull request #3539 from drivehappy/3.4.x_clang_cleanup_1
09fd125 Merge pull request #3540 from drivehappy/3.4.x_clang_cleanup_2
7a43137 Merge pull request #3543 from tony612/patch-1
6cecd20 Work around a bug in clang's static analyzer
2fb7479 Add Elixir protobuf and gRPC to 3rd party doc
c27b56c Merge pull request #3494 from drivehappy/clang_warning_macro
36ac06f Merge pull request #3535 from drivehappy/clang_warn_cleanup
0b7e978 Merge pull request #3535 from drivehappy/clang_warn_cleanup
92d768c Merge pull request #3536 from pherl/io32_11
77a453a Fix compile errors
dd51909 Use message name as defined in php runtime.
4aadcd3 Remove C++11 features in io_win32.cc
ba4e547 Merge pull request #3529 from pherl/merge3.4.x
472f700 remove the parens from the cmp() lambda definition (#3526)
1183f48 Fixing unused parameter warnings under Clang.
139775c Merge remote-tracking branch 'origin/3.4.x' into mergemaster
26ac3e8 Merge pull request #3528 from pherl/rubyfix
fdb5cd5 Merge pull request #3504 from pherl/3.4.x
d76e4c7 Bump gemspec again
1825d6d Merge pull request #3317 from ejona86/protoc-artifacts-jdk8
c2aa26e Revert "Drop Python 3.3 from testing & add Python 3.5, 3.6 (#3512)" (#3524)
703f414 Drop Python 3.3 from testing & add Python 3.5, 3.6 (#3512)
a04eb8c Define cmp() for Python 3 (#3517)
1aa2c34 Merge pull request #3516 from cclauss/patch-3
dded80f define long() for Python 3
6f4c9b0 print() function for Python 3
a1acf25 print() function and lose the semicolons (;)
067543c from __future__ import print_function
7daedbd print() function & define raw_input() for Python 3
45483fd file() was removed in Python 3, use open() instead
5ab8ae7 Merge pull request #3511 from cclauss/patch-3
fa46d11 Merge pull request #3514 from pherl/rubyfix
29c24ab Merge pull request #3502 from pherl/pypi
1926636 Bump gem version for the next upload
958412e Old style exception --> new style exception
d1484cd Update CHANGES.txt
55d938c Update CHANGES.txt
98c6d04 Prefer system distributed binaries/libraries.
e243082 Update php version number to 3.4.0
82dd4b3 Update php c extension version number.
5d5df84 clean up
19a7e20 Update testpypi addresses.
80a37e0 Merge pull request #3495 from pherl/c++11
5e39ecc Merge pull request #3494 from drivehappy/clang_warning_macro
3d2f72b Merge pull request #3496 from pherl/mingw64
6654c8d Update readme
f619621 Merge pull request #3493 from pherl/cmath
d909834 static link for 32 bit build as well.
f7b3dd4 Update comments that cross compile is feasible now
fa086c8 First try static linking pthread
4a4c67b Add std::forward and std::move autoconf check
a23e198 Fixing warning under Clang 6.x (-Wexpansion-to-defined) where the macro expansion producing 'defined' was warning on undefined behavior.
3908b4e Fix cmath/math.h include with non C++11 libstdc++
1cc4ecd Merge pull request #3488 from pherl/changelog
3ae8c4c Update changelog for 3.4.x
eaeca0d Merge pull request #3485 from pherl/mingw
1bd2d1f Merge pull request #3484 from pherl/qualifier
651429b Fix comments
31b3bd9 Make compilers without ref-qualifier support happy.
b4c0cfe Add malloc cast
fd31899 Remove strdup usage and implement our own
34ebca0 Adding missing imports for strdup
40d1855 Fix mkdir
564f02c Make win32_io only for MSVC
7afa796 Fix the declaration order in ming32
db125b8 Fixing io_win32 for MinGW32
e0d24cc Detect invalid tags with a field number of 0 in C#
d2f3adb Merge pull request #3481 from pherl/nowarning
ccb6b62 Merge pull request #3480 from bklarson/master
1b42347 Clean up typedefs for Atomic32/Atomic64
e8fc066 Make no warning test stricter.
9adb4e8 Merge pull request #3478 from pherl/warning
97d50e3 Make code free of missing-field-initializers warnings
35db267 Merge pull request #3473 from AlanBurlison/master
80e984e Merge pull request #3467 from thomasvl/bump_xcode_version
a68a800 PROTBUF-3394 Potential SIGBUS with UnsafeUtil.getLong
3afcded Merge pull request #3461 from TeBoring/3.4.x
176713d Merge pull request #3468 from vladmos/3.4.x
fae3816 Merge pull request #3454 from anandolee/master
06a825c Make .bzl files compatible with future versions of Bazel
d08c291 Merge pull request #3465 from vladmos/list_plus_equals
fe68821 Move travis to the Xcode 8.3 (8.3.3) image.
ba81c59 Fix up Xcode 8.3.x support.
4fc9304 Make .bzl files compatible with future versions of Bazel
8f4b8e4 Merge branch 'master' into 3.4.x
f14703c Update commit id in Dockerfile to reflect change in #3391 (#3459)
49b44bf Fix the bug in php c extension that setting one field can change another field's value. (#3455)
21b0e55 Update PHP descriptors (#3391)
f5817b3 PY26 tests compatibility: 1) some tests in reflection_test raise TypeError on PY26 but ValueError on other versions when converting a negative long to unsigned; 2) compare the exception string instead of the exception type for testDuplicateExtensionNumber, since for the PY26 cpp implementation the original code's 'Double registration of Extensions' is not an instance of (AssertionError, ValueError)
1ab5adb Merge pull request #3456 from giorgioazzinnaro/patch-1
a3e1752 Update third party addons with ProfaneDB
9150cd8 Skip setUpClass which is newly added in python2.7 for python2.6
d58df3b Add python 2.6 test back for cpp implementation. Json format issue was fixed in #869
c2f69d6 Merge pull request #3450 from pherl/invalidoffset
e243fd8 Merge pull request #3451 from pherl/fixtypo
39a91d3 Fix typo.
4cbbf33 Fix invalid offsetof warning.
25672c1 Add getClass for php Descriptor in c extension (#3443)
7781784 Add destructors for default instances to the shutdown code. Verified tests succeed under the draconian heap checker
53ae6de Merge pull request #3442 from pherl/csharpversion
6c64f11 Bump csharp version
e0288e7 Merge pull request #3438 from pherl/changelog
9df89cc Fixing HHVM Compatibility (#3437)
c15a326 Expose descriptor API in php c extension (#3422)
8a5208f Update change log
502624f Merge pull request #3419 from pherl/3.4.x
d77c8c5 remove the duplication
bcf1733 Adding the missing header
ebeb472 Export functions in io_win32.h in win DLL build
be73938 Change divideInt64ToInt32 to static (#3436)
d32f5b4 Removes unnecessary pass-by-references in PHP internal classes (#3433)
0724314 Merge pull request #3429 from king6cong/master
d6bd959 Merge pull request #3406 from ax3l/bg-3.4-cmake-pkgconfig
547d76e Add classpath for java example Makefile
9b505a9 Merge pull request #3421 from thomasvl/update_comment
9a4692d Update the comment on the message_type to cover what it should be.
5eb95ef Merge pull request #3420 from thomasvl/objc_proto2_conformance
3caf9fd Review feedback.
c2831a3 Add the proto2 message conformance support for ObjC.
9fd5e59 Generate the proto2 test file and link it in for ObjC.
21800ff Add a objc_class_prefix to test_messages_proto3.proto.
19b8c8b Update conformance tests again.
8d5f2c5 Merge pull request #3410 from adam-26/1745
12c186f Fix makefile.am
de6debc Fix conformance tests
e177739 Fix build files
9b8f658 Remove dependency on guava 20
d974cc2 Update conformance tests
f39d4de Merge remote-tracking branch 'origin/master' into merge
759245a Merge from master
bb86644 need for php math functions. used in mergeFromJsonString (#3409)
e9f4e8e Merge pull request #3407 from bklarson/master
451d061 Fix cycle dependency for repeated field not collected by gc (#3399)
4bff88e Merge pull request #3413 from pherl/3.4.x
ce44167 Merge remote-tracking branch 'origin/3.4.x' into vb
4ae94d6 Merge pull request #3414 from pherl/fixzip
eb6b332 Merge pull request #3412 from anandolee/3.4.x
02250a7 Omit the zip test if tools are not available
9c012ed Add __bool__ as well as __nonzero__ for python3
11b4d5e Update required version on pre-generated files
8b24feb Update pre-generated files
7bb39be Update version number for 3.4.0
a484794 Use keys() instead of iterkeys() to be python3 compatible.
72cc5a6 Merge pull request #3393 from pherl/3.4.x
b6da226 Put AddDescriptorsImpl() in anonymous namespace
a713b73 Merge pull request #3281 from BSBandme/ConformanceTestYilunChong
2b6db00 Fix quotation marks
a3a65b3 Fix issue #1745 - javascript allow dot in filename
f15185d Merge pull request #2969 from laszlocsomor/master
0d9a34c Add -Werror=missing-declarations to test builds
ebe659d Travis: Exclude CMake .pc files
2f3cf52 CMake: Install .pc Files
eef2edc Merge pull request #3403 from ax3l/topic-cmake-pkgconfig
74dcf44 Travis: Exclude CMake .pc files
668712c CMake: Install .pc Files
e8a32d0 Merge pull request #3401 from aschrijver/patch-2
d640cdd Updated outdated hyperlink
062df3d Merge pull request #3179 from bjwatson/fix-duration-typo
040f56e Merge pull request #3397 from acozzette/initialization
bba4c4a Merge pull request #3262 from snnn/master
417aae6 Fixed dynamic initialization for C++ lite
dd091aa Fix code to use values() instead
aa61bb0 compiles removal of newline (#3333) (#3370)
1876a27 Update filelist again
66c1dac Add lite and python extra_dist files
bf658c2 Add java and JS dist files.
988376c remove broken imports in JS
b07cdb6 Use items() instead of itervalues() to be python3 compatible.
e45214e Fix distcheck
8084e03 remove profile
11b6661 update build file list
7986ca7 Update conformance tests
50aa4fe Merge pull request #3375 from TeBoring/3.3.x
3af881c Merge master into 3.4.x
8697530 Update csharp and php descriptor
09354db Merge from Google internal for 3.4 release
942a29c Merge pull request #3390 from danielgtaylor-isp/patch-1
6ec0b7e Merge 3.3.x into master
3370299 Merge pull request #3385 from anandolee/master
4b5b1e6 Add note about includes to README
9e745f7 Support PHP generic services (#3269)
b764e67 Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION for pure python conformance test
3354558 Merge pull request #3348 from matthauck/fix-gcc41-again
dd6e420 Merge pull request #3357 from bklarson/master
eb8e3d3 Merge pull request #3134 from lundefugl/javabug1
aaa2550 Merge pull request #3372 from dylanetaft/master
324b20a remove pass by reference for php setters (#3344)
29ff49f Fix Implicit Return Types (#3363)
36fcc2a Expand documentation in Readme.md
c78dbd7 Initial value in generated code cannot be used by c extension. (#3367)
ec3f5dc removes an accidental newline in printing for the php generator (#3333)
3a0382e Add map iterator for c extension (#3350)
d3bbf1c Add space between arrow and casted type (#3353)
6aa4b20 Merge pull request #3327 from htuch/fix-3322
71f2f0d Fix repository URL in C# project file
81142e1 Fix build when using -Werror=undef
727c0dc C#: Implement IReadOnlyDictionary<K,V> in MapField<K,V>
2130fea Fix map_field_inl.h for gcc 4.1
e05e777 Windows: support long paths
9ab7c73 Fix missing std::tr1::hash on GCC 4.1 (#2907)
126082c Add std:: namespace prefix to set and map (#3332)
b9c4daa Uncomment php tests (#3301)
f6ff32c Use consistent hash across NDEBUG/!NDEBUG builds.
bd5ab15 Merge pull request #2482 from andreaseger/fix_ruby_timestamp_accuracy
a9e7a15 protoc-artifacts: Bump JDK to 8u131
bceb830 add comments in makefile.am
6bd51a5 add Grpc Protobuf validation (#3311)
32c8ed3 change csharp failure list
30b4194 Merge branch 'ConformanceTestYilunChong' of github.com:BSBandme/protobuf into ConformanceTestYilunChong
7339c25 delete backup files
cbf7dfb fix php failing list and csharp generated proto
e264b20 Merge pull request #3315 from thomasvl/mutate_unknowns
b30dee3 Expose the initializer for unknown fields.
cdd524a Ensure leaveOpen is true when writing to a buffer
62d7fe5 Make Any easier to work with in C#
e82ba0b Merge branch 'master' into ConformanceTestYilunChong
726ba33 changed php's failing list
5085102 remove backup files
3adb054 add proto2 support to some tests, add js proto2 support, fix some errors
ecca6ea Add json encode/decode for php. (#3226)
a7d5be6 change php objc nodejs csharp ruby
fcb9268 change java to uniform message, revert TestValidDataForType's parameters
020a24d change cpp and python to uniform message
5a52b35 Merge pull request #3287 from sergiocampama/initialized
e55782f Add initialized as a reserved keyword as that's the actual property name
467c1cc fix csharp conformance test
db379e6 fix csharp conformance test
176bac6 Add scripts to build python wheel for linux. (#2693)
ff773c1 fix csharp
cd1dc3f add csharp support
3645021 add message set test case
06c9057 add objc support
5e7e2d3 revert ruby proto built files
cf7b6a4 delete backup files
6c59c25 delete binary
91da852 update .gitignore
0255431 revert .gitignore
696cf77 add java support
0fa4e58 change ignore
18a0c2c add proto2 support for cpp, python, nodejs, ruby, php
12acbc2 adds PHPDoc @return and @param for getters and setters respectively (#3131)
2ad74e1 add support for proto2
097bfb5 Merge pull request #3084 from lukaszx0/patch-1
dd7265e Merge pull request #3264 from TeBoring/php-bug
e3c807d Fix more implicit type conversions in public headers and generated code.
1a7e49d Merge pull request #2968 from ngg/cpp-proper-fwd
9c0b35c Ensure public header and generated code have no implicit conversion.
4e67590 fix readme.md
58a3c74 add test_proto2_message.proto and change conformnace/makefile.am
f752d81 Merge pull request #3266 from mbrickn/patch-1
d07efba Updated links to use https
eca0f4e Merge pull request #3261 from thomasvl/super_oddcase
db45687 If we fail to get a descriptor just super the method resolving.
4987dda Fix a bazel build error on Windows
5555d3b Fix typos in comment
5532abc Merge pull request #3258 from TeBoring/3.3.x
5520b43 Update C++ generated code.
e7bcfc4 Update version number to 3.3.2
8ecae34 Merge pull request #3255 from TeBoring/3.3.x-3
3b1a875 Remove inclusion of ext/json/php_json.h. (#3241)
c344fe8 Oneof field should be serialized even it's equal to default. (#3153)
dba8928 Add ARRAY for reserved name (#3150)
703cd8e Switch to addEnumType to fix fatal error (#3225)
1325588 Updated upb to fix JSON conformance issues. (#3206)
c2fdef0 Merge pull request #3243 from yjjnls/master
73b7cc0 Merge pull request #3244 from thomasvl/complete_docs
5fd71ce ObjC: Document the exceptions on some of the writing apis.
72e293a Merge pull request #3240 from thomasvl/float_fun
8f367c0 Restore missing header files in install step
5729cf7 Remove inclusion of ext/json/php_json.h. (#3241)
dd19b87 Raise the number of digits used for floats.
710543d Merge pull request #3237 from calder/patch-1
491b320 Merge pull request #3236 from buchgr/bazel-links
888e287 Merge pull request #3235 from buchgr/java-target
4b36d40 Qualify string in java_options.h
36e63da bazel: Make compiled jars java 6 binary compatible.
d0e6f3b bazel: add bazel symlinks to .gitignore
91bf623 Fix php jenkins test (#3233)
8d97b3d Fix incorrect function call (#3232)
b9b34e9 Follows proper autoloading standards (#3123)
09d2994 Merge pull request #3228 from thomasvl/add_tvos_to_podspec
6ecf38f Add tvOS to the podspec.
c722c3d Merge pull request #3216 from traversaro/patch-1
9094bf0 Export symbols used in inline functions
9ba7d1c C++: Do not forward-declare dependencies in generated .h files
96095f3 Merge pull request #3176 from acozzette/fix-3114
c202b06 Merge pull request #3196 from matt-kwong/kokoro
0871e6a Add continuous testing config files for Kokoro
d4d41af Merge pull request #3191 from matt-kwong/kokoro
6156af1 Add MacOS and Linux tests to Kokoro
d555775 Merge pull request #3189 from thomasvl/objc_proto3_unknown_fields
2aeb4ab Merge pull request #3160 from meteorcloudy/winbuild
ddbe360 Merge pull request #3159 from yeswalrus/new-generate
1d0988b ObjC: Preserve unknown fields in proto3 syntax files.
516a81a Merge pull request #3190 from thomasvl/objc_IllegalZeroFieldNum
ecc0f54 Properly error on a tag with field number zero.
656dedb Merge pull request #3157 from yeswalrus/fix-version-check
7ccb251 Remove unused output_file variable from js_embed
6f32580 Add new file option php_namespace. (#3162)
df3f8cf fix check_and_save_build_option not correctly exporting build options
0336770 add protobuf_generate function, allows use of target_sources where available
e9c15d6 Ensure that for Java, imports of .proto files with empty packages works
fbaad36 Merge pull request #3169 from dmaclach/master
63a9728 Merge pull request #3170 from thomasvl/int64_map_issue
46f36d7 Fix some cases of reading of 64bit map values.
ea43e0c Optimize GPBDictionary.m codegen to reduce size of overall library by 46K per architecture.
0b059a3 Refactor cc options in BUILD file for Windows
d8c5865 Fix policy warning CMP0054
a183a0d Fix the check_and_save_build_option macro never evaluating to true
faa5398 fix check_and_save_build_option not correctly exporting build options
ae85cb8 Fix find module not working when no version number was given
d6470ab not to use std::random_device for map.Seed(). (#3133)
e222997 Merge pull request #3149 from KarrokDC/master
1e86ef4 Oneof field should be serialized even it's equal to default. (#3153)
282fb9e Add ARRAY for reserved name (#3150)
4d5daf4 Adds fluent setters for PHP (#3130)
4eb02fe Add headers as part of cmake project tested only on windows with visual studio 2015 as generator
4674cc7 Merge pull request #3113 from phst/master
95749d5 update csharp README and fix .NET 3.5 build error
aea4374 Issue 3112: Object class with fully qualified name
0b07d7e Add IncludeSource in csproj as per review comments
f26e8c2 Convert C# projects to MSBuild (csproj) format
40da1ed Removing undefined behavior and compiler warnings (#1315)
ba987a7 Merge pull request #3126 from mbrukman/fix-readme-formatting
c5125f3 Merge pull request #3117 from KarrokDC/master
d2c1865 Merge pull request #3103 from sergiocampama/perf
2465ae7 Adds serial and parallel parsing tests to check whether parallel parsing is faster than serial parsing, as it should be
6775570 Fix Markdown formatting in README.
979107e Improve fix for https://github.com/google/protobuf/issues/295
3b22761 show help if protoc is called without any arguments, pre-empts -h and --help to show a useful message instead of just 'Missing input file.'
bf04c83 Merge pull request #3085 from scpeters/issue_3059
8546620 Merge pull request #3104 from thomasvl/ext_registry_copy
49e4ba6 Fix ExtensionRegistry copying and add tests.
b28617b Merge pull request #2815 from devwout/ruby_json_emit_defaults
78cb804 change test for nanosecond accurate timestamps
ad203bc fix floating point accuracy problem in Timestamp#to_f
49a56da Update jenkins Java deps.
129a6e2 Revert guava depedency to version 19.
969e0be regenerate plugin and profile message code
26f0011 Use bool deterministic to suppress warning
474cca5 Add LICENSE in package.xml (#3083)
13f532e Merge pull request #3074 from xfxyjwf/3.3.x
82e50ba Workaround the docker bug when compiling artifacts
de6ae7d Fix upb load descriptor when no messages defined in proto. (#3080)
2231931 Fix c extension for php7.1. (#3077)
757cc9f Update C++ generated code.
58538ea Update version number to 3.3.1
c2154e1 Merge pull request #3073 from xfxyjwf/3.3.x
d22493b Merge pull request #3064 from randomguy3/offset-type
9b82fce Workaround gcc < 4.5.0 bug
455b61c Merge pull request #3062 from Oppen/master
e062f70 Fix compilation
e82d81a Fix offset type to match the tables it is used in
25abd7b Add compatibility test for php. (#3041)
3c369dc Merge pull request #3057 from xfxyjwf/3.3.x
cd0efc0 Workaround gcc < 4.5.0 bug
99cf2ef Merge pull request #3056 from acozzette/cherry-pick-pr-2873
7378ec2 Add missing LIBPROTOC_EXPORT.
3a5a072 Skip C# test in C++ only distribution.
8859c07 Add missing files to build files.
4833960 Merge pull request #3043 from acozzette/javascript
22c8772 Fix #1562 by using goog.crypt.byteArrayToString instead of String.fromCharCode.apply
dd0a233 Merge pull request #3055 from chrisn-arm/3.3.x
c3093d3 Fix issue 3046: compilation on alpine 3.5
f00e06c Removed mention of Buffer in byteSourceToUint8Array
a64497c Merge pull request #2873 from myitcv/fix_1562
bcb3506 Fix #1562 by using goog.crypt.byteArrayToString instead of String.fromCharCode.apply
2f4489a Merge pull request #3024 from acozzette/merge-3.3-to-master
286f059 added "objectivec" build target (#3033)
9053033 Merge remote-tracking branch 'remotes/google/3.3.x' into merge-3.3-to-master
067b1ee Merge pull request #3023 from acozzette/min
07c284f Fully qualify min as std::min in wire_format_lite.cc
a6189ac Add prefix to enum value with reserved name. (#3020)
cbd08cb Merge pull request #3018 from acozzette/using-namespace-std
54d1701 Merge pull request #3015 from buchgr/unused-consts
7c76ac1 Remove "using namespace std" from stubs/common.h
3c0855e Add a test case for nested enum, which was missed previously. (#3010)
b1c75bc Remove unused constants.
4920e27 Merge pull request #3008 from postmasters/patch-1
fba2acd Add nested enum descriptor in php runtime. (#3009)
e64b618 Update php version number to 3.3.0 (#3001)
4777574 Add a link to dart-lang/protobuf
6fff091 Throw exception when parsing invalid data. (#3000)
f418b9e Merge pull request #2996 from xfxyjwf/3.3.x
4523c9c Allow proto files to import descriptor.proto (#2995)
478119f Fix python3 issue.
14afc3f Merge pull request #2992 from xiaogaozi/patch-1
f85eecb Add gogoprotobuf to third-party add-ons list
4c57e84 Prepend "PB" to generated classes whose name are reserved words. (#2990)
b97cd57 Add test for nested enum for php (#2989)
7be0882 Enum defined without package have incorrect class name. (#2988)
190b527 Make PHP c extension work with PHP7 (#2951)
357afc3 Merge pull request #2508 from yliu120/pass_default_env_to_protoc
0a93f67 Merge pull request #2987 from konsumer/patch-1
594f810 Merge pull request #2982 from mda000/issue2972
3055a02 Add node-protoc-plugin to "Other Utilities"
a3873ca Merge pull request #2985 from thomasvl/class_check_tweaks
f5a01d1 Tighten up class usage/checks.
2240a78 Simplify the Element dtor invocation when freeing elements in InternalDeallocate to avoid confusing the compiler when there's a class named Element already defined in the global namespace.
8aa927f Merge pull request #2950 from anuraaga/dev_rag
4323482 Merge pull request #2967 from xfxyjwf/3.3.x
5777259 Cherry-pick cl/152450543
cad0258 Cherry-pick cl/151775298
fc3ea97 Merge pull request #2955 from xfxyjwf/3.3.x
899460c cherrypick descriptor_pool.FindFileContainingSymbol by extensions (#2962)
bfeeb98 Add include for INT_MAX
e91caa1 Merge pull request #2949 from xfxyjwf/3.3.x
bf483df Allow unknown values for Map put*Value methods just like every other enum mutation method.
ee9c7f1 Cleanup reflection objects for map entry.
efec757 Merge pull request #2937 from anuraaga/dev_rag2
18c13c9 Merge pull request #2942 from xfxyjwf/3.3.x
21b0b3c Update generated code.
80f0c0a Update version number and changelog for 3.3.0
69bfde2 Merge pull request #2922 from anandolee/master
09328db Fix test for unexpected type url when parsing Any. Currently, the test fails since TestAllTypes doesn't have field '@type', which is the same test as testUnknownFields.
139fd0a Merge pull request #2933 from mharrend/patch-1
37c7b76 Merge pull request #2930 from anuraaga/dev_rag
662f978 Fix duplicate fields test. The repeated version was passing only because null values in a repeated field are rejected, so it was not testing what it intended to. Also adds a oneof version that verifies the case of oneof fields with different names (currently only the same-name check seems to be tested).
cc3fa2e Merge pull request #2676 from acozzette/js-compatibility-tests
10ea251 Added compatibility tests for version 3.0.0
7e5f980 Split test protos into two groups
dd04ffb Adding default shell env
58373fa Fix error message for int64 parse error.
11c902e Add IntelliJ project to gitignore for java project.
bd74319 Update Java conformance failure list.
32ad5a3 Use "git reset --hard" to actually reset the code.
b7c813f Update jenkins Java dependencies.
c2b3b3e Update Java version number and dependency.
624d44f Update objective-c conformance failure list.
d582778 Fix C++ distcheck.
fe97d79 Fix MSVC DLL build.
fab8812 Update python conformance failure list.
c52e54f Update jenkins Java maven dependencies.
057a285 Update C# conformance failure list.
e47c068 Update python conformance failure list.
84f6954 Fix Java build.
c348d46 Use PyUnicode_AsEncodedString() instead of PyUnicode_AsEncodedObject()
acde165 Update BUILD file for C# tests.
32d7830 Fix C++ build for down-integration.
d36c0c5 Down-integrate from google3.
4a0dd03 Removes ignored const from return type (#2915)
258406b Merge pull request #2919 from thomasvl/drop_dispatch
130c166 Remove the use of dispatch_once that is heap backed.
ba3fa41 Merge pull request #2918 from thomasvl/xcode_8_3
558ba98 Add support for Xcode 8.3 to the build helper.
04c77c4 Merge pull request #2913 from thomasvl/conformance_ignores
d43eaf2 Fix gcc warning when using map (#2213)
5859932 Merge pull request #2914 from acozzette/nacl
f316375 Added a workaround to allow building for NaCl
8adf57e Add some new ignores for things generated in conformance.
b3f3e12 Merge pull request #2912 from thomasvl/objc_recursion_limit
ddb4388 Raise the recursion limit to 100 to match other languages.
d9e0119 Merge pull request #2858 from haberman/gopackage
c565e25 Merge pull request #1662 from haberman/jsconformance
7610f10 Merge pull request #2884 from anandolee/master
689e4bf Add FormatEnumAsInt support for Json Format. And scale JsonFormatter.Settings to multiple options.
373809e Merge pull request #2897 from cgrushko/patch-5
6f21e29 Compile the Java proto runtime with Java 6
1387a67 Update commit number in Docker to update composer dependency (#2869)
ffa932b Merge pull request #2861 from byronyi/#710
312e2db Update BUILD
db3ef48 Merge pull request #2860 from prehistoric-penguin/master
20181f6 Merge pull request #2854 from hesmar/attributesFix
4d273f2 Merge pull request #2870 from acozzette/memcpy-memmove
15b60bc Merge pull request #2867 from mojoBrendan/master
ea5ef14 Ruby: only link against specific version of memcpy if we're using glibc
c12cc34 Merge pull request #2837 from anandolee/master
6b27c1f Add file option php_class_prefix (#2849)
c0871aa Merge pull request #2848 from xfxyjwf/freebsd
e8e6aa2 Update delimited_message_util_test.cc
89eb4e5 Add option to preserve original proto field names
aec0711 Ruby tests compare parsed JSON instead of raw JSON
1eee320 Add use_snake_case_for_field_names option to JsonPrintOptions
c415a14 fix several issues
44dc555 Merge pull request #2866 from xfxyjwf/nano
ddc0096 Add a notice for nano.
1b0db1c Removed obsolete comments and added docs.
0df2028 Properly regenerated descriptor.proto.
89c3c45 Merge pull request #2859 from haberman/junit-dep-scope
2957703 Merge pull request #2847 from haberman/ruby-toh
0fad0e1 Merge pull request #2794 from acozzette/jspb-extensions
acaa940 add delimited_message_util.cc to libprotobuf.cmake
7008a88 add LIBPROTOBUF_EXPORT to make msvc happy
cb3e84b migrate delimited messages functions to util package
33cc25f Remove duplicated copyright statement
496cd48 Changed scope of Java deps to "test".
e62e30c Changed go_package for plugin.proto.
4842363 Merge pull request #2023 from odeke-em/fix-print-help-to-stdout
e77a09e Incremented Ruby version number to 3.2.0.1
8c40b51 Ruby: update Gemspec.
97cbc42 Fix libprotoc.cmake to generate well_known_types_embed.cc
f23869c Bug fix: When encoding, negative int32 values should be padded to int64 (#2660)
014a550 Ruby: build packages for Ruby 2.4.
c57c77b Merge pull request #2829 from afrantzis/hide-unnecessary-library-symbols
324a299 Made formatting more consistent.
ed0ef54 Merge pull request #2846 from acozzette/bytestream-comment
d18df81 Merge pull request #2855 from thomasvl/copy_note
5e4f14f Document deep copy in the header
b4b855c fix attributes warning
416f909 Fix freebsd build.
9c6b8cb Ruby: fixed Message#to_h for map fields.
95b4427 Build system fixes for JS conformance tests.
43f2db7 Merge pull request #2843 from haberman/check
746ca59 Updated an outdated comment in bytestream.h
874e382 Replace CHECK() with GOOGLE_CHECK().
8df69f0 Conformance test for JS now work, though 15 tests fail.
f0a5c10 Merge pull request #2836 from xfxyjwf/i894
ddfc86b Merge pull request #2835 from pherl/javaep
13d165d Hide unnecessary exported library symbols
d59592a DefaultValueObjectWriter should populate oneof message field
c94555f Double-quote file paths in extract_includes.bat.in
f4f31e7 Suppress the last unchecked warning.
a69bc9d Merge pull request #2822 from anandolee/master
f54fb9d Merge pull request #2832 from pherl/javaep
e11cd3e Merge pull request #2818 from xfxyjwf/i1470
6044b24 Make //:protobuf_python have correct __init__.py.
92064a4 Merge pull request #2824 from xfxyjwf/i1415
cd6eb91 Merge pull request #2826 from xfxyjwf/i1374
81f4fe5 Merge pull request #2827 from xfxyjwf/i1251
81fe52f Fix java code according to error prone.
616e68e Repeated/Map field setter should accept a regular PHP array (#2817)
ae220cd Add auto detect for generated code of WKT protos, addressbook.proto and conformance.proto
03c8c8b Update comments for setSizeLimit.
a1bb147 Merge pull request #2825 from pherl/javawarning
1ece7c0 Add missing thread dependency in cmake.
009e491 Fix GeneratedMessageV3 warnings.
da00355 Merge pull request #2809 from xfxyjwf/i2464
61e87f3 Use per-type table to lookup JSON name.
7c75344 Fix lint warnings in the javalite branch.
cad6a51 Merge pull request #2819 from haberman/pythonexcept
3e6245e update_failure_list.py: fixed Python "raise" statement.
075475f Don't expose gson exceptions in JsonFormat.
bbfb9d5 Merge pull request #2804 from acozzette/ruby-memcpy
8e465dc Merge pull request #2810 from xfxyjwf/i1994
af2d5f5 Merge pull request #2775 from xfxyjwf/fixmajor
9afacb4 Merge pull request #2814 from pherl/javadeprecate
baceff7 Add annotations for deprecated messages in Java
008dc92 Ruby version optionally emits default values in JSON encoding.
9fa4031 Ruby: wrap calls to memcpy so that gem is compatible with pre-2.14 glibc
c311543 Avoid redundant type casts for oneof bytes fields.
4ae8656 Make JsonFormat locale independent.
fa17880 Merge pull request #2602 from GreatFruitOmsk/issue-2428
920af75 Fix Bad Link for Common Lisp
dd8d5f5 Rename encode/decode to serializeToString/mergeFromString (#2795)
769b693 compiler/cli: PrintHelpText prints to stdout instead of stderr
2b7430d Merge pull request #2793 from keveman/master
f6d8c83 Merge pull request #2613 from aausch/fix_memory_leak
21b58b5 Removed a stray return statement, causing compilation error.
651ba62 JS: ensure that extension values are serialized even if they're falsy
06f9f60 Detect if Descriptor.cs changes for csharp
f873d32 Added JavaScript conformance tests.  All tests pass!
27b1f2b WIP.
0c0a887 Merge pull request #2751 from keveman/master
008ff03 Merge pull request #2784 from acozzette/log-2-floor-int
6837b2d Added comment explaining the protobuf_headers target.
af13bff Detect if Descriptor.cs changes for csharp
8610d0a Merge pull request #2755 from xfxyjwf/rubycomp
352526c Merge pull request #2785 from thomasvl/threading_race
2d1c5e2 Handing threading race resolving methods.
938206d Return uint32 from Log2FloorNonZero64
a7e3b0a Merge pull request #2774 from acozzette/closure-builder
8b182cc Disable static analyzer for message semaphore creation (#2748)
6011d7c Fix gcc 4.1 build (#1035) (#1913)
25ecd55 Change hint type to `const void*` (#2568)
bcbaabe Add mergeFrom method on Message (#2766)
671e075 Use closurebuilder.py in favor of calcdeps.py for compiling JavaScript
7339fc0 Merge pull request #2674 from acozzette/js-test-cleanup
b7f25d4 Undef major/minor if they are defined as macro.
aff9d9d Removed log statement from writer_test.js
5274d6e Merge pull request #2770 from xfxyjwf/fixcmake
83b0cc9 Merge pull request #2772 from sschuberth/master
606cb7e There might be duplicated enum values when allow_alias is true. Add PreferredAlias into OriginalNameAttribute to remove the duplication (#2727)
902af08 Prefer the term "3-Clause BSD License" over "New BSD License"
6395a1c Fix links to the New BSD License in meta-data
ffde972 Remove the use of C++11 features.
9118ad6 Add Ruby compatibilty test against 3.0.0.
d41c47f Merge pull request #2733 from wmamrak/patch-1
8d61f9c Merge pull request #2729 from MarcelRaad/fix_inline_msvc12
b4b0e30 Merge pull request #2355 from xfxyjwf/fixjson
8387b88 Merge pull request #2732 from AsturaPhoenix/master
66c64e7 Merge pull request #2747 from liutikas/master
8c8b8e6 Merge pull request #2734 from msabramo/patch-1
72b82e6 Merge pull request #2630 from blodan/master
963473b Merge pull request #2753 from thomasvl/recursive_drop
d071766 Add GPBMessageDropUnknownFieldsRecursively() and tests.
2d430f8 Added a header only cc_library target for the protobuf library.
17174b5 Updating README
f83d129 Upgrading test-related libraries
c9b2c8f Fixes for .NET 3.5 compatibility
a434bfc Fix compiler warnings about unused variables in generated_message_reflection.h
1a8cbfd Merge pull request #2736 from na-ka-na/master3
172e0a6 Add an option to always print enums as ints in Json API
86208c5 README.md: Make docs URL a link
37bd5d5 Disable MSVC warning C4309
01a05a5 const FieldDescriptorCompare
8f9c0a4 Fix unresolved symbols with MSVC12 and /Zc:inline
a9ab38c Merge pull request #2722 from ckennelly/unified
8af35f2 Keep loop bounds in a local variable for string fields.
a6c30d9 Keep loop bounds in a local variable.
9db5b11 Work with truncated tag numbers.
0026dff Expose rvalue setters for repeated string fields.
38b1440 Merge pull request #2663 from ckennelly/varint-size
15360e5 Merge pull request #2689 from ckennelly/aliasing-fixed32-fixed64
38c238e Improve support for plugin parameters.
d2dfe46 Merge pull request #2609 from yixiang/patch-1
3f6f73b Merge pull request #2701 from anandolee/master
ed423c2 Merge pull request #2451 from podsvirov/json-primitive-map
aa78aeb Merge pull request #2704 from liutikas/master
74eb9a0 Add clear method to PHP message (#2700)
46c022a JsonUtilTest: Add ParsePrimitiveMapIn subtest
9079079 Fix compiler warnings about unused variables in wire_format.h
ef927cc Switch to gcc atomic intrinsics for macOS and delete the file that uses (#2699)
7288689 Add csharp compatibility tests against v3.0.0 and run on Travis.
1d2736f Merge pull request #2656 from pcj/patch-1
9f09d18 Add proto and test files for csharp compatibility tests against v3.0.0. All the files are copied from 3.0.0 (JosnFormaterTest was deleted)
c6e0d0e Merge pull request #2647 from anandolee/master
42e1e2a Fix python compatibility test when a new generated code imports an old version(2.6.1 or older) generated code.
e844510 Merge pull request #2692 from cgrushko/patch-3
65a4d20 Update load() statement to latest style
a60cc08 Merge pull request #2691 from cgrushko/patch-3
cba04b1 Implement json encoding decoding for php. (#2682)
6fffd4a Bazel can build protobuf when it's not in the root
56da828 Avoid aliasing CodedInputStream::buffer_ when parsing little endian integers.
c002743 Add fixed version to phpunit used in travis (#2673)
36f51f9 Merge pull request #2681 from sergiocampama/spaces
e7f5c9d Removes trailing whitespaces
5a3405c Update upb for php. (#2662)
bd29f86 Fix CopyTo argument validation
36f68e0 Inline branch-less VarintSize32/VarintSize64 implementations.
3975664 Add conformance test for php (#2655)
7b54b34 Add bazel protobuf resources
7f3e237 Merge 3.2.x branch into master (#2648)
9d3288e Merge pull request #2226 from kchodorow/master
2c36cc3 cache generated classes, optimization and quick workaround to memory leak
afc59ab C#: Implement IReadOnlyList<T> in RepeatedField<T>
87e4976 Merge pull request #2639 from anandolee/master
1ee09c8 python: do not include internal 'strutil.h' header
a83ac86 fix compile error on centos in metadata.h for constructors. (#2599)
a323f1e Oneof accessor should return the field name that is actually set. (#2631)
ccb76ff Add Oneof custom options test
5af0b54 Merge pull request #2619 from anandolee/master
1f09786 Merge pull request #2633 from anandolee/jieluo_branch1
ea51149 Add python compatibility tests against v2.5.0 amd run on Travis.
c15217f Allow OneofOptions to be extended in proto3.
c2a5669 Merge pull request #2626 from sergiocampama/8_3
32fa55e FreeBSD compatibility
4e7ecde Update genfiles paths to work with a different execroot arrangement
3d7b42d Adds nullability modifiers to resolve Xcode 8.3 warnings
140e28d Add python compatibility tests against v2.5.0: copy tests and proto files from v2.5.0
0aa5af3 Merge pull request #2614 from acozzette/gzip-output-stream-options
a101fa5 Set LIBPROTOBUF_EXPORT on GzipOutputStream::Options
e41b667 Undef TYPE_BOOL to avoid conflict with iOS.
047575f Support custom options in C#
eed9951 Merge pull request #2591 from thomasvl/objc_timestamps_take2
24908e1 Update AbstractMessage.java
d0bc096 Timestamp helper fix, Duration helper cleanup.
c9cd6ac Merge pull request #2587 from google/revert-2586-objc_timestamp
1651342 Revert "Fix Timestamps with dates before the Unix epoch that contain fractional seconds."
cf477d4 Merge pull request #2586 from thomasvl/objc_timestamp
adcccd0 Fix Timestamps with dates before the Unix epoch that contain fractional seconds.
feb78fb Merge pull request #2584 from cgrushko/patch-2
e4baf3f Add a proto_lang_toolchain for Java
e53dd99 Merge pull request #2529 from wackoisgod/master
fac90c6 Update AbstractMessage.java
75ac397 Fixing code formatting issues
effcb13 Merge branch 'master' of https://github.com/wackoisgod/protobuf
228d242 Merge pull request #2567 from acozzette/distcheck-fix
fb71df9 Add ByteString.FromStream and ByteString.FromStreamAsync in C#
e76d91a Add global.json file to pick dotnet core SDK version.
0bdf4a6 Fixed "make distcheck" for the Autotools build
2bddffc PHP fix int64 decoding (#2516)
2c16f69 Fix generation of extending nested messages in JavaScript (#2439)
ffa71f8 A few more cases for binary conformance tests. (#2500)
1041710 Merge pull request #2565 from acozzette/cross-compilation
24f0d56 Merge pull request #2563 from thomasvl/autocreator_tweaks
988ffe0 Minor fix for autocreated object repeated fields and maps.
2deb139 Merge pull request #2544 from tiziano88/master
17a8a76 Merge pull request #2536 from jbrianceau/fix-js-embed-include-style
c55175c Merge pull request #2564 from acozzette/arena-nc
a592052 Removed arena_nc.cc and arena_nc_test.py
b40d318 Fixed cross compilations with the Autotools build
f3e86fd handle sanity check for repeating enums correctly
4cb113a Fixed issue with autoloading - Invalid paths (#2538)
30250cd Add link to Elm proto plugin
5f65ee6 Merge pull request #2542 from jbrianceau/fix-embed-cc-warning
4455cdf Fix include in auto-generated well_known_types_embed.cc
05b019a Fix warning in compiler/js/embed.cc
f52e188 Merge pull request #2523 from jbrianceau/init-index-in-metadata
137dc02 Merge pull request #2525 from camillol/lite
abe1725 simpler, cheaper callback to LazyStringOutputStream
4e229c8 add MethodResultCallback_0_0
15a15e3 Init index_in_metadata_ without condition
6c021b3 Added the support for class level deprecation which will in turn also deprecate any fields that are currently using that type
d948b66 Merge pull request #2521 from acozzette/fix-bazel
5731ca5 Added well_known_types_embed.cc to CLEANFILES so that it gets cleaned up
ee0a243 Updated Makefile.am to fix out-of-tree builds
d1e7bd9 Added Bazel genrule for generating well_known_types_embed.cc
cdc2766 Merge pull request #2506 from ckennelly/rvalue-setters
bb2c6b2 Merge pull request #2505 from ckennelly/master
c64830b unwrap descriptor class before comparison of RepeatedField types
fb15862 Merge pull request #2517 from acozzette/js-embed
98d89d4 Fixed "make check" for cmake build
1b3a0c1 Auto-generate well_known_types_embed.cc
183d31c Add rvalue setters for non-arena strings on C++11.
f39cf88 Merge pull request #2227 from KindDragon/3.1.x
b18bc9b Give C# ByteString a sensible GetHashCode implementation.
ef61a9d add a key to ctx.action dict to prevent protoc losing the default env
ba63fa7 Remove spurious NULL checks in ArenaStringPtr::CreateInstance.
a95e38c Merge pull request #2499 from ckennelly/master
a902e25 Define LANG_CXX11 for port.h and use this to guard C++11 features.
83d681e Merge pull request #2495 from acozzette/android-hash
d1c1dad Merge pull request #2498 from sergiocampama/enum
c68fc58 Fixes and expands comments on how to use GPB_ENUM_FWD_DECLARE
2ff42dc Added conformance testing for binary primitive types. (#2491)
f983302 Merge pull request #2496 from xyzzyz/fix-overflow
837fd6b Merge pull request #2493 from jbrianceau/add-missing-climits-include
f77b7d9 Merge pull request #2484 from ngg/uwp_build
4c5d3ed Fix integer overflow in FastUInt32ToBufferLeft
5587562 Removed Android-specific code from stubs/hash.h
7b220f3 Add missing include in embed.cc
9d709f4 Merge pull request #2487 from jtattermusch/csharp_leading_whitespace
83c728a Add missing includes
eb455ce Merge pull request #2471 from jbrianceau/fix-include-style
6b86a89 Merge pull request #2490 from xfxyjwf/icon2
4c252dd Add a badge for bazel build status.
ec021f5 Add support for Windows ARM builds
29fb87e Merge pull request #2454 from pongad/go_package
fda9049 remove leading whitespace in C# xml comments
277a8b6 generate_changelog.py: flush output so piping works correctly.
8494846 Merge pull request #2476 from acozzette/generated-message-reflection-fix
dc11940 Merge pull request #2475 from sergiocampama/rvm2
e19f3b5 Use uint32 in GOOGLE_PROTOBUF_GENERATED_MESSAGE_FIELD_OFFSET macro
1ac8944 Reenable cocoapods objc test and remove unnecessary workaround for rvm
e43f73e Merge pull request #2473 from thomasvl/update_xcodes
70e21d7 Mark objectivec_cocoapods_integration as failing
bc9d077 Skip benchmark test if cmake isn't installed.
2754586 Xcode 8.1 support
e3da722 Fix #include in cc files
243ebec Merge pull request #2468 from sergei-ivanov/patch-1
d8f54c4 Disable jruby test. (#2469)
09dc933 Update third_party.md
e3e38b8 Update commit id in Dockerfile to trigger update. (#2467)
4474c04 Merge pull request #2462 from jbrianceau/fix-comp-builds-part2
6b4eee4 Merge pull request #2461 from jbrianceau/add-missing-include-in-embed-cc
34a1b6e Merge pull request #2394 from cwelton/formatting
46ae90d Make php generated code conform to PSR-4. (#2435)
631f461 Merge pull request #2466 from thomasvl/deprecation_followup
4c84b47 Merge pull request #2460 from sergiocampama/c11
dad775b Improve ObjC deprecated annotation support.
e219be7 Regenerate descriptor proto
b1295ee C++: export _xxx_default_instance_ symbols
867dbac Add missing include in embed.cc
c836ad4 Merge pull request #2459 from acozzette/android-logging
1d91c25 Include -std=c++11 when compiling protobuf if available.
2f29f0a Send all protobuf logging to logcat by default on Android
057389c Ruby: removed redundant RepeatedField#slice. (#2449)
850d573 Merge pull request #2407 from jbrianceau/fix-comp-builds
788d14a Export symbols used in inline functions
bb77cbf update descriptor.proto's go_package
607b921 Merge pull request #2437 from xfxyjwf/plugin
1a56251 oneOf fix for JsonFormat includingDefaultValueFields
ced8f73 Add version number to plugin protocol.
0509072 Merge pull request #2445 from ramrunner/master
c664b66 Merge pull request #2442 from pherl/fix-bazel
4cc160e when on OpenBSD we include the correct headers for endianess and check the apropriate defines
f92b455 Add missing files.
f1ce60e Factored Conformance and Benchmark test messages into shared test schema. (#1971)
4280c27 Merge pull request #2436 from cgrushko/patch-1
45d92ae Add a proto_lang_toolchain() for cc_proto_library
6b60ddd Merge pull request #2431 from saintstack/2228v2
7550bcd Change CodedInputStream#DEFAULT_SIZE_LIMIT from 64MB to Integer.MAX_SIZE (0x7FFFFFF) #2228
f8ca3ac Generate phpdoc in php generated files. (#2406)
34dc96b LIBPROTOC_EXPORT added to others functions in csharp_names.h and objectivec_helpers.h
b790da5 Missed LIBPROTOC_EXPORT for GRPC added
851cb81 Merge pull request #2429 from thomasvl/issue1833_swift_prefix
f813bd9 Add a swift_prefix file option.
39f9b43 Merge pull request #2403 from google/down-integrate-with-msvc-fix
65479cb Fixed Ruby tests for JRuby 1.7
259dd7e Updated descriptor_pool.py to be compatible with Python 3
db35fe7 Add a "u" suffix to tag numbers in generated code
a7f300d Fixed descriptor_pool_test.py for Python 2.6
c950471 Merge pull request #2404 from wiktortomczak/master
0fa31b2 Support grpc plugin in py_proto_library
a41090e Updated failure_list_java.txt for Java conformance test
72002d8 Merge pull request #2400 from acozzette/jspb-test-fixes
fda876a Added back in binary serialization round-trip in message_test.js
b763246 Merge pull request #2398 from jbrianceau/no-static-init-define-fix
04bd614 Merge pull request #2392 from xfxyjwf/fixdown
40f3586 Fixed remaining JSPB test failures
315350b Updated message_test.js so that it does not depend on fromObject
b4dd686 Updated enum names in test.proto to avoid conflicting with testbinary.proto
a5c30ce C++: Fix use with GOOGLE_PROTOBUF_NO_STATIC_INITIALIZER
7807932 Restore jenkins files.
599613e Update EXTRA_DIST lists.
1f077a0 Update conformance failure lists.
530ae30 Add back missing LIBPROTOBUF_EXPORT.
1673389 Updated libprotoc.cmake
5d63097 Merge branch 'master' into down-integrate-with-msvc-fix
5a76e63 Integrated internal changes from Google
cd315dc Merge pull request #2383 from snapsam/patch-1
44edefa Merge pull request #2382 from zhsyourai/master
d571d39 Update README.md
14c147e Add LL to large constant
cb54caf Add tests to demonstrate json parsing for null Timestamp and Duration types
99564c3 Rename Empty to GPBEmpty in php generated file.
fc27ead Merge pull request #2362 from wujingchao/patch-1
a3dfbe6 Merge pull request #2378 from ianfhunter/patch-1
01d321c Merge pull request #2367 from jbrianceau/add-missing-include-in-message-lite-cc
130bb07 typo
83d6411 Fix jenkins tests.
bf37971 Merge pull request #2323 from marcinwyszynski/master
c6f3d70 Merge pull request #1907 from evokly/js-utf8-fix
bd850a2 JS: Well, this is the right place for surrogates.
292c2c9 JS: Re-added comment, moved surrogates code to the right place
6e93fa4 Merge pull request #2366 from jbrianceau/reland-fix-include-js-generator
a95d052 Merge pull request #2368 from jbrianceau/fix-json-style-in-project-json
3c0e0ce Fix csharp/src/Google.Protobuf.Test/project.json
aac9ed8 Add missing include in message_lite.cc
3316062 Fix #include in js_generator.cc
74a636a Move variable declarations before actual code
0c0f7e2 Merge pull request #2364 from haberman/jslicense
5d9dbe3 Fixed JavaScript license declaration.
abeff7b Class is final but declares protected field
a30e5af Merge pull request #2358 from ckennelly/master
e02cb2b Resolve old TODO for StringTypeHandler.
8a4b2c5 Merge pull request #2353 from guptasu/master
df56736 Merge pull request #2357 from D3Hunter/master
ed08341 #2356 : fix ExceptionInInitializerError on IBM J9
bd158fc Speed up JSON parsing.
df83907 Fix php c extension on 32-bit machines. (#2348)
9f75c5a Merge pull request #2337 from sergiocampama/deprecation
fce8b6b Made helper code also consider package name 'proto2' when dealing with MessageOptions.
e75cf40 Fixing references to the removed atomicops_internals_pnacl.h file.
ce5160b Merge pull request #2332 from Kwizatz/master
d584321 Merge pull request #2327 from ctubbsii/fix-maven-compiler-plugin
1f2dbc8 Implement RepeatedFieldIter for c extension. (#2333)
ecc460a Renamed the pnacl version of atomicops.h into C11 atomic, and flagged the mac version to that if atomic is enabled
975d577 Added explicit cast to avoid size warning on Win64.
0d7199e Merge pull request #2329 from acozzette/unused-parameter
ca800ea Merge pull request #2330 from manupsunny/master
089aaa9 Fix message for InvalidProtocolBufferException
8785004 Fix unused parameter warnings in arena_free
eb7f3a3 Use latest maven-compiler-plugin (2.6.0)
7bd11fc Merge pull request #2301 from jbrianceau/arm-atomic-kuser-helpers-fix-v2
4c310d7 Update conformance test failure list
50a37e0 Change JSON field name formatting
cb81314 Fix copy pasta in test
3bdaaa5 More Ruby-eqsue interface
93a4fa2 Merge pull request #2302 from jbrianceau/generic-gcc-atomics-strong-cmpxchg
750b9e2 Merge pull request #2318 from mavrukin/patch-1
4d813de Fix compiler warnings when running :protobuf_test
8fcd24b Merge pull request #2307 from sinzianag/swift_protobuf
ea928a1 Merge pull request #2309 from thomasvl/note_about_coding_support
98c0185 Add note about extension use and Coding support.
9819b72 Adding Apple's Swift Protobuf
a5a2c1d generic atomicops: Use strong compare_exchange
4587a3f [arm/gcc] Don't rely on KUSER_HELPERS feature for atomics
d58b92a Adds pushLimit: and popLimit: into GPBCodedInputStream (#2297)
795976e Trigger update of docker for new changes in #2282. (#2288)
e9d9e56 Merge pull request #2290 from ramrunner/OpenBSDsupport
bfc1299 define no_threadlocal on OpenBSD
734930f Merge pull request #2284 from pherl/plugin_opt
2e314a6 Add comments about converting directives into PluginName
59cd5d0 Support extra parameters for plugins.
51c5ff8 Fix pure php implementation for 32-bit machine. (#2282)
58580da Merge pull request #2274 from nmittler/gae
c0f95c6 Hacking ByteBufferWriter to work with GAE
9c6940f Merge pull request #2264 from rshin/master
7c913d8 Use -DPROTOBUF_PYTHON_ALLOW_OVERSIZE_PROTOS
df5841f Place Python extensions correctly in Bazel build.
008b5a2 Merge pull request #2254 from JasonLunn/patch-1
9ec7a47 Use git clean before installing via bundler
993d604 Merge pull request #1959 from abergmeier-dsfishlabs/feature/cpp
ad88f90 Merge pull request #2251 from TeBoring/master
686a19c Merge 3.1.x into master.
e28286f Add 32-bit machine test on jenkins. (#2245)
afaa827 Merge pull request #2246 from fweikert/patch-1
c2b3e70 Declare all inputs of protoc action
60d95f3 Fix the bug that message without namespace is not found in the descriptor pool. (#2240)
0321baf Merge pull request #2203 from mrry/msvc_fix
7332ffb JS: Replaced fromCodePoint/codePointAt with fromCharCode/charCodeAt because of functions limited availability, fixed typo in tests.
fd046f6 Merge pull request #2234 from TeBoring/master
daba665 Fix python_cpp test on Mac. Link staticly when building extension, so that the extension doesn't require installing protobuf library. (#2232)
5be6325 Add csharp/build_tools.sh for dist check.
63448e6 Fix compile error for php on Mac.
4f3d20a Fix segmentation fault when ZTS is defined.
1e5d4ba PHP: fix ZTS tsrm_ls errors (#2189)
c96dd66 Add test for php zts build.
447dee1 Prepare jenkins for testing php zts build.
7047761 Do strict enum name checking only for proto3
34cdaf4 Add travis test on Mac for php.
c8bd36e Test php5.5_c test on jenkins
c0719f0 Trigger automated tests for php.
91bbff2 Set up environment for php automated tests.
fd332d1 Add php test script for automated tests.
89db7ce Add script to build Google.Protobuf.Tools for csharp.
f2eeaf7 Added alias getFieldProto3 as used by older generated code.
4a4a162 Fixed references to foreign nested messages with CommonJS-style imports
0bf5482 Fixing inconsistent php version number.
ae9455c Fix MSVC stack overflow issue.
e0a6e52 Add php files for make dist.
17cc42a Update change log for 3.1.0 (#2173)
8db25a2 Force a rebuild of Jenkins docker image.
a41349d php: support 5.5.9 for pecl extension (#2174)
39685f6 Fix VS test failures.
2937c67 Use "appveyor DownloadFile"
b155958 Update php supported version.
9e260a3 Reduce test length to avoid stack overflow on VS.
1c4cf0e Fix python cpp.
460e7dd Fix Visual Studio compile issues.
4cf0722 Fix default instance destructor
f9f3c35 Update version number.
0c78525 Bump library version to 11
e62d1f1 Add back removed descriptor field.
fac876d Modify php api version and minimum supported php version.
9372292 PHP: support 7.0 on PHP implementation (#2162)
f933d10 Update version number.
04d72c2 Fix java compatibilty tests.
9d4657a update files to include php generators
633fdfb Update minimum support php version to 5.5
9526385 Fix bugs for internal integration.
89d8e43 Fix travis, jenkins environment issues.
7fecfdb Added new has_bits.h file to cmake/extract_includes.bat.in (#2152)
f7bb847 Fixed quadratic behavior in JSPB deserialization of repeated fields (#2117) (#2146)
31dd499 Fix bugs for csharp and ruby for internal integration.
23d4688 Fix python bugs for internal integration.
93007e1 Bump library veriosn to 3.1
fe1aaad Fix bugs for internal integration.
15f4db6 Bump version number to 3.1.0-alpha-1.
af62fde Fix for maps_test.js in JavaScript. (#2145)
e3f0689 Fix bugs for internal integration.
cc8ca5b Integrate internal changes
337a028 Merge pull request #2236 from hochhaus/master
b2b6584 Silence compile warnings in bazel
bc3bff1 Fix python_cpp test on Mac. Link staticly when building extension, so that the extension doesn't require installing protobuf library. (#2232)
c0c3aee Merge pull request #2229 from xfxyjwf/fixbuild
431cee6 Remove inexist files from build.
d947308 update files to include php generators (#2165)
b0d62dd Add csharp/build_tools.sh for dist check.
96e2d76 Fix compile error for php on Mac.
75b6988 Fix segmentation fault when ZTS is defined.
52ab3b0 PHP: fix ZTS tsrm_ls errors (#2189)
71e5994 Merge pull request #2193 from acozzette/common-js-fix
60bc203 Merge pull request #2199 from JasonLunn/patch-1
2d897c8 Add test for php zts build.
3a055be Prepare jenkins for testing php zts build.
640c947 Merge pull request #2204 from acozzette/enum-names
149659b Do strict enum name checking only for proto3
858db7a Add travis test on Mac for php.
4d87527 Merge pull request #2218 from thomasvl/xcode8_updates
bcb32c0 Test php5.5_c test on jenkins
297449a Update the ObjC projects for Xcode 8
fe1d0a1 JS: Added string encoding/decoding tests for UTF-8
23f108d JS: Fixed UTF-8 string encoder/decoder for high codepoints.
350d494 Merge pull request #2206 from acozzette/gitignore
dae50dc Trigger automated tests for php.
f17528a Set up environment for php automated tests.
1baaedf Add php test script for automated tests.
25dbc8b Updated .gitignore with Java and JavaScript build artifacts
6005648 Add development dependency requirements
1d2c7b6 Fix MSVC build when HAVE_LONG_LONG is defined.
ee19a7c Remove hanging reference to Gemfile.lock
d94ff41 Delete Gemfile.lock
07f3cab Set platform to "java" under JRuby
b9bad65 Add script to build Google.Protobuf.Tools for csharp.
3b88a4a Merge pull request #2196 from haberman/nodefix
3d598ee Merge pull request #2197 from thomasvl/swift_docs
fb85b43 Drop the swift docs directory (and content).
8b54280 Added alias getFieldProto3 as used by older generated code.
c4d7012 Fixed references to foreign nested messages with CommonJS-style imports
3f59452 Merge pull request #2192 from google/3.0.x
787f3fb Fixing inconsistent php version number.
c531b49 Merge pull request #2181 from google/jtattermusch-patch-2
530eede Update README.md
4d3e4cf Fix MSVC stack overflow issue.
a428e42 Add php files for make dist.
e7269aa Update change log for 3.1.0 (#2173)
87d61e4 Merge pull request #2175 from xfxyjwf/fix_json
5248f61 Force a rebuild of Jenkins docker image.
907a075 php: support 5.5.9 for pecl extension (#2174)
3c88e1b Fix VS test failures.
c454b44 Merge pull request #2171 from xfxyjwf/fix_json
73c8723 Use "appveyor DownloadFile"
18f336b Update php supported version.
f6b4d18 Reduce test length to avoid stack overflow on VS.
411968d Fix python cpp.
37c3d05 Merge pull request #2167 from xfxyjwf/fix_vs
0798d91 Fix Visual Studio compile issues.
3bfc09d Merge pull request #2170 from pherl/3.1.x
f184cb6 Fix default instance destructor
34996c3 Merge branch '3.1.x' of github.com:google/protobuf into 3.1.x
a289d43 Added C++ benchmark. (#1525)
d9ff3ef Merge pull request #2153 from haberman/generatechangelog
8c88762 Update version number.
94bbeb4 Bump library version to 11
1b7a844 Add back removed descriptor field.
9cb812f Modify php api version and minimum supported php version.
11433f7 PHP: support 7.0 on PHP implementation (#2162)
39a2a25 update files to include php generators (#2165)
2649844 Update version number.
5270e93 Fix java compatibilty tests.
923314c update files to include php generators
1bf97d8 Merge pull request #2159 from google/jtattermusch-patch-1
da7c026 Update README.md
ca21b28 Merge pull request #2157 from google/csharp_remove_beta_notice
ca8120c Update README.md
8352d52 Update minimum support php version to 5.5
bee5213 Fix bugs for internal integration.
525c632 Fix hash computation for JRuby's RubyMessage
b8e7e89 Fix travis, jenkins environment issues.
b28ab73 Fix gson dependency. gson 2.3 has internal bug that it doesn't work with some versions of maven.
142e2fa Merge pull request #2035 from sergiocampama/cpp
a2e7364 Added convenient script for generating changelog draft.
e25c56a Merge pull request #2149 from khingblue/remove-obsoleted-project
050c014 Added new has_bits.h file to cmake/extract_includes.bat.in (#2152)
f81d44f Fixed quadratic behavior in JSPB deserialization of repeated fields (#2117) (#2146)
741aa87 Remove obsoleted project of j2me
b9bc990 Fix bugs for csharp and ruby for internal integration.
c8b9d41 Fix python bugs for internal integration.
1af7c4c Fixes static analyzer issues from xcode.
08e9f70 Bump library veriosn to 3.1
22d7248 Fix bugs for internal integration.
4f379f8 Merge pull request #2144 from abscondment/fix-jruby-hash
6cb6bd9 Fix gson dependency. gson 2.3 has internal bug that it doesn't work with some versions of maven.
ebcda12 Bump version number to 3.1.0-alpha-1.
679381f Fix for maps_test.js in JavaScript. (#2145)
05aa0df Fix hash computation for JRuby's RubyMessage
a2c6501 Fix bugs for internal integration.
0dca5a5 Use a custom dictionary to avoid NSNumber operations.
96c469d Remove the custom key functions and just use the system provided defaults.
98835fb Integrate internal changes
7b00595 Merge pull request #2137 from thomasvl/objc_extensions_tweak
6ab51a0 Use a custom dictionary to avoid NSNumber operations.
5904279 Remove the custom key functions and just use the system provided defaults.
b5bbdb0 Merge pull request #2037 from abscondment/fix-2036-ruby-hash
c44ca26 Merge pull request #2130 from kilink/substring-comment-fix
9ac84f8 Fix erroneous comment regarding String.substring
3b001ca Some PHP engine implementations don't have return_value_ptr properly set. Explicitly use &return_value.
c6fa9c7 Auto-generate proto files for tests.
f174d36 Add back missing test proto files.
37e0e1f Merge pull request #2111 from pherl/3.0.x
9c4be5f Merge pull request #2112 from pherl/merge
b4235ac Merge pull request #2123 from thomasvl/objc_better_versioning_take2
1aa6500 Update the ObjC version checks to support a min and current version.
e0e5466 Check in php implementation. (#2052)
683412b Update generated files.
58860c0 Merge remote-tracking branch 'origin/3.0.x' into merge
13d6d17 Fix the version number for 3.0.2
86fcd87 Merge pull request #1765 from mbarbon/master
1affbd8 Merge pull request #2021 from zlim/bench-fix
4f032cd Merge pull request #2100 from vladmos/patch-1
22e7fa6 Merge pull request #2092 from dprotaso/master
5caf516 Resolved a conflict
78aee1b Merge pull request #2044 from wychen/Win32ANSI
643a02b Merge pull request #1636 from yugui/feature/generic-plugin
53387e5 Merge pull request #2090 from guoxiao/find
e90292b Merge pull request #2103 from adrianludwin/fix-gtest
1327e6f Update repo to use google test
7377eb2 Merge pull request #1970 from thomasvl/objc_any_helpers
a86e6d8 Compatibility with the new version of Bazel.
5d35e60 Merge pull request #2094 from thomasvl/update_wkt_comments
57170b9 Merge pull request #2096 from pherl/3.0.x
5699b92 More complete nil/reset tests within a oneof
708296e Fix some constants to be correct for the message class in use.
161b937 Fix error and add note about lossy issues
14e74f6 Support the -Wassign-enum compiler flag. (#2085)
1fc416b Allow the JsonFormat.Parser to ignore unknown fields
a15df74 Merge pull request #2087 from khingblue/fix-unused-param
337ec30 Add ObjC helpers for Any WKT.
82133ba include std::find()
08b1c71 Fix #2032 unused parameter 'deterministic'
4bc1657 Merge pull request #2079 from khingblue/fix-generate-descriptor
30e55ae Merge pull request #2083 from pherl/3.0.x
7645a3d Merge pull request #1884 from hochhaus/valueWriterFn
f9fc56c Fix #2071 replacing /bin/sh with bash
1a58673 Merge pull request #2077 from pherl/3.0.x
e298ce5 Update release date in the change log
74638a2 Merge pull request #2047 from jonathon-love/master
65a595d Merge pull request #2062 from pherl/3.0.2
01d1750 Merge pull request #2061 from pherl/changelog
9b8da10 Rm check on dependency in the C# reflection API (#2051)
96a9d97 Merge pull request #2059 from chih-hung/master
7c3f7c6 Fix #1955 clang-tidy warning misc-macro-parentheses
6e11540 Bump version number to 3.0.2
0c9999f ruby, js and c# changes
a098e80 Merge pull request #1862 from pherl/3.0.0-GA
4041ed4 objc change log
9befe47 Merge pull request #1978 from pherl/cp
9d30a98 Merge pull request #2026 from pherl/cpjs
fa6428e Merge pull request #2045 from mike07026/master
85c1adf Merge pull request #2053 from thomasvl/improve_root_registry_wiring
64958cd Fix to typo/oversight in python tests
a7eaf36 Rename UNICODE to protobuf_UNICODE
13a4124 Make Root's +extensionRegistry generation smarter.
d00cab0 Merge pull request #2039 from khingblue/remove-unused-vector
df6088a detect invalid JSON encoding in bytes field
5a17660 detect invalid JSON encoding in bytes field
e514f23 fix #1342 caused by ownership issues
48811b2 Fix Win32ErrorMessage on Unicode build
11d6cb5 Add test for Win32ErrorMessage
588a803 Support Unicode build on Windows
b964976 Merge pull request #2024 from pstavirs/master
52e491b Use 64-bit protoc binaries in compatibility tests.
8ee6f56 Remove unused vector
de02863 fix #2036: use `rb_hash_*` to accumulate hashes
047419a failing test for #2036:
c0a6a6b Merge pull request #2033 from frett/osgiExport
b6dec9b update the OSGi SymbolicName and ExportedPackage for the javanano library
dd45c0b Merge pull request #2012 from haberman/rubymapgcfix
8c93606 Merge pull request #2031 from thomasvl/dont_require_filegenerators
78a6d31 Speed up ObjC Generation with large dependency trees
e721ce6 Merge pull request #2012 from haberman/rubymapgcfix
8335b7d Fix missing import of jspb.Map (#1885)
3a674ff upb bugfix: JSON map entry keys were passing the wrong closure.
eedc7be Restore New*Callback into google::protobuf namespace since these are used by the service stubs code
7e62773 Merge pull request #1920 from gegles/master
c32b9dd Merge pull request #2018 from thomasvl/support_generate_all
f5c7a48 benchmarks: update readme.txt
4fd4471 Merge pull request #2014 from pherl/fixgmock
2e66a61 Support GenerateAll().
1760feb Update gmock links.
1e6dc7d Update links in appveyor.yml
d4213d8 Ruby: make sure map parsing frames are GC-rooted.
3d9d1a1 Merge pull request #2013 from xfxyjwf/gmock
c4a84ab Update links in appveyor.yml
bba446b Update gmock links.
08951c3 Merge pull request #2011 from tomas-abrahamsson/patch-1
4d04fcd Add an Erlang project, gpb, to third_party.md
b97a4a5 Merge pull request #2001 from nicolasnoble/patch-1
866d3e5 Fixing regular expression...
569d5ce Merge pull request #1997 from thomasvl/move_include_package_into_helpers
290d26b Remove the compiler options from ImportWriter.
93362a5 Move the ImportWriter into the ObjC Helpers.
80f65d2 Add note about JSON tests maybe being wrong. (#1992)
b5794ed Merge pull request #1984 from thomasvl/more_json_tests
7437774 More JSON tests
ff2a660 Adds better support for protos without packages  (#1979)
8b30145 Add a jenkins test status badge.
45d04d0 Merge pull request #1977 from thomasvl/bump_cocoapod_spec
584917f Bump the version in prep for the 3.0.2 tag being cut
564c02f Merge pull request #1975 from pherl/cp
a877fdf Record zero for "has" for proto3 if in a oneof.
116596a Never use strlen on utf8 runs so null characters work.
62f2ff8 Fixes extra whitespace on generated comments. (#1950)
a989501 Adds support for appledoc in generated code.  (#1928)
42ab9b4 Migrating documentation of the ObjectiveC runtime code to appledoc. (#1867)
549dde1 Merge pull request #1967 from sergiocampama/cast
e505098 Adding casts so that code importing protobuf using -Wconversion does not generate warnings.
e389165 Add more JSON tests around underscores (#1963)
4763e64 Merge pull request #1957 from xfxyjwf/jenkins_badge
d9ccf4d Merge pull request #1964 from thomasvl/missing_ignores
8156410 Fix up ignores and conformance generation
336ee28 Merge pull request #1960 from jskeet/oneof
f9d93f3 Regenerate conformance files to include extra oneof fields.
bbeb983 Need to expose generated protobuf C++ headers so they can actually be accessed from other libraries.
f8c37b9 Add a jenkins test status badge.
a248420 Fixes extra whitespace on generated comments. (#1950)
cd561dd Merge pull request #1949 from thomasvl/objc_more_reset_tests
ff85a17 More complete nil/reset tests within a oneof
a0df678 Fix some constants to be correct for the message class in use.
17d601a More explicit tests for nil behaviors on fields.
91b6d04 Merge pull request #1942 from thomasvl/objc_fix_oneof_zeros
27c8962 Add more types to the zero oneof cases.
ca5b775 Record zero for "has" for proto3 if in a oneof.
ac3df39 Add conformance test for zero fields in oneofs. (#1939)
30bbbe9 Merge pull request #1934 from thomasvl/objc_strings_with_null
1a6c1d0 Never use strlen on utf8 runs so null characters work.
237f321 Adds support for appledoc in generated code.  (#1928)
56b8f44 Merge pull request #1842 from udnaan/master
32fadc0 Migrating documentation of the ObjectiveC runtime code to appledoc. (#1867)
1102a8a Merge pull request #1923 from bryongloden/patch-1
a375e1a close opened file descriptors properly
e30b7b4 Merge pull request #1924 from PiotrSikora/export_license
f1f30b5 Merge pull request #1926 from hotpxl/master
ddf6d1e [master] Add dependency cl. Fixes google/protobuf#295.
faea19c Bazel: export LICENSE file.
c59473d Merge pull request #1044 from mark-whiting/master
6d134ea Merge pull request #1898 from sergiocampama/watchos
ea081fe Fix missing import of jspb.Map (#1885)
0dca3cc Merge pull request #1865 from podsvirov/topic-cmake-project
dedd8ae Merge pull request #1914 from adamatan/typo-fix
3886860 Typo: beffer -> buffer
cf42b60 Merge pull request #1905 from pherl/fixdoc
eefd1fd CMake: Auto find ZLIB from package config if needed
3916a0a Add and fix C++ runtime docs
8d8115b Merge pull request #1878 from haberman/rubywkt
e0779d5 Merge pull request #1903 from xfxyjwf/compatibility_tests
3cec2ea Ruby: added custom Struct exception type and fixed Makefile.am.
30647ca Use 64-bit protoc binaries in compatibility tests.
ff46627 Merge pull request #1902 from podsvirov/topic-cmake-extract-includes
7d275ec CMake: remove repeated_field_reflection.h from extract list
00d5a7f Amend the conformance tests to only use Int64/Uint64 non-wrapped values which (#1164)
275db04 Adds watch os deployment target for protobuf
6b3d120 Merge pull request #1887 from sheffatguidance/fix-js-api-documentation
1112989 Merge pull request #1884 from hochhaus/valueWriterFn
915d79e Merge pull request #1895 from google/3.0.0-GA
3ef0756 Merge pull request #1894 from pherl/fixdist
e139117 Add python/setup.cfg into dist files
169d0ca Merge pull request #1893 from google/3.0.0-GA
c479042 Merge pull request #1892 from xfxyjwf/compatibility_tests
baa4023 Run Java compatibility tests on Travis.
f3449e5 Merge pull request #1891 from pherl/python
811674f add setup.cfg for building wheels
42e5487 Merge pull request #1882 from legrosbuffle/fix-check
7e93458 Merge pull request #1888 from pherl/fixbuildzip
6a59ac9 Fix the build-zip.sh to add .exe for win packages.
9a11ab4 Fix Issue #1869: faulty js API documentation
a217408 Fix valueWriterFn variable name
c466f4b Be consistent with the use of CHECK()/ GOOGLE_CHECK().
a207a2b Fix for JRuby (assert_true is not present).
e3094a8 Ruby: added API support for well-known types.
de30c56 Merge pull request #1874 from pherl/buildzip
e3fac65 Change the build.zip.sh to support lite
2662fd3 Merge pull request #1871 from pherl/fixwin
1b1a8f4 Fix build protoc script for windows
33d2b2a CMake: Bugfix for protobuf_MODULE_COMPATIBLE
38c5f2f CMake: Link to ZLIB only if protobuf_WITH_ZLIB enabled
e8ae137 Merge pull request #1864 from pherl/galogs
e7982e4 Fixed Makefile.am for Ruby file rename.
e0d817e Change log for 3.0 GA release.
4e169bf Bring C#'s ToPascalCase method in line with C++. (This still doesn't fix the conformance tests, but at least we're now consistent with the C++ code.)
86535a1 Remove legacy_enum_values flag for GA.
be78976 Merge pull request #1861 from jskeet/fix_to_camel_case
af2fa05 Merge pull request #1859 from jskeet/remove-flag
a8aae89 Bring C#'s ToPascalCase method in line with C++. (This still doesn't fix the conformance tests, but at least we're now consistent with the C++ code.)
e672047 Remove legacy_enum_values flag for GA.
7ba044a Merge pull request #1853 from pherl/3.0.0-GA
7c9c314 fix comments.
0750797 Merge remote-tracking branch 'origin/3.0.0-GA' into 3.0.0-GA
54feb9a Fix the script comments.
b1aac0b Make protoc-artifacts able to build plugin.
032fb91 Merge branch 'master' of github.com:google/protobuf into 3.0.0-GA
154e278 fixed cmake config files install path
b6b521b Merge pull request #1851 from xfxyjwf/cint
12581b4 Fixes travis cpp build.
234ec01 Merge pull request #1847 from haberman/GAfixes
43b36dd Fixed Makefile.am for Ruby file rename.
f11a4f1 Merge pull request #1841 from pherl/3.0.0-GA
b3b07cd Merge branch 'master' into 3.0.0-GA
ba52f2b Merge pull request #1788 from google/rubypackagecap
c43f718 Merge pull request #1846 from xfxyjwf/zip
97e2026 Added new file to ruby_EXTRA_DIST.
6d92233 Added unit test for PascalCasing package names in Ruby.
6cab568 Ruby: translate package names from snake_case -> PascalCase.
b553b87 Add a script to build protoc zip packages.
b1cecb6 Merge pull request #1837 from haberman/rubygencodename
0973822 remove extra zeros.
5a6c921 Make jruby still depend on beta-4
a17367f Define intX as standard exact-width integer types.
fb7a7c5 Bump version number for GA
4f19797 Ruby: generated foo.proto -> foo_pb.rb instead of foo.rb.
868ea59 Merge pull request #1831 from xfxyjwf/protoc
b6cd9dd Merge pull request #1834 from sergiocampama/framework
44bd6bd Merge pull request #1821 from haberman/rubyfreezestr
d07a996 Ruby: fixed string freezing for JRuby.
9e3c98f Fix maven path.
3a1259c Correctly sets the generate_for_named_framework option after parsing.
0622030 Merge pull request #1830 from xfxyjwf/travis
1b3796c Merge pull request #1829 from xfxyjwf/fixcpp
fdd970e Fix maven links.
9702b9f Keep cpp_distcheck on travis for now.
c2ced9a Remove linux tests from travis.
9009662 Fix sign-comparison warnings in public header files.
16adea3 Add a test to catch sign-comparison warnings.
4ddaad4 Merge pull request #1825 from xfxyjwf/jenkins2
de5236d Merge pull request #1828 from sergiocampama/framework
2ff9349 Fixes the parsing of the proto-framework map file.
20fbb35 Add more tests to jenkins.
2ba058c Merge pull request #1822 from xfxyjwf/java6
ff7f68a Ruby: encode and freeze strings when they are assigned or decoded.
ad49ed7 Update travis tests for Java.
30d8416 Merge pull request #1811 from xfxyjwf/fixdist
a4f68b1 Add missing files in EXTRA_DIST and add a test.
af8732e Merge pull request #1810 from xfxyjwf/versioning
e465f26 Merge pull request #1812 from jskeet/fix-travis
dd3d9d6 Merge pull request #1447 from seishun/defaults
73e0b49 fix debug.dump
deaea21 Use the dotnet-release package feed for Travis.
60cb094 Add files missing from "make dist".
36adb40 Update compatibility tests as well.
2e30301 Versioning Java GeneratedMessage.
e4b129f restore old behavior for toObject
db1b2a0 nits
970a4fd Make implicit defaults consistent with explicit defaults
77b08af Merge pull request #1802 from haberman/jsmapbin
e0e7377 Fix goog.require()/goog.provide() ordering.
24ac9c0 Merge pull request #1803 from xfxyjwf/javadoc
fa52702 Include javadoc/source in Java release packages.
7429b91 JavaScript: move extension binary info to separate struct.
2078f61 Merge remote-tracking branch 'origin/3.0.0-beta-4'
923eae8 JavaScript maps: move binary callbacks out of constructor.
56855f6 Merge pull request #1792 from xfxyjwf/changelog
82b43d1 Remove Java deterministic API.
b6a620d Merge pull request #1801 from thomasvl/oneof_framework_build_issues
2e98ed5 Use public methods to fetch oneofs in generated code.
3d9726f Mention Java lite in the changelog.
b99577c Exposes the currently registered extensions for a message and removes the internal sortedExtensionsInUse
f6d1d1a Uses head version of rvm to avoid shell_update_session not found error (#1791)
1349fb8 Added 3.0.0-beta-4 changelog.
3a8d8ea Merge pull request #1787 from xfxyjwf/steppingstone
1bce70d Fix compatibility issues.
0b68255 Add missing golden test file.
e1f146b Merge pull request #1785 from jskeet/merge-csharp
b5ce525 Move to dotnet cli for building, and .NET Core (netstandard1.0) as target platform (#1727)
5e0de1e Remove the overload for Add(RepeatedField<T>)
2ee1e52 Optimize AddRange for sequences implementing ICollection
b053b92 Implement RepeatedField.AddRange.
d9334ea Improve exception throwing implementation in collections
10a8fb4 Move to dotnet cli for building, and .NET Core (netstandard1.0) as target platform (#1727)
4e0d051 Merge pull request #1781 from xfxyjwf/update_version
c2ebdec Update version number in AssemblyInfo.cs.
8b659b2 Merge pull request #1783 from xfxyjwf/fixlite
047a3b4 Exclude Java lite module from parent pom.xml
dd37b99 Comment out lite conformance test.
932f94e Update version number to 3.0.0-beta-4
06a0248 Add missing LIBPROTOBUF_EXPORT
7a7913e Add missing LIBPROTOBUF_EXPORT.
443eb27 Update generated files.
9086d96 Integrate from internal code base.
042993b Implement RepeatedField.AddRange (#1733)
8eb90e3 Merge pull request #1778 from yeswalrus/fix-prerelease-version
5520447 Fix a bad variable dereference causing <package>_FIND_VERSION_PRERELEASE to be ignored.
70c1ac7 Merge pull request #1776 from thomasvl/fix_dist
0d079bc Remove the baseline files from the make dist file list.
790e6af Fixed out-of-date documentation for CodedInputStream.ReadEnum. (#1581)
3560cc9 Merge pull request #1702 from lukebakken/csharp-nuget-doc-update
297ec74 Add https://metacpan.org/pod/Google::ProtocolBuffers::Dynamic
8779cba Merge pull request #1764 from jskeet/remove-is-value-type
3df146e Remove unnecessary reflection call
c404c2a Merge pull request #1762 from thomasvl/drop_perf_profiles
8c23655 Drop the performance baselines.
8b00675 Merge pull request #1757 from thomasvl/avoid_importing_src
03d9e09 Merge pull request #1735 from jskeet/attribute-placement
c850ebc Merge pull request #1758 from dago/pathmax2
e9a7fc8 Remove WriteGeneratedTypeAttributes which is a no-op
57638d5 Make sure also Solaris x86 gets PATH_MAX
eaf3451 Merge pull request #1753 from xfxyjwf/fixup
be0d7f6 Don't #import the .m files.
0d5091e Merge pull request #1742 from ottok/fix-spelling
d84d0ca Fix data member declaration order.
a29a9c5 Don't support global ::string in stringpiece.h
c10938a Merge pull request #1752 from acozzette/fix-js-tests
c64d86e Fixed failing JS tests
ec45897 Merge pull request #1712 from dkharrat/swift-error-handling
523bfd4 add nullable qualifier to nil return types
c534845 Changes to generated code from previous commit
ada0a81 Move DebuggerNonUserCodeAttribute to function members
d2ae496 Fix spelling error in function ParseTime parameter
3808d09 Fix spelling in strings and comments
cae3b0c Merge pull request #1704 from lizan/json_parse_options
02b55d2 Merge pull request #1738 from xfxyjwf/fixbuild
e102db1 Fix some failing travis tests.
aeff638 Merge pull request #1710 from chezRong/master
7e8c893 Merge pull request #1723 from thomasvl/objc_test_coverage
454dbf1 added minified JSON formatting functionality with test
2fe0556 Fix windows build.
69cc213 Updated failure_list_java.txt to remove tests that now pass
b83af52 Fixed string formatting in text_format.py to be Python2.6-compatible
d64a2d9 Integrated internal changes from Google
6cfc19e Xcode project cleanup/setup.
c18aa77 Validate the tag numbers when parsing. (#1725)
e0016c5 Merge pull request #1720 from thomasvl/issue_1716
31999a3 Add JsonParseOptions to ignore unknown fields
dc0aeaa Adding conditional compiler symbol to support .NET 3.5 (#1713)
fc4c617 Fix GPBGetMessage{Repeated,Map}Field()
7b5648c Merge pull request #1719 from esteluk/patch-1
3be6110 Fix Objective-C generator option typo
2bcd43a Merge pull request #1714 from dnkoutso/master
37ca94f Get value from text format name in GPBEnumDescriptor
0ab78e1 Merge pull request #1705 from haberman/revjsver
6a6f95d JS package.json: Added author and updated Closure Library version.
6f67be6 Merge pull request #1707 from jskeet/format-value
0421238 Expose JsonFormatter.WriteValue.
1dc6280 Moved all dependencies to devDependencies.
48735cb Add "google" to package.json "files" for WKT.
c4b40a3 Create patch release for JS to include WKT.
8069466 Modify csharp README since there are now two NuGet packages
a897ebb Merge pull request #1700 from jskeet/ordering
a230b5d Rename methods to avoid ObjC KVC collisions. (#1699)
e3f6e2b Remove ordering guarantees in the MapField documentation
1a5333b Adds destination flag to xcodebuild to avoid possible flake errors (#1697)
4f93098 Merge pull request #1666 from yeswalrus/cmake-prerelease-examples
b7560df Merge pull request #1696 from haberman/jswkt
8c20e55 Add new generation option for using proto sources from other frameworks.
104723f Fix tests for CommonJS.
98bd6d7 Merge pull request #1692 from vjpai/friendless
4308cc4 Added plugin.proto to well-known types for JS.
6daf3d2 Address review comments on function name
0e27112 Bugfix: base the require logic on the file being required.
1337486 JS: import well-known types from google-protobuf package.
37eaae2 Remove a friend-class template that is only used for the constructor, and instead create an _internal_only getter that gets the needed information. This is a workaround for a deficiency in gcc-4.4 that does not properly support templated friend classes.
52598c6 Merge pull request #1658 from yeswalrus/cmake-fixup-module
a5e116a Merge pull request #1665 from yeswalrus/cmake-package-requirements
f180ef6 Merge pull request #1683 from thomasvl/third_party_framework
a2a3399 Add support for generating sources into a framework.
f0c1492 Add the CocoaPods integration tests to Travis.
71f4a9c Fixes Xcode 8 analyzer warning saying that it was missing a release in dealloc (#1678)
088c5c4 Merge pull request #1664 from bshaffer/patch-1
4150a91 make protobuf_MSVC_STATIC_RUNTIME a dependent option to reflect its use.
78b3498 Save the relevant options used to create a package, allow users to reject packages based on them.
5ebcfc1 Fix prerelease version matching to be more consistent with the find_package arguments.
f8a969d proper codeblock in README
fba7976 Merge pull request #879 from mathstuf/support-equals-in-proto-path
cadfbc8 Removed handling for ALIASED targets since they are unused.
401e07d Add GOOGLE_ prefix before PROTOBUF_DEPRECATED_ATTR
b60e615 Fix the undefined behavior for opensource users.
6aa981f Merge pull request #1624 from yeswalrus/cmake-prerelease-versioning
dfe0c9a Merge pull request #1643 from yeswalrus/cmake-examples
ed1d560 Merge pull request #1541 from haberman/conformancestrict
350453f Make surrogate regex even more lenient.
923d2c7 JSON surrogates Python: adjust regex for OSX error message.
23fef56 Replace handwritten protobuf-targets with exported version.
09f6a5c Use ExternalProject_Add to build the examples in a stand-alone fashion.
7155629 CMake project updates
6a61894 Added test for surrogates (valid and invalid).
84a1b60 Added update_failure_list.py.
4833b4c Surrogate checking is unpredictable, so always manually check.
bd98eae Fixed Python by updating failure lists and fixed a few broken tests.
ef7894e Make conformance tests more strict about the failure list.
20b5325 Integrate internal changes
e34c091 Improving the granularity of parsing errors (#1623)
0ab7a7f Merge pull request #1640 from os72/master
5977fb0 Generalize plugin support in Bazel Skylark rule
f6be0d1 Add https://github.com/os72/protobuf-dynamic
0420eab For prerelease versions, require protobuf_FIND_VERSION to be set.
f1091ab Include the prerelease version in the protobuf_VERSION
18a9140 Merge pull request #1625 from yeswalrus/note-versions
cc30be1 Merge pull request #1613 from yeswalrus/cmake-min-version
c461193 Merge pull request #1629 from zhongfq/patch-1
a315bb8 Merge pull request #1614 from yeswalrus/cmake-cleanup
e215828 Merge pull request #1630 from google/beta-3
e845187 Merge pull request #1620 from sergiocampama/cleanup1
4629659 add protobuf as3 lib and code generator for as3
457a297 Remove __PROTOBUF_PACKAGE_PREFIX
61c9696 Update the list of places where the version is stored.
a714c40 Removing unused GPBExceptionMessageKey
40ff94e Merge pull request #1617 from thomasvl/more_warnings
86e8f1f Merge pull request #1604 from jonwallg/repeated_types
38b9e74 Add -Woverriding-method-mismatch.
d13b3d0 remove useless cleanup - config.cmake files are executed in their own context.
c57c6ea Bump to the *real* minimum required version. Setting CMP0022 breaks CMake versions < 2.8.12
04265e4 Remove if(TRUE)
e72805e fix expected class checking in GPBSetMessageRepeatedField
0f27cab Merge pull request #1600 from thomasvl/objc_tighter_warnings
ed87c1f Merge pull request #1586 from davidzchen/python
02cd45c Bazel build: Keep generated sources and Python runtime in the same directory.
c8a440d Add more warnings to for the ObjC runtime build
d089f04 Merge pull request #1595 from thomasvl/objc_travis_tweaks
368a2f4 Automated testing tweaks for ObjC
5d0c2ee Merge pull request #1593 from thomasvl/framework_includes
173daf1 Merge pull request #1589 from hochhaus/master
7da023b Better support for using the proto library from a framework.
2131b2d Merge pull request #1588 from jeffmvr/master
7336092 Add js/binary/encoder.js to js_EXTRA_DIST.
733ef98 added missing closing bracket for _cmakedir_desc in cmake/install.cmake line 88
594ce56 Merge pull request #1578 from wal-rus/cmake-install-namespace
28f35b4 add protobuf:: namespace to installed targets
38e4713 Merge pull request #1523 from xfxyjwf/compatibility_tests
a31d14b Describe platform requirements for the compatibility tests.
5c6518f Merge pull request #1583 from thomasvl/pods_integration_followup
2338e03 Merge pull request #1576 from wal-rus/cmake-versionfile
beca1f5 Merge pull request #1575 from wal-rus/cmake-install-msvc
6c47faa Make the CocoaPods integration tests more robust
02a28a8 Update protobuf-config-version.cmake.in to correctly set the required variables (PACKAGE_VERSION_EXACT, PACKAGE_VERSION_COMPATIBLE, PACKAGE_VERSION_UNSUITABLE)
7d79458 Fix the cmake configuration file install path to be more standards compliant (See the description of cmake's config search behavior on https://cmake.org/cmake/help/v3.4/command/find_package.html)
c034ba7 Merge pull request #1574 from thomasvl/test_schemes
20b5bf6 Add shared schemes for the CocoaPods integration tests
12dffd9 Merge pull request #1572 from thomasvl/podspec_tests_2
16dd477 CocoaPod Integration Tests
cc5296b Merge pull request #1558 from haberman/rubyoneof
daec44f Expand the OS X/Xcode gitignores
2d514ce Fixed oneof behavior for enums and fixed JRuby.
b8ded18 Merge pull request #1561 from pherl/beta-3
d550528 Bump objc podspec version number
431ba4b Merge pull request #1549 from xyzzyz/arena_export
ba696e7 Merge pull request #1547 from xyzzyz/js_generator
cbb9183 Merge pull request #1559 from google/beta-3
545527e Ruby oneofs: return default instead of nil for unset fields.
32e3d7a Merge pull request #1412 from google/internal
6673283 Integrate internal changes
718eb75 Merge pull request #1548 from anandolee/master
97aa8a0 Merge pull request #1551 from pherl/beta-3
fe06eb6 Fix protoc artifact pom version
b01b1a8 JSON format for Any message must print @type first, use OrderedDict instead of {}
4f630a6 Add compatibility tests against v2.5.0
810ba9b Export class Arena to shared library.
f2885f6 Fix #include in js_generator.cc
67d2d45 Merge pull request #1546 from pherl/beta-3
b3bb46c Added download_url to be able to upload to pypi.
c8be6ee Merge pull request #1542 from google/beta-3
3470b68 Merge pull request #1540 from pherl/changelog
cf1fd7e Merge pull request #1533 from pherl/mapcomment
0ec34bf Update changes for lite
5e7c4cb Remove the comments about iterator validation
f2db1e0 Merge pull request #1532 from pherl/changelog
034867f Update changelogs for C++ maps
dc49706 Merge pull request #1529 from gkraynov/test-redundant-varint
5c29835 Merge pull request #1528 from pherl/master
b126706 Merge pull request #1527 from haberman/changelog
d346f49 Added release notes for Ruby and JavaScript.
ce7e502 Remove the instructions for pbconfig.h
7b87f77 Merge pull request #1521 from zhangkun83/master
04757db Fix the server id in example
9f84114 Merge pull request #1520 from pherl/hashmapvs2008
325cc42 Merge pull request #1522 from xfxyjwf/compatibility_notice
f9fd450 Merge pull request #1524 from anandolee/master
5d54a85 Test redundant varint fields decoding in JS.
371d341 Merge pull request #1518 from jskeet/move_test
e4ca694 python changes
09732c9 Add compatibility notice for Java.
28cb77f Fine-tune build scripts and better documentation.
e1f588a Merge pull request #1 from google/beta-3
71dd9c4 Merge pull request #1515 from pherl/changelog-beta3
ede9cc4 Update comments for csharp, zero-copy and objc.
19472bf Merge pull request #1512 from pherl/beta-3
017d390 Fix csharp version
dbdf6d9 Bridge vs2008 hashmaps.
5668e2e Fix typo.
920ee73 Merge pull request #1483 from wal-rus/fix-boost-incompatibility
7cc9cb4 Move test for standalone BoolValue to JsonParserTest
cca2d44 Merge pull request #1517 from jhickson/boolvalue
835fb94 Fixed parsing of BoolValue.
2b22b61 Merge remote-tracking branch 'refs/remotes/google/master'
c67879b Merge pull request #1514 from pherl/fix_heap_check
0e4d1ea Initial draft for changelog.
e8737d8 Fix the command line interface unittest again
cdd3ec7 Merge pull request #1513 from pherl/fix-build-protoc
25dd690 Fix protoc build artifact script.
5dea201 Update version numbers for other languages
dbed8a0 Update version numbers for beta3
a1938b2 Merge pull request #1510 from thomasvl/nonnull
4755bdc Declare an init and avoid passing NULL to initWithValue:count:
4c6259b Merge pull request #1498 from thomasvl/build_cleanup
f4bc9e0 Remove confounding and unused #define - breaks boost/predef/other/endian.h
d392ed4 Merge pull request #1502 from pherl/master
ce6bec7 Remove accidentally restored deleted files.
9d0f560 Merge pull request #1494 from pherl/master
76a96d4 Merge pull request #1499 from beardedN5rd/master
2eb774e Changed the entry to g++ after Feng Xiao's comment
ace2690 Merge pull request #1496 from ozkuran/master
b661fb5 Add the missing maintainer-clean entry for benchmarks
ce2ef0d Properly express all outputs for the conformance build
f367642 Add two missing ignores for conformance directory.
b12f630 updated README
9b3357d Updated README.md
5b5e369 Merge pull request #1471 from jskeet/any-host
f8a5c5f Fix using std::shared_ptr
75e5898 Fix the std::string error introduced in integration.
17b6fc3 Merge pull request #1409 from eeight/fix_enum_corruption
72e162c Merge pull request #1482 from nicolasnoble/rake-tweaks-2
edd2949 Properly generating well known proto files for the macos build.
07bcf21 Merge pull request #1464 from google/benchmarks
7dda312 Merge pull request #1473 from nicolasnoble/rake-tweaks
247ef1f Addressed PR comments.
09f1757 Merge pull request #1467 from pherl/master
236b939 Addressing concerns.
e0df23a Update descriptor protos for objc
aed7b34 Merge pull request #1474 from pherl/fixscript
454d5be Merge the script fix.
cf7e99d Fix cp -r usage to be portable.
1f8b6da Few tweaks to the rakefile to permit native gems compilation with the proto files generation.
b2d4b1a Fixed for pre-C++11 ifstream which does not accept std::string.
49a8918 Read files directly from filesystem since xxd isn't always available.
cb36bde Make the C++ tests build the benchmarking code.
1ce5bd8 Updates for PR comments.
61307b8 Allow custom type URL prefixes in Any.Pack
f86d39c Update file lists.
12fdeb9 Merge branch 'master' of github.com:google/protobuf
cf14183 Down integrate from Google internal.
b53417c Merge pull request #1462 from acozzette/ruby-2.3
30a2f70 Added README describing the directory.
2e83110 Added framework for generating/consuming benchmarking data sets.
cbb6b28 Merge pull request #1461 from thomasvl/fix_bool_handing
3064628 Fix up -hash/-isEqual: for bool storage.
bbb68fe Added dig and bsearch_index to RepeatedField methods forwarded to array
f53f911 Merge pull request #1455 from haberman/updateupb
66f0745 Merge pull request #1454 from thomasvl/enum_defaults
18b6a32 Proper checking of enum with non zero default
d419ca1 Updated upb and simplified ruby code a bit with new upb method.
4057447 Merge pull request #1444 from mbrtargeting/master
385755e Add initial design document for Swift protocol buffers. (#1442)
db93833 Added serialVersionUID to ExtendableMessage.
034294b Merge pull request #1438 from xfxyjwf/docs
f4f9aec Fix bug with silent message corruption in LITE_RUNTIME.
0ad2048 Merge pull request #1416 from cwhipkey/master
8052dc7 Add a docs directory and move the third-party add-ons page here.
462e7fa protoc: support '=' in --proto_path arguments
f00300d Merge pull request #1414 from xyzzyz/googletest
1ccb4ca Merge pull request #1434 from jskeet/regenerate
c588ac4 Regenerate well-known types for C#
e539e98 Merge pull request #1433 from thomasvl/check_wkt
511f28b ObjC support for failing the build if the generated WKTs are out of date
f265fb8 Merge pull request #1401 from jskeet/enum-casing
36978c3 Merge pull request #1417 from seishun/windows2
d90d615 Attempt to fix AppVeyor build by exporting GetEnumValueName
790f4c8 Use the original name in JSON formatting.
84ea2c7 Regenerate all C# code and make it compile
75626ed Add C# codegen changes to enum value names (mostly C++)
1dc8194 Merge pull request #1428 from pherl/master
1b0ff34 Add missing includes in field mask test
52825bf Merge pull request #1426 from thomasvl/fix_comment_typo
e664aa6 Regenerate the WKT to pick up current changes to the proto files.
83a7a5e Merge pull request #1402 from davidzchen/py2and3
3633bcd Fix comment typo
072296f Merge pull request #1422 from pherl/master
1f4f3e2 Update file list to include the missing extension lite file.
7ff229f Support Windows in gulpfile.js
ca9bbd7 Merge pull request #1413 from haberman/updateupb
baf52bd Change protobuf CPP proto generator to support the 'lite' option in proto3.
e67ef3d Bugfix for JSON error case.
800e986 Remove no longer applicable documentation from README.md.
194ad62 Ruby JSON: always accept both camelCase and original field names.
1b912fc Remove googletest.h header from stringprintf.cc
90c7f6e Documented the JSON change and compatibility flags.
94e54b3 Updated upb: picked up legacy JSON flags to help Ruby users migrate.
814685c Merge pull request #1397 from google/internal-merge
3c4ce52 Fix for gulpfile.js.
2a197b3 Use 0 as the default value for all enums, rather than finding the actual enum value name
3ffbdd7 Merge pull request #1400 from jskeet/fix-internal
5ebeefb Add missing PY2AND3 srcs_versions attributes to Python Bazel build targets.
0a902ee Fix to csharp_options - initialize internal_access to false.
2cd79bf Removed duplicated operator delete from merge conflict.
4465daa Merge branch 'master' into internal-merge
667f4a6 Merge pull request #1393 from gvaish/master
1523936 Fix for CommonJS tests.
09292d5 Merge pull request #1392 from anandolee/master
a6e3931 Added support for internal_access for C#
cef46e1 Merge pull request #1390 from jskeet/tidy-up
c612074 sync the Manually integrate changes in google3/third_party
28c5c25 Merge pull request #1391 from thomasvl/string_tweaks
f98d2f5 Updating Xcode Settings to use iOS 9.3
2a18bb5 Add more documentation for csharp_options.h
bfd1c84 Line-wrapping changes only for C# generator code
c9167f2 Error during parsing for invalid UTF-8 instead of dropping data.
f3f5b3f Add tests to ensure we read strings with BOMs so we don't forget about lessons of the past.
89719f0 Merge pull request #1349 from gvaish/master
f3fe75b Merge pull request #1386 from andrewharp/patch-2
74d8b0b Added access_level for types
3b4e7dc Update BUILD
b56b461 Do not link in pthread library for Android builds.
a771c9e Merge pull request #852 from qzix/master
268ce28 Added deprecated option handling for objective-c generator
aeacf51 Merge pull request #1381 from pherl/internal-merge
7630a83 Merge branch 'master' into internal-merge
cba75ad Merge branch 'master' of github.com:google/protobuf
89343d8 Do not use C++11 unicode escape in unittest.
452e2b2 Merge pull request #1353 from keveman/master
cf828de Linking the cpp implementation extension statically with libprotobuf
93811ca Do not let windows.h define min/max macros
3b6df06 Allow bigobj for map_unittest
9d7a172 Merge pull request #1377 from jskeet/remove-duplicate-tests
a293180 Merge pull request #1378 from thomasvl/manual_stream_parsing_helpers
331cee5 Add -position and -isAtEnd for use when manually parsing input streams.
46e088e Remove duplicate test cases.
099ff1e Merge pull request #1369 from jskeet/tools-nuspec
203bb5e Fix re-definition issue of winsock.h and winsock2.h
1bf446c Disable sign-compare warning.
7b1cbbd Fix signed-compare warning.
012ac9a revert unexpected change for py26
bc1f2e7 Fix WIN32 build for map_test.
cbfd9d4 Remove export macros for classes nested in a template class.
81eb84c Merge pull request #1371 from keveman/oversize_protos
1283625 Added an API to allow oversize protos when using C++ extension in Python
5805c2d Fix javanano package
fc7eeda Fix json_format.py in py26
5c63266 Merge pull request #1366 from xyzzyz/int128_ostream
94aa50f Fix breakage of referring to table_ in static func
9e7fa06 Temporarily disable begin is fast test.
dfd4760 Remove duplicate line
ca0461c Introduce a new nuget package, Google.Protobuf.Tools, basically to contain protoc on multiple platforms.
a16c8a5 Merge pull request #1362 from jskeet/tweak_json_name
955841e Replace #include <iostream> with #include <ostream>
0de06f5 Merge branch 'master' of github.com:google/protobuf
3b3c8ab Integrate google internal changes.
a25e996 Merge pull request #1360 from pherl/master
34d0cc2 Merge pull request #1295 from haberman/docker
71e8dca Refactoring of FieldDescriptor
261fde1 Merge pull request #1326 from the-alien/csharp_json_name
a15b916 Merge branch 'master' into docker
e164f10 Use the T() instead of NULL for the default value.
af34538 Merge branch 'master' of https://github.com/google/protobuf into csharp_json_name
6f8dd21 Code review fixes
261ee02 Merge pull request #1358 from thomasvl/travis_tweaks
8d47d78 Mark iOS tests as able to fail.
9240acd Merge pull request #1350 from thomasvl/over_release
3f91744 The message was autoreleased, the -releases are an over release.
8126912 Merge pull request #1345 from smparkes/smparkes/well-known-protos
d5a5732 export well known protos
34eeeff Merge pull request #1344 from topillar/patch-1
64dfb5f Update coded_stream.h
698fa8e Merge pull request #1335 from pradeepg26/master
9209136 Merge pull request #1339 from thomasvl/delay_dispatch_semaphore_creation
bd41a39 Only create the readonlySemaphore on demand.
4d98369 Allow custom URLs for Any in JsonFormat
cab5eae Replace ancient m4/acx_pthread.m4 with m4/ax_pthread.m4
0d32ab3 csharp: add support for the json_name option
5e93384 Merge pull request #1325 from thomasvl/shrink_overhead
79a23c4 Shrink ObjC overhead (generated size and some runtime sizes)
ca3dc15 Merge pull request #1318 from smparkes/smparkes/grpc
44fdead Merge pull request #1291 from sergiocampama/devel
9aea0ef Merge pull request #1312 from petewarden/master
a9244ca add java/util support based on java/util/pom.xml
c71f184 Merge pull request #1278 from smparkes/master
dfaf1aa Merge pull request #1317 from benvanik/patch-1
58f0764 Fixing compilation error when building with emscripten.
ea18866 pass correct args to protoc for java wellknown protos when used as an external repository
bc2d6c2 Merge remote-tracking branch 'upstream/master'
f0c1a86 Added iOS settings to Bazel build
48ebb29 Merge pull request #1299 from tatraian/master
e2fb1d9 Comment has been added to fix (issue #1266)
a8db268 Merge pull request #1309 from thomasvl/leading_special_prop_names
1bf4b38 Fix up handing of fields with leading names that should be all caps.
3dd3238 Merge pull request #1306 from silviulica/master
4573edb Update version to 3.0.0b2.post2
6a8815b Merge pull request #1304 from thomasvl/headerdocs
36650a0 HeaderDoc support in the library and generated sources
f2d3408 Merge pull request #1301 from avgweb/master
ad2d775 Replace StringBuilder with TextWriter in JsonFormatter
9242d9b Merge pull request #1298 from craigcitro/fix_setup
3cc35ad Fix compiling clang/libc++ builds. (Issue: #1266)
0e7c0c2 Add back the namespace_packages arg in setup.py.
e70f925 Merge pull request #1139 from haberman/rubyjsoncamel
67c727c Rearranged and commented files for running under Jenkins.
37663e8 Merge pull request #1292 from haberman/ruby-allow-descriptor
35227b4 Removed the generated Ruby file from Makefile.am.
7d793c1 Disable attempt to use ccache for docker build.
2bda98f Properly report C++ build time.
1ee0fda Use a local Maven repository to avoid network fetches during tests.
513875d Generate well-known types in Ruby extension and prune unneeded proto2 dependencies.
b5a35b4 Adds more information to Objective C error when the expected objc_class_prefix option is missing.
2f3f1de Make Java copy into separate directories so the tests can run concurrently.
38bc155 Added code to generate XML output file for more granular results.
ffc8118 Added Ruby 2.1, Oracle Java, and C#.
78f9b68 Upgrade Python packages using pip.
f6153b5 Work around tox bug.
b28b3f6 Configure ccache directory.
d08c39c Put Maven in batch mode to avoid spamming the logs.
483533d Install Python deps in Docker image.
0b931bc Add another test (javanano), but run it in parallel.
0f8c25d Properly add JDK deps in the Docker image.
738393b Try running multiple tests in a row.
d33e93b Added ccache support.
57be1d7 Added some initial shell scripts and docker file.
7810589 Merge pull request #1260 from legrosbuffle/master
584233b Merge pull request #1287 from jskeet/fix-typo
f222a9a Fix copy/paste typo in CodedInputStreamTest
52f62e3 Merge pull request #1274 from murgatroid99/node_relative_requires
9f775a8 Merge pull request #1286 from jskeet/idisposable
c0cf71b Implement IDisposable for CodedInputStream and CodedOutputStream
60a0d41 Merge pull request #1233 from davidzchen/python-path
985c968 Remove hack for building Python support with Bazel.
fb714b3 Merge pull request #1275 from keveman/grpc_support
f5c7363 Fixed grpc C++ plugin support.
c9f8a1b Moved CommonJS-specific files into commonjs directory
a862b6b Fix CommonJS relative require generation, and test it
cc775f7 Merge pull request #1259 from silviulica/master
fc51bdc Merge pull request #1268 from keveman/grpc_support
cb39204 Updated library generation with iOS options
f0966a7 Added grpc plugin support to cc_proto_library.
8f67b16 Merge pull request #1267 from jskeet/vs2015
513a8a6 Merge pull request #804 from bsilver8192/master
4237146 Require VS2015 in the solution file
32daf51 Merge pull request #1215 from haberman/commonjs
24c5424 Added a bit more to README.md, and allowed custom PROTOC var in tests.
894c4d6 Merge pull request #1257 from thomasvl/objc_generics
b3d802d Make cpp generated enum constants constexpr when Options::proto_h is specified.
786f80f Add a modified patch from craigcitro@ to handle namespace sharing.
c003abb Merge pull request #1240 from jskeet/validate_group
4ab9186 Merge pull request #1258 from haberman/releasenotes
81e75c1 Some fixes for the most recent release notes.
f654d49 Updated upb from latest changes.
2480acb Support ObjC Generic Collections
78da666 Changed Ruby to properly camelCase its JSON by default.
907ad4a Properly camelCase when translating to CommonJS.
29d58d3 Removed unused directives from tests that aren't run under CommonJS.
c348af2 Addressed more code review comments.
7726cd2 Integrate review comments.
5195b7f Greatly expanded README.md.
59ea500 Use "node" as binary instead of "nodejs".
35298f9 Fixed definition of extensions, and added CommonJS tests to Travis.
77af5d0 Fixed nested message scopes for CommonJS.
d6a186a Added some documentation in comments.
9e60036 Moved CommonJS-specific files to commonjs/.
e9f31ee CommonJS tests are now passing.
55cc3aa WIP.
9ab11c6 Merge pull request #1255 from thomasvl/mark_os_x_python_cpp_failing
e0dd14c List python_cpp as failing on OS X
507213b Merge pull request #1254 from thomasvl/disable_xctool_updates
8c78450 Disable the xctool updates
abc09f7 Merge pull request #1239 from jskeet/call_generate_protos
c40f8c1 Merge pull request #1229 from keveman/unlimited_binary_proto
61e8e21 Merge pull request #1241 from jskeet/more-merge-wrapper-tests
d41db75 Merge pull request #260 from ejsd1989/issue-#242
8fc045d Merge pull request #1224 from google/rubysentinel
99a3e30 Added PROTOBUF_PYTHON_ALLOW_OVERSIZE_PROTOS macro and setting it when --allow_oversize_protos=true is passed to bazel build. When this macro is set, SetTotalBytesLimit is called to remove the 64MB limit on binary protos when during ParseFromString.
7cf5b81 Merge pull request #1247 from thomasvl/xctool_plain_output
efca368 Move the xctool use of -reporter into a common spot and always use "plain" to get more readable logs on travis.
ee819ea Merge pull request #1245 from thomasvl/tweak_xctool_ios_run
30e645b Tweak the xctool run for iOS tests to try and sort out flake
7d1cc10 Merge pull request #1244 from thomasvl/bump_xcode_version
1324119 Bump up travis to Xcode 7.2
0262e04 Add more tests around merging wrappers
9bdc848 Validate that end-group tags match their corresponding start-group tags
957e877 Generate C# code whenever descriptor.proto changes
69ac430 Removed 'optional' from proto3 syntax file.
fd1c289 Adding missing generic gcc 64-bit atomicops.
ff156e4 Add atomics support for 32-bit PPC.
914605c Removal of null check
a1c5e45 Removal of null check

Change-Id: Id88028efe92b49fd68a02de01e4f991765798c40
git-subtree-dir: third_party/protobuf
git-subtree-split: 48cb18e5c419ddd23d9badcfe4e9df7bde1979b2
diff --git a/python/MANIFEST.in b/python/MANIFEST.in
index 2608882..5fb0192 100644
--- a/python/MANIFEST.in
+++ b/python/MANIFEST.in
@@ -4,6 +4,9 @@
 exclude google/protobuf/internal/*.proto
 exclude google/protobuf/internal/test_util.py
 
+recursive-include google *.cc
+recursive-include google *.h
+
 recursive-exclude google *_test.py
 recursive-exclude google *_test.proto
 recursive-exclude google unittest*_pb2.py
diff --git a/python/README.md b/python/README.md
index 1b5b9df..4c19429 100644
--- a/python/README.md
+++ b/python/README.md
@@ -32,77 +32,84 @@
 
 1) Make sure you have Python 2.6 or newer.  If in doubt, run:
 
-     $ python -V
+       $ python -V
 
 2) If you do not have setuptools installed, note that it will be
-   downloaded and installed automatically as soon as you run setup.py.
+   downloaded and installed automatically as soon as you run `setup.py`.
    If you would rather install it manually, you may do so by following
-   the instructions on this page:
+   the instructions on [this page](https://packaging.python.org/en/latest/installing.html#setup-for-installing-packages).
 
-     https://packaging.python.org/en/latest/installing.html#setup-for-installing-packages
-
-3) Build the C++ code, or install a binary distribution of protoc.  If
+3) Build the C++ code, or install a binary distribution of `protoc`.  If
    you install a binary distribution, make sure that it is the same
    version as this package.  If in doubt, run:
 
-     $ protoc --version
+       $ protoc --version
 
 4) Build and run the tests:
 
-     $ python setup.py build
-     $ python setup.py test
+       $ python setup.py build
+       $ python setup.py test
 
-     To build, test, and use the C++ implementation, you must first compile
-     libprotobuf.so:
+   To build, test, and use the C++ implementation, you must first compile
+   `libprotobuf.so`:
 
-     $ (cd .. && make)
+       $ (cd .. && make)
 
-     On OS X:
+   On OS X:
 
-      If you are running a homebrew-provided python, you must make sure another
-      version of protobuf is not already installed, as homebrew's python will
-      search /usr/local/lib for libprotobuf.so before it searches ../src/.libs
-      You can either unlink homebrew's protobuf or install the libprotobuf you
-      built earlier:
+   If you are running a Homebrew-provided Python, you must make sure another
+   version of protobuf is not already installed, as Homebrew's Python will
+   search `/usr/local/lib` for `libprotobuf.so` before it searches
+   `../src/.libs`.
 
-      $ brew unlink protobuf
-      or
-      $ (cd .. && make install)
+   You can either unlink Homebrew's protobuf or install the `libprotobuf` you
+   built earlier:
 
-     On other *nix:
+       $ brew unlink protobuf
 
-      You must make libprotobuf.so dynamically available. You can either
-      install libprotobuf you built earlier, or set LD_LIBRARY_PATH:
+   or
 
-      $ export LD_LIBRARY_PATH=../src/.libs
-      or
-      $ (cd .. && make install)
+       $ (cd .. && make install)
 
-     To build the C++ implementation run:
-     $ python setup.py build --cpp_implementation
+   On other *nix:
 
-     Then run the tests like so:
-     $ python setup.py test --cpp_implementation
+   You must make `libprotobuf.so` dynamically available. You can either
+   install the `libprotobuf` you built earlier, or set `LD_LIBRARY_PATH`:
+
+       $ export LD_LIBRARY_PATH=../src/.libs
+
+   or
+
+       $ (cd .. && make install)
+
+   To build the C++ implementation run:
+
+       $ python setup.py build --cpp_implementation
+
+   Then run the tests like so:
+
+       $ python setup.py test --cpp_implementation
 
    If some tests fail, this library may not work correctly on your
    system.  Continue at your own risk.
 
    Please note that there is a known problem with some versions of
    Python on Cygwin which causes the tests to fail after printing the
-   error:  "sem_init: Resource temporarily unavailable".  This appears
-   to be a bug either in Cygwin or in Python:
-     http://www.cygwin.com/ml/cygwin/2005-07/msg01378.html
-   We do not know if or when it might me fixed.  We also do not know
+   error:  `sem_init: Resource temporarily unavailable`.  This appears
+   to be a [bug either in Cygwin or in
+   Python](http://www.cygwin.com/ml/cygwin/2005-07/msg01378.html).
+
+   We do not know if or when it might be fixed.  We also do not know
    how likely it is that this bug will affect users in practice.
 
 5) Install:
 
-    $ python setup.py install
+       $ python setup.py install
 
-  or:
+   or:
 
-    $ (cd .. && make install)
-    $ python setup.py install --cpp_implementation
+       $ (cd .. && make install)
+       $ python setup.py install --cpp_implementation
 
    This step may require superuser privileges.
    NOTE: To use C++ implementation, you need to export an environment
@@ -123,13 +130,5 @@
 The C++ implementation for Python messages is built as a Python extension to
 improve the overall protobuf Python performance.
 
-To use the C++ implementation, you need to:
-1) Install the C++ protobuf runtime library, please see instructions in the
-   parent directory.
-2) Export an environment variable:
-
-    $ export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp
-
-You must set this variable at runtime, before running your program, otherwise
-the pure-Python implementation will be used. In a future release, we will
-change the default so that C++ implementation is used whenever it is available.
+To use the C++ implementation, you need to install the C++ protobuf runtime
+library; please see the instructions in the parent directory.
diff --git a/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/factory_test1.proto b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/factory_test1.proto
new file mode 100644
index 0000000..9f55e03
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/factory_test1.proto
@@ -0,0 +1,55 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: matthewtoia@google.com (Matt Toia)
+
+
+package google.protobuf.python.internal;
+
+
+enum Factory1Enum {
+  FACTORY_1_VALUE_0 = 0;
+  FACTORY_1_VALUE_1 = 1;
+}
+
+message Factory1Message {
+  optional Factory1Enum factory_1_enum = 1;
+  enum NestedFactory1Enum {
+    NESTED_FACTORY_1_VALUE_0 = 0;
+    NESTED_FACTORY_1_VALUE_1 = 1;
+  }
+  optional NestedFactory1Enum nested_factory_1_enum = 2;
+  message NestedFactory1Message {
+    optional string value = 1;
+  }
+  optional NestedFactory1Message nested_factory_1_message = 3;
+  optional int32 scalar_value = 4;
+  repeated string list_value = 5;
+}
diff --git a/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/factory_test2.proto b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/factory_test2.proto
new file mode 100644
index 0000000..d3ce4d7
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/factory_test2.proto
@@ -0,0 +1,77 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: matthewtoia@google.com (Matt Toia)
+
+
+package google.protobuf.python.internal;
+
+import "google/protobuf/internal/factory_test1.proto";
+
+
+enum Factory2Enum {
+  FACTORY_2_VALUE_0 = 0;
+  FACTORY_2_VALUE_1 = 1;
+}
+
+message Factory2Message {
+  required int32 mandatory = 1;
+  optional Factory2Enum factory_2_enum = 2;
+  enum NestedFactory2Enum {
+    NESTED_FACTORY_2_VALUE_0 = 0;
+    NESTED_FACTORY_2_VALUE_1 = 1;
+  }
+  optional NestedFactory2Enum nested_factory_2_enum = 3;
+  message NestedFactory2Message {
+    optional string value = 1;
+  }
+  optional NestedFactory2Message nested_factory_2_message = 4;
+  optional Factory1Message factory_1_message = 5;
+  optional Factory1Enum factory_1_enum = 6;
+  optional Factory1Message.NestedFactory1Enum nested_factory_1_enum = 7;
+  optional Factory1Message.NestedFactory1Message nested_factory_1_message = 8;
+  optional Factory2Message circular_message = 9;
+  optional string scalar_value = 10;
+  repeated string list_value = 11;
+  repeated group Grouped = 12 {
+    optional string part_1 = 13;
+    optional string part_2 = 14;
+  }
+  optional LoopMessage loop = 15;
+  optional int32 int_with_default = 16 [default = 1776];
+  optional double double_with_default = 17 [default = 9.99];
+  optional string string_with_default = 18 [default = "hello world"];
+  optional bool bool_with_default = 19 [default = false];
+  optional Factory2Enum enum_with_default = 20 [default = FACTORY_2_VALUE_1];
+}
+
+message LoopMessage {
+  optional Factory2Message loop = 1;
+}
diff --git a/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_extensions.proto b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_extensions.proto
new file mode 100644
index 0000000..e2d9701
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_extensions.proto
@@ -0,0 +1,58 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: robinson@google.com (Will Robinson)
+
+
+package google.protobuf.internal;
+
+
+message TopLevelMessage {
+  optional ExtendedMessage submessage = 1;
+}
+
+
+message ExtendedMessage {
+  extensions 1 to max;
+}
+
+
+message ForeignMessage {
+  optional int32 foreign_message_int = 1;
+}
+
+
+extend ExtendedMessage {
+  optional int32 optional_int_extension = 1;
+  optional ForeignMessage optional_message_extension = 2;
+
+  repeated int32 repeated_int_extension = 3;
+  repeated ForeignMessage repeated_message_extension = 4;
+}
diff --git a/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_extensions_dynamic.proto b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_extensions_dynamic.proto
new file mode 100644
index 0000000..df98ac4
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_extensions_dynamic.proto
@@ -0,0 +1,49 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: jasonh@google.com (Jason Hsueh)
+//
+// This file is used to test a corner case in the CPP implementation where the
+// generated C++ type is available for the extendee, but the extension is
+// defined in a file whose C++ type is not in the binary.
+
+
+import "google/protobuf/internal/more_extensions.proto";
+
+package google.protobuf.internal;
+
+message DynamicMessageType {
+  optional int32 a = 1;
+}
+
+extend ExtendedMessage {
+  optional int32 dynamic_int32_extension = 100;
+  optional DynamicMessageType dynamic_message_extension = 101;
+}
diff --git a/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_messages.proto b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_messages.proto
new file mode 100644
index 0000000..c701b44
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/more_messages.proto
@@ -0,0 +1,51 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: robinson@google.com (Will Robinson)
+
+
+package google.protobuf.internal;
+
+// A message where tag numbers are listed out of order, to allow us to test our
+// canonicalization of serialized output, which should always be in tag order.
+// We also mix in some extensions for extra fun.
+message OutOfOrderFields {
+  optional   sint32 optional_sint32   =  5;
+  extensions 4 to 4;
+  optional   uint32 optional_uint32   =  3;
+  extensions 2 to 2;
+  optional    int32 optional_int32    =  1;
+};
+
+
+extend OutOfOrderFields {
+  optional   uint64 optional_uint64   =  4;
+  optional    int64 optional_int64    =  2;
+}
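The comment in `OutOfOrderFields` above says canonicalized serialized output should always be in tag order, regardless of declaration order. As a hedged illustration only (plain Python, not the protobuf runtime or any generated code), here is a minimal sketch of why: each field is written with a key derived from its field number, and a canonicalizing serializer emits fields sorted by that number.

```python
# Minimal sketch of canonical (tag-ordered) serialization.  This is NOT the
# protobuf runtime; it only models varint fields (wire type 0) to show the
# key layout: key = (field_number << 3) | wire_type.
def varint(n):
    """Encode a non-negative int as a base-128 varint."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        out.append(b | (0x80 if n else 0))
        if not n:
            return bytes(out)

def serialize(fields):
    """fields: {field_number: int_value}.  Canonical form sorts by tag."""
    out = bytearray()
    for number in sorted(fields):           # ascending tag order
        out += varint((number << 3) | 0)    # field key, wire type 0
        out += varint(fields[number])       # value
    return bytes(out)

# OutOfOrderFields declares tags 5, 3, 1 in that order, but canonical
# output is keyed 1, 3, 5 no matter how the dict is populated.
blob = serialize({5: 7, 3: 7, 1: 7})
```

The compatibility test relies on exactly this property: two encoders that agree on canonical ordering produce byte-identical output for the same field values.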
diff --git a/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/test_bad_identifiers.proto b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/test_bad_identifiers.proto
new file mode 100644
index 0000000..6a82299
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/python/google/protobuf/internal/test_bad_identifiers.proto
@@ -0,0 +1,52 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: kenton@google.com (Kenton Varda)
+
+
+package protobuf_unittest;
+
+option py_generic_services = true;
+
+message TestBadIdentifiers {
+  extensions 100 to max;
+}
+
+// Make sure these reasonable extension names don't conflict with internal
+// variables.
+extend TestBadIdentifiers {
+  optional string message = 100 [default="foo"];
+  optional string descriptor = 101 [default="bar"];
+  optional string reflection = 102 [default="baz"];
+  optional string service = 103 [default="qux"];
+}
+
+message AnotherMessage {}
+service AnotherService {}
diff --git a/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/descriptor.proto b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/descriptor.proto
new file mode 100644
index 0000000..a785f79
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/descriptor.proto
@@ -0,0 +1,620 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: kenton@google.com (Kenton Varda)
+//  Based on original Protocol Buffers design by
+//  Sanjay Ghemawat, Jeff Dean, and others.
+//
+// The messages in this file describe the definitions found in .proto files.
+// A valid .proto file can be translated directly to a FileDescriptorProto
+// without any other information (e.g. without reading its imports).
+
+
+
+package google.protobuf;
+option java_package = "com.google.protobuf";
+option java_outer_classname = "DescriptorProtos";
+
+// descriptor.proto must be optimized for speed because reflection-based
+// algorithms don't work during bootstrapping.
+option optimize_for = SPEED;
+
+// The protocol compiler can output a FileDescriptorSet containing the .proto
+// files it parses.
+message FileDescriptorSet {
+  repeated FileDescriptorProto file = 1;
+}
+
+// Describes a complete .proto file.
+message FileDescriptorProto {
+  optional string name = 1;       // file name, relative to root of source tree
+  optional string package = 2;    // e.g. "foo", "foo.bar", etc.
+
+  // Names of files imported by this file.
+  repeated string dependency = 3;
+  // Indexes of the public imported files in the dependency list above.
+  repeated int32 public_dependency = 10;
+  // Indexes of the weak imported files in the dependency list.
+  // For Google-internal migration only. Do not use.
+  repeated int32 weak_dependency = 11;
+
+  // All top-level definitions in this file.
+  repeated DescriptorProto message_type = 4;
+  repeated EnumDescriptorProto enum_type = 5;
+  repeated ServiceDescriptorProto service = 6;
+  repeated FieldDescriptorProto extension = 7;
+
+  optional FileOptions options = 8;
+
+  // This field contains optional information about the original source code.
+  // You may safely remove this entire field without harming runtime
+  // functionality of the descriptors -- the information is needed only by
+  // development tools.
+  optional SourceCodeInfo source_code_info = 9;
+}
+
+// Describes a message type.
+message DescriptorProto {
+  optional string name = 1;
+
+  repeated FieldDescriptorProto field = 2;
+  repeated FieldDescriptorProto extension = 6;
+
+  repeated DescriptorProto nested_type = 3;
+  repeated EnumDescriptorProto enum_type = 4;
+
+  message ExtensionRange {
+    optional int32 start = 1;
+    optional int32 end = 2;
+  }
+  repeated ExtensionRange extension_range = 5;
+
+  optional MessageOptions options = 7;
+}
+
+// Describes a field within a message.
+message FieldDescriptorProto {
+  enum Type {
+    // 0 is reserved for errors.
+    // Order is weird for historical reasons.
+    TYPE_DOUBLE         = 1;
+    TYPE_FLOAT          = 2;
+    // Not ZigZag encoded.  Negative numbers take 10 bytes.  Use TYPE_SINT64 if
+    // negative values are likely.
+    TYPE_INT64          = 3;
+    TYPE_UINT64         = 4;
+    // Not ZigZag encoded.  Negative numbers take 10 bytes.  Use TYPE_SINT32 if
+    // negative values are likely.
+    TYPE_INT32          = 5;
+    TYPE_FIXED64        = 6;
+    TYPE_FIXED32        = 7;
+    TYPE_BOOL           = 8;
+    TYPE_STRING         = 9;
+    TYPE_GROUP          = 10;  // Tag-delimited aggregate.
+    TYPE_MESSAGE        = 11;  // Length-delimited aggregate.
+
+    // New in version 2.
+    TYPE_BYTES          = 12;
+    TYPE_UINT32         = 13;
+    TYPE_ENUM           = 14;
+    TYPE_SFIXED32       = 15;
+    TYPE_SFIXED64       = 16;
+    TYPE_SINT32         = 17;  // Uses ZigZag encoding.
+    TYPE_SINT64         = 18;  // Uses ZigZag encoding.
+  };
+
+  enum Label {
+    // 0 is reserved for errors
+    LABEL_OPTIONAL      = 1;
+    LABEL_REQUIRED      = 2;
+    LABEL_REPEATED      = 3;
+    // TODO(sanjay): Should we add LABEL_MAP?
+  };
+
+  optional string name = 1;
+  optional int32 number = 3;
+  optional Label label = 4;
+
+  // If type_name is set, this need not be set.  If both this and type_name
+  // are set, this must be either TYPE_ENUM or TYPE_MESSAGE.
+  optional Type type = 5;
+
+  // For message and enum types, this is the name of the type.  If the name
+  // starts with a '.', it is fully-qualified.  Otherwise, C++-like scoping
+  // rules are used to find the type (i.e. first the nested types within this
+  // message are searched, then within the parent, on up to the root
+  // namespace).
+  optional string type_name = 6;
+
+  // For extensions, this is the name of the type being extended.  It is
+  // resolved in the same manner as type_name.
+  optional string extendee = 2;
+
+  // For numeric types, contains the original text representation of the value.
+  // For booleans, "true" or "false".
+  // For strings, contains the default text contents (not escaped in any way).
+  // For bytes, contains the C escaped value.  All bytes >= 128 are escaped.
+  // TODO(kenton):  Base-64 encode?
+  optional string default_value = 7;
+
+  optional FieldOptions options = 8;
+}
+
+// Describes an enum type.
+message EnumDescriptorProto {
+  optional string name = 1;
+
+  repeated EnumValueDescriptorProto value = 2;
+
+  optional EnumOptions options = 3;
+}
+
+// Describes a value within an enum.
+message EnumValueDescriptorProto {
+  optional string name = 1;
+  optional int32 number = 2;
+
+  optional EnumValueOptions options = 3;
+}
+
+// Describes a service.
+message ServiceDescriptorProto {
+  optional string name = 1;
+  repeated MethodDescriptorProto method = 2;
+
+  optional ServiceOptions options = 3;
+}
+
+// Describes a method of a service.
+message MethodDescriptorProto {
+  optional string name = 1;
+
+  // Input and output type names.  These are resolved in the same way as
+  // FieldDescriptorProto.type_name, but must refer to a message type.
+  optional string input_type = 2;
+  optional string output_type = 3;
+
+  optional MethodOptions options = 4;
+}
+
+
+// ===================================================================
+// Options
+
+// Each of the definitions above may have "options" attached.  These are
+// just annotations which may cause code to be generated slightly differently
+// or may contain hints for code that manipulates protocol messages.
+//
+// Clients may define custom options as extensions of the *Options messages.
+// These extensions may not yet be known at parsing time, so the parser cannot
+// store the values in them.  Instead it stores them in a field in the *Options
+// message called uninterpreted_option. This field must have the same name
+// across all *Options messages. We then use this field to populate the
+// extensions when we build a descriptor, at which point all protos have been
+// parsed and so all extensions are known.
+//
+// Extension numbers for custom options may be chosen as follows:
+// * For options which will only be used within a single application or
+//   organization, or for experimental options, use field numbers 50000
+//   through 99999.  It is up to you to ensure that you do not use the
+//   same number for multiple options.
+// * For options which will be published and used publicly by multiple
+//   independent entities, e-mail protobuf-global-extension-registry@google.com
+//   to reserve extension numbers. Simply provide your project name (e.g.
+  //   Objective-C plugin) and your project website (if available) -- there's no need
+//   to explain how you intend to use them. Usually you only need one extension
+//   number. You can declare multiple options with only one extension number by
+//   putting them in a sub-message. See the Custom Options section of the docs
+//   for examples:
+//   http://code.google.com/apis/protocolbuffers/docs/proto.html#options
+//   If this turns out to be popular, a web service will be set up
+//   to automatically assign option numbers.
+
+
+message FileOptions {
+
+  // Sets the Java package where classes generated from this .proto will be
+  // placed.  By default, the proto package is used, but this is often
+  // inappropriate because proto packages do not normally start with backwards
+  // domain names.
+  optional string java_package = 1;
+
+
+  // If set, all the classes from the .proto file are wrapped in a single
+  // outer class with the given name.  This applies to both Proto1
+  // (equivalent to the old "--one_java_file" option) and Proto2 (where
+  // a .proto always translates to a single class, but you may want to
+  // explicitly choose the class name).
+  optional string java_outer_classname = 8;
+
+  // If set true, then the Java code generator will generate a separate .java
+  // file for each top-level message, enum, and service defined in the .proto
+  // file.  Thus, these types will *not* be nested inside the outer class
+  // named by java_outer_classname.  However, the outer class will still be
+  // generated to contain the file's getDescriptor() method as well as any
+  // top-level extensions defined in the file.
+  optional bool java_multiple_files = 10 [default=false];
+
+  // If set true, then the Java code generator will generate equals() and
+  // hashCode() methods for all messages defined in the .proto file. This is
+  // purely a speed optimization, as the AbstractMessage base class includes
+  // reflection-based implementations of these methods.
+  optional bool java_generate_equals_and_hash = 20 [default=false];
+
+  // Generated classes can be optimized for speed or code size.
+  enum OptimizeMode {
+    SPEED = 1;        // Generate complete code for parsing, serialization,
+                      // etc.
+    CODE_SIZE = 2;    // Use ReflectionOps to implement these methods.
+    LITE_RUNTIME = 3; // Generate code using MessageLite and the lite runtime.
+  }
+  optional OptimizeMode optimize_for = 9 [default=SPEED];
+
+  // Sets the Go package where structs generated from this .proto will be
+  // placed.  There is no default.
+  optional string go_package = 11;
+
+
+
+  // Should generic services be generated in each language?  "Generic" services
+  // are not specific to any particular RPC system.  They are generated by the
+  // main code generators in each language (without additional plugins).
+  // Generic services were the only kind of service generation supported by
+  // early versions of proto2.
+  //
+  // Generic services are now considered deprecated in favor of using plugins
+  // that generate code specific to your particular RPC system.  Therefore,
+  // these default to false.  Old code which depends on generic services should
+  // explicitly set them to true.
+  optional bool cc_generic_services = 16 [default=false];
+  optional bool java_generic_services = 17 [default=false];
+  optional bool py_generic_services = 18 [default=false];
+
+  // The parser stores options it doesn't recognize here. See above.
+  repeated UninterpretedOption uninterpreted_option = 999;
+
+  // Clients can define custom options in extensions of this message. See above.
+  extensions 1000 to max;
+}
+
+message MessageOptions {
+  // Set true to use the old proto1 MessageSet wire format for extensions.
+  // This is provided for backwards-compatibility with the MessageSet wire
+  // format.  You should not use this for any other reason:  It's less
+  // efficient, has fewer features, and is more complicated.
+  //
+  // The message must be defined exactly as follows:
+  //   message Foo {
+  //     option message_set_wire_format = true;
+  //     extensions 4 to max;
+  //   }
+  // Note that the message cannot have any defined fields; MessageSets only
+  // have extensions.
+  //
+  // All extensions of your type must be singular messages; e.g. they cannot
+  // be int32s, enums, or repeated messages.
+  //
+  // Because this is an option, the above two restrictions are not enforced by
+  // the protocol compiler.
+  optional bool message_set_wire_format = 1 [default=false];
+
+  // Disables the generation of the standard "descriptor()" accessor, which can
+  // conflict with a field of the same name.  This is meant to make migration
+  // from proto1 easier; new code should avoid fields named "descriptor".
+  optional bool no_standard_descriptor_accessor = 2 [default=false];
+
+  // The parser stores options it doesn't recognize here. See above.
+  repeated UninterpretedOption uninterpreted_option = 999;
+
+  // Clients can define custom options in extensions of this message. See above.
+  extensions 1000 to max;
+}
+
+message FieldOptions {
+  // The ctype option instructs the C++ code generator to use a different
+  // representation of the field than it normally would.  See the specific
+  // options below.  This option is not yet implemented in the open source
+  // release -- sorry, we'll try to include it in a future version!
+  optional CType ctype = 1 [default = STRING];
+  enum CType {
+    // Default mode.
+    STRING = 0;
+
+    CORD = 1;
+
+    STRING_PIECE = 2;
+  }
+  // The packed option can be enabled for repeated primitive fields to enable
+  // a more efficient representation on the wire. Rather than repeatedly
+  // writing the tag and type for each element, the entire array is encoded as
+  // a single length-delimited blob.
+  optional bool packed = 2;
+
+
+
+  // Should this field be parsed lazily?  Lazy applies only to message-type
+  // fields.  It means that when the outer message is initially parsed, the
+  // inner message's contents will not be parsed but instead stored in encoded
+  // form.  The inner message will actually be parsed when it is first accessed.
+  //
+  // This is only a hint.  Implementations are free to choose whether to use
+  // eager or lazy parsing regardless of the value of this option.  However,
+  // setting this option true suggests that the protocol author believes that
+  // using lazy parsing on this field is worth the additional bookkeeping
+  // overhead typically needed to implement it.
+  //
+  // This option does not affect the public interface of any generated code;
+  // all method signatures remain the same.  Furthermore, thread-safety of the
+  // interface is not affected by this option; const methods remain safe to
+  // call from multiple threads concurrently, while non-const methods continue
+  // to require exclusive access.
+  //
+  //
+  // Note that implementations may choose not to check required fields within
+  // a lazy sub-message.  That is, calling IsInitialized() on the outer message
+  // may return true even if the inner message has missing required fields.
+  // This is necessary because otherwise the inner message would have to be
+  // parsed in order to perform the check, defeating the purpose of lazy
+  // parsing.  An implementation which chooses not to check required fields
+  // must be consistent about it.  That is, for any particular sub-message, the
+  // implementation must either *always* check its required fields, or *never*
+  // check its required fields, regardless of whether or not the message has
+  // been parsed.
+  optional bool lazy = 5 [default=false];
+
+  // Is this field deprecated?
+  // Depending on the target platform, this can emit Deprecated annotations
+  // for accessors, or it will be completely ignored; in the very least, this
+  // is a formalization for deprecating fields.
+  optional bool deprecated = 3 [default=false];
+
+  // EXPERIMENTAL.  DO NOT USE.
+  // For "map" fields, the name of the field in the enclosed type that
+  // is the key for this map.  For example, suppose we have:
+  //   message Item {
+  //     required string name = 1;
+  //     required string value = 2;
+  //   }
+  //   message Config {
+  //     repeated Item items = 1 [experimental_map_key="name"];
+  //   }
+  // In this situation, the map key for Item will be set to "name".
+  // TODO: Fully-implement this, then remove the "experimental_" prefix.
+  optional string experimental_map_key = 9;
+
+  // For Google-internal migration only. Do not use.
+  optional bool weak = 10 [default=false];
+
+  // The parser stores options it doesn't recognize here. See above.
+  repeated UninterpretedOption uninterpreted_option = 999;
+
+  // Clients can define custom options in extensions of this message. See above.
+  extensions 1000 to max;
+}
+
+message EnumOptions {
+
+  // Set this option to false to disallow mapping different tag names to the same
+  // value.
+  optional bool allow_alias = 2 [default=true];
+
+  // The parser stores options it doesn't recognize here. See above.
+  repeated UninterpretedOption uninterpreted_option = 999;
+
+  // Clients can define custom options in extensions of this message. See above.
+  extensions 1000 to max;
+}
+
+message EnumValueOptions {
+  // The parser stores options it doesn't recognize here. See above.
+  repeated UninterpretedOption uninterpreted_option = 999;
+
+  // Clients can define custom options in extensions of this message. See above.
+  extensions 1000 to max;
+}
+
+message ServiceOptions {
+
+  // Note:  Field numbers 1 through 32 are reserved for Google's internal RPC
+  //   framework.  We apologize for hoarding these numbers to ourselves, but
+  //   we were already using them long before we decided to release Protocol
+  //   Buffers.
+
+  // The parser stores options it doesn't recognize here. See above.
+  repeated UninterpretedOption uninterpreted_option = 999;
+
+  // Clients can define custom options in extensions of this message. See above.
+  extensions 1000 to max;
+}
+
+message MethodOptions {
+
+  // Note:  Field numbers 1 through 32 are reserved for Google's internal RPC
+  //   framework.  We apologize for hoarding these numbers to ourselves, but
+  //   we were already using them long before we decided to release Protocol
+  //   Buffers.
+
+  // The parser stores options it doesn't recognize here. See above.
+  repeated UninterpretedOption uninterpreted_option = 999;
+
+  // Clients can define custom options in extensions of this message. See above.
+  extensions 1000 to max;
+}
+
+
+// A message representing an option the parser does not recognize. This only
+// appears in options protos created by the compiler::Parser class.
+// DescriptorPool resolves these when building Descriptor objects. Therefore,
+// options protos in descriptor objects (e.g. returned by Descriptor::options(),
+// or produced by Descriptor::CopyTo()) will never have UninterpretedOptions
+// in them.
+message UninterpretedOption {
+  // The name of the uninterpreted option.  Each string represents a segment in
+  // a dot-separated name.  is_extension is true iff a segment represents an
+  // extension (denoted with parentheses in options specs in .proto files).
+  // E.g.,{ ["foo", false], ["bar.baz", true], ["qux", false] } represents
+  // "foo.(bar.baz).qux".
+  message NamePart {
+    required string name_part = 1;
+    required bool is_extension = 2;
+  }
+  repeated NamePart name = 2;
+
+  // The value of the uninterpreted option, in whatever type the tokenizer
+  // identified it as during parsing. Exactly one of these should be set.
+  optional string identifier_value = 3;
+  optional uint64 positive_int_value = 4;
+  optional int64 negative_int_value = 5;
+  optional double double_value = 6;
+  optional bytes string_value = 7;
+  optional string aggregate_value = 8;
+}
+
+// ===================================================================
+// Optional source code info
+
+// Encapsulates information about the original source file from which a
+// FileDescriptorProto was generated.
+message SourceCodeInfo {
+  // A Location identifies a piece of source code in a .proto file which
+  // corresponds to a particular definition.  This information is intended
+  // to be useful to IDEs, code indexers, documentation generators, and similar
+  // tools.
+  //
+  // For example, say we have a file like:
+  //   message Foo {
+  //     optional string foo = 1;
+  //   }
+  // Let's look at just the field definition:
+  //   optional string foo = 1;
+  //   ^       ^^     ^^  ^  ^^^
+  //   a       bc     de  f  ghi
+  // We have the following locations:
+  //   span   path               represents
+  //   [a,i)  [ 4, 0, 2, 0 ]     The whole field definition.
+  //   [a,b)  [ 4, 0, 2, 0, 4 ]  The label (optional).
+  //   [c,d)  [ 4, 0, 2, 0, 5 ]  The type (string).
+  //   [e,f)  [ 4, 0, 2, 0, 1 ]  The name (foo).
+  //   [g,h)  [ 4, 0, 2, 0, 3 ]  The number (1).
+  //
+  // Notes:
+  // - A location may refer to a repeated field itself (i.e. not to any
+  //   particular index within it).  This is used whenever a set of elements is
+  //   logically enclosed in a single code segment.  For example, an entire
+  //   extend block (possibly containing multiple extension definitions) will
+  //   have an outer location whose path refers to the "extensions" repeated
+  //   field without an index.
+  // - Multiple locations may have the same path.  This happens when a single
+  //   logical declaration is spread out across multiple places.  The most
+  //   obvious example is the "extend" block again -- there may be multiple
+  //   extend blocks in the same scope, each of which will have the same path.
+  // - A location's span is not always a subset of its parent's span.  For
+  //   example, the "extendee" of an extension declaration appears at the
+  //   beginning of the "extend" block and is shared by all extensions within
+  //   the block.
+  // - Just because a location's span is a subset of some other location's span
+  //   does not mean that it is a descendant.  For example, a "group" defines
+  //   both a type and a field in a single declaration.  Thus, the locations
+  //   corresponding to the type and field and their components will overlap.
+  // - Code which tries to interpret locations should probably be designed to
+  //   ignore those that it doesn't understand, as more types of locations could
+  //   be recorded in the future.
+  repeated Location location = 1;
+  message Location {
+    // Identifies which part of the FileDescriptorProto was defined at this
+    // location.
+    //
+    // Each element is a field number or an index.  They form a path from
+    // the root FileDescriptorProto to the place where the definition appears.  For
+    // example, this path:
+    //   [ 4, 3, 2, 7, 1 ]
+    // refers to:
+    //   file.message_type(3)  // 4, 3
+    //       .field(7)         // 2, 7
+    //       .name()           // 1
+    // This is because FileDescriptorProto.message_type has field number 4:
+    //   repeated DescriptorProto message_type = 4;
+    // and DescriptorProto.field has field number 2:
+    //   repeated FieldDescriptorProto field = 2;
+    // and FieldDescriptorProto.name has field number 1:
+    //   optional string name = 1;
+    //
+    // Thus, the above path gives the location of a field name.  If we removed
+    // the last element:
+    //   [ 4, 3, 2, 7 ]
+    // this path refers to the whole field declaration (from the beginning
+    // of the label to the terminating semicolon).
+    repeated int32 path = 1 [packed=true];
+
+    // Always has exactly three or four elements: start line, start column,
+    // end line (optional, otherwise assumed same as start line), end column.
+    // These are packed into a single field for efficiency.  Note that line
+    // and column numbers are zero-based -- typically you will want to add
+    // 1 to each before displaying to a user.
+    repeated int32 span = 2 [packed=true];
+
+    // If this SourceCodeInfo represents a complete declaration, these are any
+    // comments appearing before and after the declaration which appear to be
+    // attached to the declaration.
+    //
+    // A series of line comments appearing on consecutive lines, with no other
+    // tokens appearing on those lines, will be treated as a single comment.
+    //
+    // Only the comment content is provided; comment markers (e.g. //) are
+    // stripped out.  For block comments, leading whitespace and an asterisk
+    // will be stripped from the beginning of each line other than the first.
+    // Newlines are included in the output.
+    //
+    // Examples:
+    //
+    //   optional int32 foo = 1;  // Comment attached to foo.
+    //   // Comment attached to bar.
+    //   optional int32 bar = 2;
+    //
+    //   optional string baz = 3;
+    //   // Comment attached to baz.
+    //   // Another line attached to baz.
+    //
+    //   // Comment attached to qux.
+    //   //
+    //   // Another line attached to qux.
+    //   optional double qux = 4;
+    //
+    //   optional string corge = 5;
+    //   /* Block comment attached
+    //    * to corge.  Leading asterisks
+    //    * will be removed. */
+    //   /* Block comment attached to
+    //    * grault. */
+    //   optional int32 grault = 6;
+    optional string leading_comments = 3;
+    optional string trailing_comments = 4;
+  }
+}
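The long comment in `SourceCodeInfo.Location` above walks through resolving the path `[4, 3, 2, 7, 1]` by alternating field numbers and repeated-field indexes. As a hedged sketch (plain Python, not descriptor-library code; the `FIELDS` table below is hand-written and covers only the entries needed for that worked example), the same resolution can be expressed as:

```python
# Interpret a SourceCodeInfo.Location path per the comment above.
# FIELDS maps (message name, field number) -> (field name, child message
# name or None, is_repeated).  Hand-written subset for the worked example.
FIELDS = {
    ("FileDescriptorProto", 4): ("message_type", "DescriptorProto", True),
    ("DescriptorProto", 2): ("field", "FieldDescriptorProto", True),
    ("FieldDescriptorProto", 1): ("name", None, False),
}

def describe(path):
    """Render a Location path as the accessor chain from the comment above."""
    message, out, i = "FileDescriptorProto", [], 0
    while i < len(path):
        name, child, repeated = FIELDS[(message, path[i])]
        if repeated and i + 1 < len(path):
            # A repeated field consumes the next path element as an index.
            out.append("%s(%d)" % (name, path[i + 1]))
            i += 2
        else:
            out.append("%s()" % name)
            i += 1
        message = child
    return ".".join(out)

# describe([4, 3, 2, 7, 1]) -> "message_type(3).field(7).name()"
# describe([4, 3, 2, 7])    -> "message_type(3).field(7)", the whole field
#                              declaration, as the comment describes.
```

This mirrors the document's own example: dropping the trailing element widens the location from the field's name to the entire field declaration.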
diff --git a/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest.proto b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest.proto
new file mode 100644
index 0000000..6eb2d86
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest.proto
@@ -0,0 +1,719 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: kenton@google.com (Kenton Varda)
+//  Based on original Protocol Buffers design by
+//  Sanjay Ghemawat, Jeff Dean, and others.
+//
+// A proto file we will use for unit testing.
+
+
+// Some generic_services option(s) added automatically.
+// See:  http://go/proto2-generic-services-default
+option cc_generic_services = true;     // auto-added
+option java_generic_services = true;   // auto-added
+option py_generic_services = true;     // auto-added
+
+import "google/protobuf/unittest_import.proto";
+
+// We don't put this in a package within proto2 because we need to make sure
+// that the generated code doesn't depend on being in the proto2 namespace.
+// In test_util.h we do "namespace unittest = protobuf_unittest".
+package protobuf_unittest;
+
+// Protos optimized for SPEED use a strict superset of the generated code
+// of equivalent ones optimized for CODE_SIZE, so we should optimize all our
+// tests for speed unless explicitly testing code size optimization.
+option optimize_for = SPEED;
+
+option java_outer_classname = "UnittestProto";
+
+// This proto includes every type of field in both singular and repeated
+// forms.
+message TestAllTypes {
+  message NestedMessage {
+    // The field name "b" fails to compile in proto1 because it conflicts with
+    // a local variable named "b" in one of the generated methods.  Doh.
+    // This file needs to compile in proto1 to test backwards-compatibility.
+    optional int32 bb = 1;
+  }
+
+  enum NestedEnum {
+    FOO = 1;
+    BAR = 2;
+    BAZ = 3;
+  }
+
+  // Singular
+  optional    int32 optional_int32    =  1;
+  optional    int64 optional_int64    =  2;
+  optional   uint32 optional_uint32   =  3;
+  optional   uint64 optional_uint64   =  4;
+  optional   sint32 optional_sint32   =  5;
+  optional   sint64 optional_sint64   =  6;
+  optional  fixed32 optional_fixed32  =  7;
+  optional  fixed64 optional_fixed64  =  8;
+  optional sfixed32 optional_sfixed32 =  9;
+  optional sfixed64 optional_sfixed64 = 10;
+  optional    float optional_float    = 11;
+  optional   double optional_double   = 12;
+  optional     bool optional_bool     = 13;
+  optional   string optional_string   = 14;
+  optional    bytes optional_bytes    = 15;
+
+  optional group OptionalGroup = 16 {
+    optional int32 a = 17;
+  }
+
+  optional NestedMessage                        optional_nested_message  = 18;
+  optional ForeignMessage                       optional_foreign_message = 19;
+  optional protobuf_unittest_import.ImportMessage optional_import_message  = 20;
+
+  optional NestedEnum                           optional_nested_enum     = 21;
+  optional ForeignEnum                          optional_foreign_enum    = 22;
+  optional protobuf_unittest_import.ImportEnum    optional_import_enum     = 23;
+
+  optional string optional_string_piece = 24 [ctype=STRING_PIECE];
+  optional string optional_cord = 25 [ctype=CORD];
+
+  // Defined in unittest_import_public.proto
+  optional protobuf_unittest_import.PublicImportMessage
+      optional_public_import_message = 26;
+
+  optional NestedMessage optional_lazy_message = 27 [lazy=true];
+
+  // Repeated
+  repeated    int32 repeated_int32    = 31;
+  repeated    int64 repeated_int64    = 32;
+  repeated   uint32 repeated_uint32   = 33;
+  repeated   uint64 repeated_uint64   = 34;
+  repeated   sint32 repeated_sint32   = 35;
+  repeated   sint64 repeated_sint64   = 36;
+  repeated  fixed32 repeated_fixed32  = 37;
+  repeated  fixed64 repeated_fixed64  = 38;
+  repeated sfixed32 repeated_sfixed32 = 39;
+  repeated sfixed64 repeated_sfixed64 = 40;
+  repeated    float repeated_float    = 41;
+  repeated   double repeated_double   = 42;
+  repeated     bool repeated_bool     = 43;
+  repeated   string repeated_string   = 44;
+  repeated    bytes repeated_bytes    = 45;
+
+  repeated group RepeatedGroup = 46 {
+    optional int32 a = 47;
+  }
+
+  repeated NestedMessage                        repeated_nested_message  = 48;
+  repeated ForeignMessage                       repeated_foreign_message = 49;
+  repeated protobuf_unittest_import.ImportMessage repeated_import_message  = 50;
+
+  repeated NestedEnum                           repeated_nested_enum     = 51;
+  repeated ForeignEnum                          repeated_foreign_enum    = 52;
+  repeated protobuf_unittest_import.ImportEnum    repeated_import_enum     = 53;
+
+  repeated string repeated_string_piece = 54 [ctype=STRING_PIECE];
+  repeated string repeated_cord = 55 [ctype=CORD];
+
+  repeated NestedMessage repeated_lazy_message = 57 [lazy=true];
+
+  // Singular with defaults
+  optional    int32 default_int32    = 61 [default =  41    ];
+  optional    int64 default_int64    = 62 [default =  42    ];
+  optional   uint32 default_uint32   = 63 [default =  43    ];
+  optional   uint64 default_uint64   = 64 [default =  44    ];
+  optional   sint32 default_sint32   = 65 [default = -45    ];
+  optional   sint64 default_sint64   = 66 [default =  46    ];
+  optional  fixed32 default_fixed32  = 67 [default =  47    ];
+  optional  fixed64 default_fixed64  = 68 [default =  48    ];
+  optional sfixed32 default_sfixed32 = 69 [default =  49    ];
+  optional sfixed64 default_sfixed64 = 70 [default = -50    ];
+  optional    float default_float    = 71 [default =  51.5  ];
+  optional   double default_double   = 72 [default =  52e3  ];
+  optional     bool default_bool     = 73 [default = true   ];
+  optional   string default_string   = 74 [default = "hello"];
+  optional    bytes default_bytes    = 75 [default = "world"];
+
+  optional NestedEnum  default_nested_enum  = 81 [default = BAR        ];
+  optional ForeignEnum default_foreign_enum = 82 [default = FOREIGN_BAR];
+  optional protobuf_unittest_import.ImportEnum
+      default_import_enum = 83 [default = IMPORT_BAR];
+
+  optional string default_string_piece = 84 [ctype=STRING_PIECE,default="abc"];
+  optional string default_cord = 85 [ctype=CORD,default="123"];
+}
+
+message TestDeprecatedFields {
+  optional int32 deprecated_int32 = 1 [deprecated=true];
+}
+
+// Define these after TestAllTypes to make sure the compiler can handle
+// that.
+message ForeignMessage {
+  optional int32 c = 1;
+}
+
+enum ForeignEnum {
+  FOREIGN_FOO = 4;
+  FOREIGN_BAR = 5;
+  FOREIGN_BAZ = 6;
+}
+
+message TestAllExtensions {
+  extensions 1 to max;
+}
+
+extend TestAllExtensions {
+  // Singular
+  optional    int32 optional_int32_extension    =  1;
+  optional    int64 optional_int64_extension    =  2;
+  optional   uint32 optional_uint32_extension   =  3;
+  optional   uint64 optional_uint64_extension   =  4;
+  optional   sint32 optional_sint32_extension   =  5;
+  optional   sint64 optional_sint64_extension   =  6;
+  optional  fixed32 optional_fixed32_extension  =  7;
+  optional  fixed64 optional_fixed64_extension  =  8;
+  optional sfixed32 optional_sfixed32_extension =  9;
+  optional sfixed64 optional_sfixed64_extension = 10;
+  optional    float optional_float_extension    = 11;
+  optional   double optional_double_extension   = 12;
+  optional     bool optional_bool_extension     = 13;
+  optional   string optional_string_extension   = 14;
+  optional    bytes optional_bytes_extension    = 15;
+
+  optional group OptionalGroup_extension = 16 {
+    optional int32 a = 17;
+  }
+
+  optional TestAllTypes.NestedMessage optional_nested_message_extension = 18;
+  optional ForeignMessage optional_foreign_message_extension = 19;
+  optional protobuf_unittest_import.ImportMessage
+    optional_import_message_extension = 20;
+
+  optional TestAllTypes.NestedEnum optional_nested_enum_extension = 21;
+  optional ForeignEnum optional_foreign_enum_extension = 22;
+  optional protobuf_unittest_import.ImportEnum
+    optional_import_enum_extension = 23;
+
+  optional string optional_string_piece_extension = 24 [ctype=STRING_PIECE];
+  optional string optional_cord_extension = 25 [ctype=CORD];
+
+  optional protobuf_unittest_import.PublicImportMessage
+    optional_public_import_message_extension = 26;
+
+  optional TestAllTypes.NestedMessage
+    optional_lazy_message_extension = 27 [lazy=true];
+
+  // Repeated
+  repeated    int32 repeated_int32_extension    = 31;
+  repeated    int64 repeated_int64_extension    = 32;
+  repeated   uint32 repeated_uint32_extension   = 33;
+  repeated   uint64 repeated_uint64_extension   = 34;
+  repeated   sint32 repeated_sint32_extension   = 35;
+  repeated   sint64 repeated_sint64_extension   = 36;
+  repeated  fixed32 repeated_fixed32_extension  = 37;
+  repeated  fixed64 repeated_fixed64_extension  = 38;
+  repeated sfixed32 repeated_sfixed32_extension = 39;
+  repeated sfixed64 repeated_sfixed64_extension = 40;
+  repeated    float repeated_float_extension    = 41;
+  repeated   double repeated_double_extension   = 42;
+  repeated     bool repeated_bool_extension     = 43;
+  repeated   string repeated_string_extension   = 44;
+  repeated    bytes repeated_bytes_extension    = 45;
+
+  repeated group RepeatedGroup_extension = 46 {
+    optional int32 a = 47;
+  }
+
+  repeated TestAllTypes.NestedMessage repeated_nested_message_extension = 48;
+  repeated ForeignMessage repeated_foreign_message_extension = 49;
+  repeated protobuf_unittest_import.ImportMessage
+    repeated_import_message_extension = 50;
+
+  repeated TestAllTypes.NestedEnum repeated_nested_enum_extension = 51;
+  repeated ForeignEnum repeated_foreign_enum_extension = 52;
+  repeated protobuf_unittest_import.ImportEnum
+    repeated_import_enum_extension = 53;
+
+  repeated string repeated_string_piece_extension = 54 [ctype=STRING_PIECE];
+  repeated string repeated_cord_extension = 55 [ctype=CORD];
+
+  repeated TestAllTypes.NestedMessage
+    repeated_lazy_message_extension = 57 [lazy=true];
+
+  // Singular with defaults
+  optional    int32 default_int32_extension    = 61 [default =  41    ];
+  optional    int64 default_int64_extension    = 62 [default =  42    ];
+  optional   uint32 default_uint32_extension   = 63 [default =  43    ];
+  optional   uint64 default_uint64_extension   = 64 [default =  44    ];
+  optional   sint32 default_sint32_extension   = 65 [default = -45    ];
+  optional   sint64 default_sint64_extension   = 66 [default =  46    ];
+  optional  fixed32 default_fixed32_extension  = 67 [default =  47    ];
+  optional  fixed64 default_fixed64_extension  = 68 [default =  48    ];
+  optional sfixed32 default_sfixed32_extension = 69 [default =  49    ];
+  optional sfixed64 default_sfixed64_extension = 70 [default = -50    ];
+  optional    float default_float_extension    = 71 [default =  51.5  ];
+  optional   double default_double_extension   = 72 [default =  52e3  ];
+  optional     bool default_bool_extension     = 73 [default = true   ];
+  optional   string default_string_extension   = 74 [default = "hello"];
+  optional    bytes default_bytes_extension    = 75 [default = "world"];
+
+  optional TestAllTypes.NestedEnum
+    default_nested_enum_extension = 81 [default = BAR];
+  optional ForeignEnum
+    default_foreign_enum_extension = 82 [default = FOREIGN_BAR];
+  optional protobuf_unittest_import.ImportEnum
+    default_import_enum_extension = 83 [default = IMPORT_BAR];
+
+  optional string default_string_piece_extension = 84 [ctype=STRING_PIECE,
+                                                       default="abc"];
+  optional string default_cord_extension = 85 [ctype=CORD, default="123"];
+}
+
+message TestNestedExtension {
+  extend TestAllExtensions {
+    // Check for bug where string extensions declared in nested scope did not
+    // compile.
+    optional string test = 1002 [default="test"];
+  }
+}
+
+// We have separate messages for testing required fields because it's
+// annoying to have to fill in required fields in TestProto in order to
+// do anything with it.  Note that we don't need to test every type of
+// required field because the code output is basically identical to
+// optional fields for all types.
+message TestRequired {
+  required int32 a = 1;
+  optional int32 dummy2 = 2;
+  required int32 b = 3;
+
+  extend TestAllExtensions {
+    optional TestRequired single = 1000;
+    repeated TestRequired multi  = 1001;
+  }
+
+  // Pad the field count to 32 so that we can test that IsInitialized()
+  // properly checks multiple elements of has_bits_.
+  optional int32 dummy4  =  4;
+  optional int32 dummy5  =  5;
+  optional int32 dummy6  =  6;
+  optional int32 dummy7  =  7;
+  optional int32 dummy8  =  8;
+  optional int32 dummy9  =  9;
+  optional int32 dummy10 = 10;
+  optional int32 dummy11 = 11;
+  optional int32 dummy12 = 12;
+  optional int32 dummy13 = 13;
+  optional int32 dummy14 = 14;
+  optional int32 dummy15 = 15;
+  optional int32 dummy16 = 16;
+  optional int32 dummy17 = 17;
+  optional int32 dummy18 = 18;
+  optional int32 dummy19 = 19;
+  optional int32 dummy20 = 20;
+  optional int32 dummy21 = 21;
+  optional int32 dummy22 = 22;
+  optional int32 dummy23 = 23;
+  optional int32 dummy24 = 24;
+  optional int32 dummy25 = 25;
+  optional int32 dummy26 = 26;
+  optional int32 dummy27 = 27;
+  optional int32 dummy28 = 28;
+  optional int32 dummy29 = 29;
+  optional int32 dummy30 = 30;
+  optional int32 dummy31 = 31;
+  optional int32 dummy32 = 32;
+
+  required int32 c = 33;
+}
+
+message TestRequiredForeign {
+  optional TestRequired optional_message = 1;
+  repeated TestRequired repeated_message = 2;
+  optional int32 dummy = 3;
+}
+
+// Test that we can use NestedMessage from outside TestAllTypes.
+message TestForeignNested {
+  optional TestAllTypes.NestedMessage foreign_nested = 1;
+}
+
+// TestEmptyMessage is used to test unknown field support.
+message TestEmptyMessage {
+}
+
+// Like above, but declare all field numbers as potential extensions.  No
+// actual extensions should ever be defined for this type.
+message TestEmptyMessageWithExtensions {
+  extensions 1 to max;
+}
+
+message TestMultipleExtensionRanges {
+  extensions 42;
+  extensions 4143 to 4243;
+  extensions 65536 to max;
+}
+
+// Test that really large tag numbers don't break anything.
+message TestReallyLargeTagNumber {
+  // The largest possible tag number is 2^28 - 1, since the wire format uses
+  // three bits to communicate wire type.
+  optional int32 a = 1;
+  optional int32 bb = 268435455;
+}
+
+message TestRecursiveMessage {
+  optional TestRecursiveMessage a = 1;
+  optional int32 i = 2;
+}
+
+// Test that mutual recursion works.
+message TestMutualRecursionA {
+  optional TestMutualRecursionB bb = 1;
+}
+
+message TestMutualRecursionB {
+  optional TestMutualRecursionA a = 1;
+  optional int32 optional_int32 = 2;
+}
+
+// Test that groups have disjoint field numbers from their siblings and
+// parents.  This is NOT possible in proto1; only proto2.  When attempting
+// to compile with proto1, this will emit an error; so we only include it
+// in protobuf_unittest_proto.
+message TestDupFieldNumber {                        // NO_PROTO1
+  optional int32 a = 1;                             // NO_PROTO1
+  optional group Foo = 2 { optional int32 a = 1; }  // NO_PROTO1
+  optional group Bar = 3 { optional int32 a = 1; }  // NO_PROTO1
+}                                                   // NO_PROTO1
+
+// Additional messages for testing lazy fields.
+message TestEagerMessage {
+  optional TestAllTypes sub_message = 1 [lazy=false];
+}
+message TestLazyMessage {
+  optional TestAllTypes sub_message = 1 [lazy=true];
+}
+
+// Needed for a Python test.
+message TestNestedMessageHasBits {
+  message NestedMessage {
+    repeated int32 nestedmessage_repeated_int32 = 1;
+    repeated ForeignMessage nestedmessage_repeated_foreignmessage = 2;
+  }
+  optional NestedMessage optional_nested_message = 1;
+}
+
+
+// Test an enum that has multiple values with the same number.
+enum TestEnumWithDupValue {
+  option allow_alias = true;
+  FOO1 = 1;
+  BAR1 = 2;
+  BAZ = 3;
+  FOO2 = 1;
+  BAR2 = 2;
+}
+
+// Test an enum with large, unordered values.
+enum TestSparseEnum {
+  SPARSE_A = 123;
+  SPARSE_B = 62374;
+  SPARSE_C = 12589234;
+  SPARSE_D = -15;
+  SPARSE_E = -53452;
+  SPARSE_F = 0;
+  SPARSE_G = 2;
+}
+
+// Test message with CamelCase field names.  This violates Protocol Buffer
+// standard style.
+message TestCamelCaseFieldNames {
+  optional int32 PrimitiveField = 1;
+  optional string StringField = 2;
+  optional ForeignEnum EnumField = 3;
+  optional ForeignMessage MessageField = 4;
+  optional string StringPieceField = 5 [ctype=STRING_PIECE];
+  optional string CordField = 6 [ctype=CORD];
+
+  repeated int32 RepeatedPrimitiveField = 7;
+  repeated string RepeatedStringField = 8;
+  repeated ForeignEnum RepeatedEnumField = 9;
+  repeated ForeignMessage RepeatedMessageField = 10;
+  repeated string RepeatedStringPieceField = 11 [ctype=STRING_PIECE];
+  repeated string RepeatedCordField = 12 [ctype=CORD];
+}
+
+
+// We list fields out of order, to ensure that we're using field number and not
+// field index to determine serialization order.
+message TestFieldOrderings {
+  optional string my_string = 11;
+  extensions 2 to 10;
+  optional int64 my_int = 1;
+  extensions 12 to 100;
+  optional float my_float = 101;
+}
+
+
+extend TestFieldOrderings {
+  optional string my_extension_string = 50;
+  optional int32 my_extension_int = 5;
+}
+
+
+message TestExtremeDefaultValues {
+  optional bytes escaped_bytes = 1 [default = "\0\001\a\b\f\n\r\t\v\\\'\"\xfe"];
+  optional uint32 large_uint32 = 2 [default = 0xFFFFFFFF];
+  optional uint64 large_uint64 = 3 [default = 0xFFFFFFFFFFFFFFFF];
+  optional  int32 small_int32  = 4 [default = -0x7FFFFFFF];
+  optional  int64 small_int64  = 5 [default = -0x7FFFFFFFFFFFFFFF];
+  optional  int32 really_small_int32 = 21 [default = -0x80000000];
+  optional  int64 really_small_int64 = 22 [default = -0x8000000000000000];
+
+  // The default value here is UTF-8 for "\u1234".  (We could also just type
+  // the UTF-8 text directly into this text file rather than escape it, but
+  // lots of people use editors that would be confused by this.)
+  optional string utf8_string = 6 [default = "\341\210\264"];
+
+  // Tests for single-precision floating-point values.
+  optional float zero_float = 7 [default = 0];
+  optional float one_float = 8 [default = 1];
+  optional float small_float = 9 [default = 1.5];
+  optional float negative_one_float = 10 [default = -1];
+  optional float negative_float = 11 [default = -1.5];
+  // Using exponents
+  optional float large_float = 12 [default = 2E8];
+  optional float small_negative_float = 13 [default = -8e-28];
+
+  // Tests for non-finite floating-point values.
+  optional double inf_double = 14 [default = inf];
+  optional double neg_inf_double = 15 [default = -inf];
+  optional double nan_double = 16 [default = nan];
+  optional float inf_float = 17 [default = inf];
+  optional float neg_inf_float = 18 [default = -inf];
+  optional float nan_float = 19 [default = nan];
+
+  // Tests for C++ trigraphs.
+  // Trigraphs should be escaped in C++ generated files, but they should not be
+  // escaped for other languages.
+  // Note that in .proto file, "\?" is a valid way to escape ? in string
+  // literals.
+  optional string cpp_trigraph = 20 [default = "? \? ?? \?? \??? ??/ ?\?-"];
+
+  // String defaults containing the character '\000'
+  optional string string_with_zero       = 23 [default = "hel\000lo"];
+  optional  bytes bytes_with_zero        = 24 [default = "wor\000ld"];
+  optional string string_piece_with_zero = 25 [ctype=STRING_PIECE,
+                                               default="ab\000c"];
+  optional string cord_with_zero         = 26 [ctype=CORD,
+                                               default="12\0003"];
+}
+
+message SparseEnumMessage {
+  optional TestSparseEnum sparse_enum = 1;
+}
+
+// Test String and Bytes: string is for valid UTF-8 strings
+message OneString {
+  optional string data = 1;
+}
+
+message MoreString {
+  repeated string data = 1;
+}
+
+message OneBytes {
+  optional bytes data = 1;
+}
+
+message MoreBytes {
+  repeated bytes data = 1;
+}
+
+
+// Test messages for packed fields
+
+message TestPackedTypes {
+  repeated    int32 packed_int32    =  90 [packed = true];
+  repeated    int64 packed_int64    =  91 [packed = true];
+  repeated   uint32 packed_uint32   =  92 [packed = true];
+  repeated   uint64 packed_uint64   =  93 [packed = true];
+  repeated   sint32 packed_sint32   =  94 [packed = true];
+  repeated   sint64 packed_sint64   =  95 [packed = true];
+  repeated  fixed32 packed_fixed32  =  96 [packed = true];
+  repeated  fixed64 packed_fixed64  =  97 [packed = true];
+  repeated sfixed32 packed_sfixed32 =  98 [packed = true];
+  repeated sfixed64 packed_sfixed64 =  99 [packed = true];
+  repeated    float packed_float    = 100 [packed = true];
+  repeated   double packed_double   = 101 [packed = true];
+  repeated     bool packed_bool     = 102 [packed = true];
+  repeated ForeignEnum packed_enum  = 103 [packed = true];
+}
+
+// A message with the same fields as TestPackedTypes, but without packing. Used
+// to test packed <-> unpacked wire compatibility.
+message TestUnpackedTypes {
+  repeated    int32 unpacked_int32    =  90 [packed = false];
+  repeated    int64 unpacked_int64    =  91 [packed = false];
+  repeated   uint32 unpacked_uint32   =  92 [packed = false];
+  repeated   uint64 unpacked_uint64   =  93 [packed = false];
+  repeated   sint32 unpacked_sint32   =  94 [packed = false];
+  repeated   sint64 unpacked_sint64   =  95 [packed = false];
+  repeated  fixed32 unpacked_fixed32  =  96 [packed = false];
+  repeated  fixed64 unpacked_fixed64  =  97 [packed = false];
+  repeated sfixed32 unpacked_sfixed32 =  98 [packed = false];
+  repeated sfixed64 unpacked_sfixed64 =  99 [packed = false];
+  repeated    float unpacked_float    = 100 [packed = false];
+  repeated   double unpacked_double   = 101 [packed = false];
+  repeated     bool unpacked_bool     = 102 [packed = false];
+  repeated ForeignEnum unpacked_enum  = 103 [packed = false];
+}
+
+message TestPackedExtensions {
+  extensions 1 to max;
+}
+
+extend TestPackedExtensions {
+  repeated    int32 packed_int32_extension    =  90 [packed = true];
+  repeated    int64 packed_int64_extension    =  91 [packed = true];
+  repeated   uint32 packed_uint32_extension   =  92 [packed = true];
+  repeated   uint64 packed_uint64_extension   =  93 [packed = true];
+  repeated   sint32 packed_sint32_extension   =  94 [packed = true];
+  repeated   sint64 packed_sint64_extension   =  95 [packed = true];
+  repeated  fixed32 packed_fixed32_extension  =  96 [packed = true];
+  repeated  fixed64 packed_fixed64_extension  =  97 [packed = true];
+  repeated sfixed32 packed_sfixed32_extension =  98 [packed = true];
+  repeated sfixed64 packed_sfixed64_extension =  99 [packed = true];
+  repeated    float packed_float_extension    = 100 [packed = true];
+  repeated   double packed_double_extension   = 101 [packed = true];
+  repeated     bool packed_bool_extension     = 102 [packed = true];
+  repeated ForeignEnum packed_enum_extension  = 103 [packed = true];
+}
+
+// Used by ExtensionSetTest/DynamicExtensions.  The test actually builds
+// a set of extensions to TestAllExtensions dynamically, based on the fields
+// of this message type.
+message TestDynamicExtensions {
+  enum DynamicEnumType {
+    DYNAMIC_FOO = 2200;
+    DYNAMIC_BAR = 2201;
+    DYNAMIC_BAZ = 2202;
+  }
+  message DynamicMessageType {
+    optional int32 dynamic_field = 2100;
+  }
+
+  optional fixed32 scalar_extension = 2000;
+  optional ForeignEnum enum_extension = 2001;
+  optional DynamicEnumType dynamic_enum_extension = 2002;
+
+  optional ForeignMessage message_extension = 2003;
+  optional DynamicMessageType dynamic_message_extension = 2004;
+
+  repeated string repeated_extension = 2005;
+  repeated sint32 packed_extension = 2006 [packed = true];
+}
+
+message TestRepeatedScalarDifferentTagSizes {
+  // Parsing repeated fixed size values used to fail. This message needs to be
+  // used in order to get a tag of the right size; none of the repeated fields
+  // in TestAllTypes triggered the check.
+  repeated fixed32 repeated_fixed32 = 12;
+  // Check for a varint type, just for good measure.
+  repeated int32   repeated_int32   = 13;
+
+  // These have two-byte tags.
+  repeated fixed64 repeated_fixed64 = 2046;
+  repeated int64   repeated_int64   = 2047;
+
+  // Three byte tags.
+  repeated float   repeated_float   = 262142;
+  repeated uint64  repeated_uint64  = 262143;
+}
+
+// Test that if an optional or required message/group field appears multiple
+// times in the input, the occurrences are merged.
+message TestParsingMerge {
+  // RepeatedFieldsGenerator defines matching field types as TestParsingMerge,
+  // except that all fields are repeated. In the tests, we will serialize the
+  // RepeatedFieldsGenerator to bytes, and parse the bytes to TestParsingMerge.
+  // Repeated fields in RepeatedFieldsGenerator are expected to be merged into
+  // the corresponding required/optional fields in TestParsingMerge.
+  message RepeatedFieldsGenerator {
+    repeated TestAllTypes field1 = 1;
+    repeated TestAllTypes field2 = 2;
+    repeated TestAllTypes field3 = 3;
+    repeated group Group1 = 10 {
+      optional TestAllTypes field1 = 11;
+    }
+    repeated group Group2 = 20 {
+      optional TestAllTypes field1 = 21;
+    }
+    repeated TestAllTypes ext1 = 1000;
+    repeated TestAllTypes ext2 = 1001;
+  }
+  required TestAllTypes required_all_types = 1;
+  optional TestAllTypes optional_all_types = 2;
+  repeated TestAllTypes repeated_all_types = 3;
+  optional group OptionalGroup = 10 {
+    optional TestAllTypes optional_group_all_types = 11;
+  }
+  repeated group RepeatedGroup = 20 {
+    optional TestAllTypes repeated_group_all_types = 21;
+  }
+  extensions 1000 to max;
+  extend TestParsingMerge {
+    optional TestAllTypes optional_ext = 1000;
+    repeated TestAllTypes repeated_ext = 1001;
+  }
+}
+
+message TestCommentInjectionMessage {
+  // */ <- This should not close the generated doc comment
+  optional string a = 1 [default="*/ <- Neither should this."];
+}
+
+
+// Test that RPC services work.
+message FooRequest  {}
+message FooResponse {}
+
+message FooClientMessage {}
+message FooServerMessage {}
+
+service TestService {
+  rpc Foo(FooRequest) returns (FooResponse);
+  rpc Bar(BarRequest) returns (BarResponse);
+}
+
+
+message BarRequest  {}
+message BarResponse {}
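The TestPackedTypes/TestUnpackedTypes pair above shares field numbers 90-103 precisely so tests can check packed <-> unpacked wire compatibility: a conformant parser accepts either encoding for a packable repeated field. A minimal sketch of the two encodings in plain Python (no protobuf runtime assumed; field number 90 is taken from `packed_int32`/`unpacked_int32` above):

```python
def varint(n: int) -> bytes:
    """Encode a non-negative integer as a base-128 varint (low groups first)."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # continuation bit set: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def tag(field_number: int, wire_type: int) -> bytes:
    """A field tag is (field_number << 3) | wire_type, itself varint-encoded."""
    return varint((field_number << 3) | wire_type)

values = [1, 2, 300]

# Unpacked: one tag (wire type 0, varint) per element, as in
# TestUnpackedTypes.unpacked_int32 (field 90).
unpacked = b"".join(tag(90, 0) + varint(v) for v in values)

# Packed: a single length-delimited record (wire type 2) whose payload is the
# concatenated varints, as in TestPackedTypes.packed_int32 (field 90).
payload = b"".join(varint(v) for v in values)
packed = tag(90, 2) + varint(len(payload)) + payload
```

For repeated varints the packed form pays the tag once instead of per element, which is why it is the more compact encoding the tests exercise against the unpacked one.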
diff --git a/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_custom_options.proto b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_custom_options.proto
new file mode 100644
index 0000000..e591d29
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_custom_options.proto
@@ -0,0 +1,387 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: benjy@google.com (Benjy Weinberger)
+//  Based on original Protocol Buffers design by
+//  Sanjay Ghemawat, Jeff Dean, and others.
+//
+// A proto file used to test the "custom options" feature of proto2.
+
+
+// Some generic_services option(s) added automatically.
+// See:  http://go/proto2-generic-services-default
+option cc_generic_services = true;     // auto-added
+option java_generic_services = true;   // auto-added
+option py_generic_services = true;
+
+// A custom file option (defined below).
+option (file_opt1) = 9876543210;
+
+import "google/protobuf/descriptor.proto";
+
+// We don't put this in a package within proto2 because we need to make sure
+// that the generated code doesn't depend on being in the proto2 namespace.
+package protobuf_unittest;
+
+
+// Some simple test custom options of various types.
+
+extend google.protobuf.FileOptions {
+  optional uint64 file_opt1 = 7736974;
+}
+
+extend google.protobuf.MessageOptions {
+  optional int32 message_opt1 = 7739036;
+}
+
+extend google.protobuf.FieldOptions {
+  optional fixed64 field_opt1 = 7740936;
+  // This is useful for testing that we correctly register default values for
+  // extension options.
+  optional int32 field_opt2 = 7753913 [default=42];
+}
+
+extend google.protobuf.EnumOptions {
+  optional sfixed32 enum_opt1 = 7753576;
+}
+
+extend google.protobuf.EnumValueOptions {
+  optional int32 enum_value_opt1 = 1560678;
+}
+
+extend google.protobuf.ServiceOptions {
+  optional sint64 service_opt1 = 7887650;
+}
+
+enum MethodOpt1 {
+  METHODOPT1_VAL1 = 1;
+  METHODOPT1_VAL2 = 2;
+}
+
+extend google.protobuf.MethodOptions {
+  optional MethodOpt1 method_opt1 = 7890860;
+}
+
+// A test message with custom options at all possible locations (and also some
+// regular options, to make sure they interact nicely).
+message TestMessageWithCustomOptions {
+  option message_set_wire_format = false;
+
+  option (message_opt1) = -56;
+
+  optional string field1 = 1 [ctype=CORD,
+                              (field_opt1)=8765432109];
+
+  enum AnEnum {
+    option (enum_opt1) = -789;
+
+    ANENUM_VAL1 = 1;
+    ANENUM_VAL2 = 2 [(enum_value_opt1) = 123];
+  }
+}
+
+
+// A test RPC service with custom options at all possible locations (and also
+// some regular options, to make sure they interact nicely).
+message CustomOptionFooRequest {
+}
+
+message CustomOptionFooResponse {
+}
+
+message CustomOptionFooClientMessage {
+}
+
+message CustomOptionFooServerMessage {
+}
+
+service TestServiceWithCustomOptions {
+  option (service_opt1) = -9876543210;
+
+  rpc Foo(CustomOptionFooRequest) returns (CustomOptionFooResponse) {
+    option (method_opt1) = METHODOPT1_VAL2;
+  }
+}
+
+
+
+// Options of every possible field type, so we can test them all exhaustively.
+
+message DummyMessageContainingEnum {
+  enum TestEnumType {
+    TEST_OPTION_ENUM_TYPE1 = 22;
+    TEST_OPTION_ENUM_TYPE2 = -23;
+  }
+}
+
+message DummyMessageInvalidAsOptionType {
+}
+
+extend google.protobuf.MessageOptions {
+  optional         bool     bool_opt = 7706090;
+  optional        int32    int32_opt = 7705709;
+  optional        int64    int64_opt = 7705542;
+  optional       uint32   uint32_opt = 7704880;
+  optional       uint64   uint64_opt = 7702367;
+  optional       sint32   sint32_opt = 7701568;
+  optional       sint64   sint64_opt = 7700863;
+  optional      fixed32  fixed32_opt = 7700307;
+  optional      fixed64  fixed64_opt = 7700194;
+  optional     sfixed32 sfixed32_opt = 7698645;
+  optional     sfixed64 sfixed64_opt = 7685475;
+  optional        float    float_opt = 7675390;
+  optional       double   double_opt = 7673293;
+  optional       string   string_opt = 7673285;
+  optional        bytes    bytes_opt = 7673238;
+  optional DummyMessageContainingEnum.TestEnumType enum_opt = 7673233;
+  optional DummyMessageInvalidAsOptionType message_type_opt = 7665967;
+}
+
+message CustomOptionMinIntegerValues {
+  option     (bool_opt) = false;
+  option    (int32_opt) = -0x80000000;
+  option    (int64_opt) = -0x8000000000000000;
+  option   (uint32_opt) = 0;
+  option   (uint64_opt) = 0;
+  option   (sint32_opt) = -0x80000000;
+  option   (sint64_opt) = -0x8000000000000000;
+  option  (fixed32_opt) = 0;
+  option  (fixed64_opt) = 0;
+  option (sfixed32_opt) = -0x80000000;
+  option (sfixed64_opt) = -0x8000000000000000;
+}
+
+message CustomOptionMaxIntegerValues {
+  option     (bool_opt) = true;
+  option    (int32_opt) = 0x7FFFFFFF;
+  option    (int64_opt) = 0x7FFFFFFFFFFFFFFF;
+  option   (uint32_opt) = 0xFFFFFFFF;
+  option   (uint64_opt) = 0xFFFFFFFFFFFFFFFF;
+  option   (sint32_opt) = 0x7FFFFFFF;
+  option   (sint64_opt) = 0x7FFFFFFFFFFFFFFF;
+  option  (fixed32_opt) = 0xFFFFFFFF;
+  option  (fixed64_opt) = 0xFFFFFFFFFFFFFFFF;
+  option (sfixed32_opt) = 0x7FFFFFFF;
+  option (sfixed64_opt) = 0x7FFFFFFFFFFFFFFF;
+}
+
+message CustomOptionOtherValues {
+  option  (int32_opt) = -100;  // To test sign-extension.
+  option  (float_opt) = 12.3456789;
+  option (double_opt) = 1.234567890123456789;
+  option (string_opt) = "Hello, \"World\"";
+  option  (bytes_opt) = "Hello\0World";
+  option   (enum_opt) = TEST_OPTION_ENUM_TYPE2;
+}
+
+message SettingRealsFromPositiveInts {
+  option  (float_opt) = 12;
+  option (double_opt) = 154;
+}
+
+message SettingRealsFromNegativeInts {
+  option  (float_opt) = -12;
+  option  (double_opt) = -154;
+}
+
+// Options of complex message types, themselves combined and extended in
+// various ways.
+
+message ComplexOptionType1 {
+  optional int32 foo = 1;
+  optional int32 foo2 = 2;
+  optional int32 foo3 = 3;
+
+  extensions 100 to max;
+}
+
+message ComplexOptionType2 {
+  optional ComplexOptionType1 bar = 1;
+  optional int32 baz = 2;
+
+  message ComplexOptionType4 {
+    optional int32 waldo = 1;
+
+    extend google.protobuf.MessageOptions {
+      optional ComplexOptionType4 complex_opt4 = 7633546;
+    }
+  }
+
+  optional ComplexOptionType4 fred = 3;
+
+  extensions 100 to max;
+}
+
+message ComplexOptionType3 {
+  optional int32 qux = 1;
+
+  optional group ComplexOptionType5 = 2 {
+    optional int32 plugh = 3;
+  }
+}
+
+extend ComplexOptionType1 {
+  optional int32 quux = 7663707;
+  optional ComplexOptionType3 corge = 7663442;
+}
+
+extend ComplexOptionType2 {
+  optional int32 grault = 7650927;
+  optional ComplexOptionType1 garply = 7649992;
+}
+
+extend google.protobuf.MessageOptions {
+  optional protobuf_unittest.ComplexOptionType1 complex_opt1 = 7646756;
+  optional ComplexOptionType2 complex_opt2 = 7636949;
+  optional ComplexOptionType3 complex_opt3 = 7636463;
+  optional group ComplexOpt6 = 7595468 {
+    optional int32 xyzzy = 7593951;
+  }
+}
+
+// Note that we try various different ways of naming the same extension.
+message VariousComplexOptions {
+  option (.protobuf_unittest.complex_opt1).foo = 42;
+  option (protobuf_unittest.complex_opt1).(.protobuf_unittest.quux) = 324;
+  option (.protobuf_unittest.complex_opt1).(protobuf_unittest.corge).qux = 876;
+  option (complex_opt2).baz = 987;
+  option (complex_opt2).(grault) = 654;
+  option (complex_opt2).bar.foo = 743;
+  option (complex_opt2).bar.(quux) = 1999;
+  option (complex_opt2).bar.(protobuf_unittest.corge).qux = 2008;
+  option (complex_opt2).(garply).foo = 741;
+  option (complex_opt2).(garply).(.protobuf_unittest.quux) = 1998;
+  option (complex_opt2).(protobuf_unittest.garply).(corge).qux = 2121;
+  option (ComplexOptionType2.ComplexOptionType4.complex_opt4).waldo = 1971;
+  option (complex_opt2).fred.waldo = 321;
+  option (protobuf_unittest.complex_opt3).qux = 9;
+  option (complex_opt3).complexoptiontype5.plugh = 22;
+  option (complexopt6).xyzzy = 24;
+}
+
+// ------------------------------------------------------
+// Definitions for testing aggregate option parsing.
+// See descriptor_unittest.cc.
+
+message AggregateMessageSet {
+  option message_set_wire_format = true;
+  extensions 4 to max;
+}
+
+message AggregateMessageSetElement {
+  extend AggregateMessageSet {
+    optional AggregateMessageSetElement message_set_extension = 15447542;
+  }
+  optional string s = 1;
+}
+
+// A helper type used to test aggregate option parsing
+message Aggregate {
+  optional int32 i = 1;
+  optional string s = 2;
+
+  // A nested object
+  optional Aggregate sub = 3;
+
+  // To test the parsing of extensions inside aggregate values
+  optional google.protobuf.FileOptions file = 4;
+  extend google.protobuf.FileOptions {
+    optional Aggregate nested = 15476903;
+  }
+
+  // An embedded message set
+  optional AggregateMessageSet mset = 5;
+}
+
+// Allow Aggregate to be used as an option at all possible locations
+// in the .proto grammar.
+extend google.protobuf.FileOptions      { optional Aggregate fileopt    = 15478479; }
+extend google.protobuf.MessageOptions   { optional Aggregate msgopt     = 15480088; }
+extend google.protobuf.FieldOptions     { optional Aggregate fieldopt   = 15481374; }
+extend google.protobuf.EnumOptions      { optional Aggregate enumopt    = 15483218; }
+extend google.protobuf.EnumValueOptions { optional Aggregate enumvalopt = 15486921; }
+extend google.protobuf.ServiceOptions   { optional Aggregate serviceopt = 15497145; }
+extend google.protobuf.MethodOptions    { optional Aggregate methodopt  = 15512713; }
+
+// Try using AggregateOption at different points in the proto grammar
+option (fileopt) = {
+  s: 'FileAnnotation'
+  // Also test the handling of comments
+  /* of both types */ i: 100
+
+  sub { s: 'NestedFileAnnotation' }
+
+  // Include a google.protobuf.FileOptions and recursively extend it with
+  // another fileopt.
+  file {
+    [protobuf_unittest.fileopt] {
+      s:'FileExtensionAnnotation'
+    }
+  }
+
+  // A message set inside an option value
+  mset {
+    [protobuf_unittest.AggregateMessageSetElement.message_set_extension] {
+      s: 'EmbeddedMessageSetElement'
+    }
+  }
+};
+
+message AggregateMessage {
+  option (msgopt) = { i:101 s:'MessageAnnotation' };
+  optional int32 fieldname = 1 [(fieldopt) = { s:'FieldAnnotation' }];
+}
+
+service AggregateService {
+  option (serviceopt) = { s:'ServiceAnnotation' };
+  rpc Method (AggregateMessage) returns (AggregateMessage) {
+    option (methodopt) = { s:'MethodAnnotation' };
+  }
+}
+
+enum AggregateEnum {
+  option (enumopt) = { s:'EnumAnnotation' };
+  VALUE = 1 [(enumvalopt) = { s:'EnumValueAnnotation' }];
+}
+
+// Test custom options for nested type.
+message NestedOptionType {
+  message NestedMessage {
+    option (message_opt1) = 1001;
+    optional int32 nested_field = 1 [(field_opt1) = 1002];
+  }
+  enum NestedEnum {
+    option (enum_opt1) = 1003;
+    NESTED_ENUM_VALUE = 1 [(enum_value_opt1) = 1004];
+  }
+  extend google.protobuf.FileOptions {
+    optional int32 nested_extension = 7912573 [(field_opt2) = 1005];
+  }
+}
diff --git a/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_import.proto b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_import.proto
new file mode 100644
index 0000000..c115b11
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_import.proto
@@ -0,0 +1,64 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: kenton@google.com (Kenton Varda)
+//  Based on original Protocol Buffers design by
+//  Sanjay Ghemawat, Jeff Dean, and others.
+//
+// A proto file which is imported by unittest.proto to test importing.
+
+
+// We don't put this in a package within proto2 because we need to make sure
+// that the generated code doesn't depend on being in the proto2 namespace.
+// In test_util.h we do
+// "using namespace unittest_import = protobuf_unittest_import".
+package protobuf_unittest_import;
+
+option optimize_for = SPEED;
+
+// Exercise the java_package option.
+option java_package = "com.google.protobuf.test";
+
+// Do not set a java_outer_classname here to verify that Proto2 works without
+// one.
+
+// Test public import
+import public "google/protobuf/unittest_import_public.proto";
+
+message ImportMessage {
+  optional int32 d = 1;
+}
+
+enum ImportEnum {
+  IMPORT_FOO = 7;
+  IMPORT_BAR = 8;
+  IMPORT_BAZ = 9;
+}
+
diff --git a/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_import_public.proto b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_import_public.proto
new file mode 100644
index 0000000..ea5d1b1
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_import_public.proto
@@ -0,0 +1,40 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: liujisi@google.com (Pherl Liu)
+
+
+package protobuf_unittest_import;
+
+option java_package = "com.google.protobuf.test";
+
+message PublicImportMessage {
+  optional int32 e = 1;
+}
diff --git a/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_mset.proto b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_mset.proto
new file mode 100644
index 0000000..3497f09
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_mset.proto
@@ -0,0 +1,72 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: kenton@google.com (Kenton Varda)
+//  Based on original Protocol Buffers design by
+//  Sanjay Ghemawat, Jeff Dean, and others.
+//
+// This file contains messages for testing message_set_wire_format.
+
+package protobuf_unittest;
+
+option optimize_for = SPEED;
+
+// A message with message_set_wire_format.
+message TestMessageSet {
+  option message_set_wire_format = true;
+  extensions 4 to max;
+}
+
+message TestMessageSetContainer {
+  optional TestMessageSet message_set = 1;
+}
+
+message TestMessageSetExtension1 {
+  extend TestMessageSet {
+    optional TestMessageSetExtension1 message_set_extension = 1545008;
+  }
+  optional int32 i = 15;
+}
+
+message TestMessageSetExtension2 {
+  extend TestMessageSet {
+    optional TestMessageSetExtension2 message_set_extension = 1547769;
+  }
+  optional string str = 25;
+}
+
+// MessageSet wire format is equivalent to this.
+message RawMessageSet {
+  repeated group Item = 1 {
+    required int32 type_id = 2;
+    required bytes message = 3;
+  }
+}
+
diff --git a/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_no_generic_services.proto b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_no_generic_services.proto
new file mode 100644
index 0000000..cffb412
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/protos/src/proto/google/protobuf/unittest_no_generic_services.proto
@@ -0,0 +1,52 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// http://code.google.com/p/protobuf/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: kenton@google.com (Kenton Varda)
+
+package google.protobuf.no_generic_services_test;
+
+// *_generic_services are false by default.
+
+message TestMessage {
+  optional int32 a = 1;
+  extensions 1000 to max;
+}
+
+enum TestEnum {
+  FOO = 1;
+}
+
+extend TestMessage {
+  optional int32 test_extension = 1000;
+}
+
+service TestService {
+  rpc Foo(TestMessage) returns (TestMessage);
+}
diff --git a/python/compatibility_tests/v2.5.0/setup.py b/python/compatibility_tests/v2.5.0/setup.py
new file mode 100755
index 0000000..b41d54d
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/setup.py
@@ -0,0 +1,87 @@
+#! /usr/bin/env python
+#
+import glob
+import os
+import subprocess
+import sys
+
+from setuptools import setup, Extension, find_packages
+
+if sys.version_info[0] == 3:
+  # Python 3
+  from distutils.command.build_py import build_py_2to3 as _build_py
+else:
+  # Python 2
+  from distutils.command.build_py import build_py as _build_py
+from distutils.spawn import find_executable
+
+def generate_proto(source, code_gen):
+  """Invokes the Protocol Compiler to generate a _pb2.py from the given
+  .proto file."""
+  output = source.replace(".proto", "_pb2.py").replace("protos/src/proto/", "").replace("protos/python/", "")
+
+  if not os.path.exists(source):
+    sys.stderr.write("Can't find required file: %s\n" % source)
+    sys.exit(-1)
+
+  protoc_command = [ code_gen, "-Iprotos/src/proto", "-Iprotos/python", "--python_out=.", source ]
+  if subprocess.call(protoc_command) != 0:
+    sys.exit(-1)
+
+class build_py(_build_py):
+  def run(self):
+    # Generate _pb2.py files from the .proto files.
+    protoc_1 = "./protoc_1"
+    protoc_2 = "./protoc_2"
+    generate_proto("protos/src/proto/google/protobuf/unittest.proto", protoc_2)
+    generate_proto("protos/src/proto/google/protobuf/unittest_custom_options.proto", protoc_1)
+    generate_proto("protos/src/proto/google/protobuf/unittest_import.proto", protoc_1)
+    generate_proto("protos/src/proto/google/protobuf/unittest_import_public.proto", protoc_1)
+    generate_proto("protos/src/proto/google/protobuf/unittest_mset.proto", protoc_1)
+    generate_proto("protos/src/proto/google/protobuf/unittest_no_generic_services.proto", protoc_1)
+    generate_proto("protos/python/google/protobuf/internal/factory_test1.proto", protoc_1)
+    generate_proto("protos/python/google/protobuf/internal/factory_test2.proto", protoc_1)
+    generate_proto("protos/python/google/protobuf/internal/more_extensions.proto", protoc_1)
+    generate_proto("protos/python/google/protobuf/internal/more_extensions_dynamic.proto", protoc_1)
+    generate_proto("protos/python/google/protobuf/internal/more_messages.proto", protoc_1)
+    generate_proto("protos/python/google/protobuf/internal/test_bad_identifiers.proto", protoc_1)
+
+    # _build_py is an old-style class, so super() doesn't work.
+    _build_py.run(self)
+
+if __name__ == '__main__':
+  # Keep this list of dependencies in sync with tox.ini.
+  install_requires = ['six>=1.9', 'setuptools']
+  if sys.version_info <= (2,7):
+    install_requires.append('ordereddict')
+    install_requires.append('unittest2')
+
+  setup(
+      name='protobuf',
+      description='Protocol Buffers',
+      download_url='https://github.com/google/protobuf/releases',
+      long_description="Protocol Buffers are Google's data interchange format",
+      url='https://developers.google.com/protocol-buffers/',
+      maintainer='protobuf@googlegroups.com',
+      maintainer_email='protobuf@googlegroups.com',
+      license='3-Clause BSD License',
+      classifiers=[
+        "Programming Language :: Python",
+        "Programming Language :: Python :: 2",
+        "Programming Language :: Python :: 2.6",
+        "Programming Language :: Python :: 2.7",
+        "Programming Language :: Python :: 3",
+        "Programming Language :: Python :: 3.3",
+        "Programming Language :: Python :: 3.4",
+        ],
+      packages=find_packages(
+          exclude=[
+              'import_test_package',
+          ],
+      ),
+      test_suite='tests.google.protobuf.internal',
+      cmdclass={
+          'build_py': build_py,
+      },
+      install_requires=install_requires,
+  )
diff --git a/python/compatibility_tests/v2.5.0/test.sh b/python/compatibility_tests/v2.5.0/test.sh
new file mode 100755
index 0000000..78c16ad
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/test.sh
@@ -0,0 +1,104 @@
+#!/bin/bash
+
+set -ex
+
+# Change to the script's directory.
+cd $(dirname $0)
+
+# Version of the tests (i.e., the version of protobuf from where we extracted
+# these tests).
+TEST_VERSION=2.5.0
+
+# The old version of protobuf that we are testing compatibility against. This
+# is usually the same as TEST_VERSION (i.e., we use the tests extracted from
+# that version to test compatibility of the newest runtime against it), but it
+# is also possible to use this same test set to test the compatibility of the
+# latest version against other versions.
+case "$1" in
+  ""|2.5.0)
+    OLD_VERSION=2.5.0
+    OLD_VERSION_PROTOC=https://github.com/xfxyjwf/protobuf-compiler-release/raw/master/v2.5.0/linux/protoc
+    ;;
+  2.6.1)
+    OLD_VERSION=2.6.1
+    OLD_VERSION_PROTOC=http://repo1.maven.org/maven2/com/google/protobuf/protoc/2.6.1-build2/protoc-2.6.1-build2-linux-x86_64.exe
+    ;;
+  3.0.0-beta-1)
+    OLD_VERSION=3.0.0-beta-1
+    OLD_VERSION_PROTOC=http://repo1.maven.org/maven2/com/google/protobuf/protoc/3.0.0-beta-1/protoc-3.0.0-beta-1-linux-x86_64.exe
+    ;;
+  3.0.0-beta-2)
+    OLD_VERSION=3.0.0-beta-2
+    OLD_VERSION_PROTOC=http://repo1.maven.org/maven2/com/google/protobuf/protoc/3.0.0-beta-2/protoc-3.0.0-beta-2-linux-x86_64.exe
+    ;;
+  3.0.0-beta-3)
+    OLD_VERSION=3.0.0-beta-3
+    OLD_VERSION_PROTOC=http://repo1.maven.org/maven2/com/google/protobuf/protoc/3.0.0-beta-3/protoc-3.0.0-beta-3-linux-x86_64.exe
+    ;;
+  3.0.0-beta-4)
+    OLD_VERSION=3.0.0-beta-4
+    OLD_VERSION_PROTOC=http://repo1.maven.org/maven2/com/google/protobuf/protoc/3.0.0-beta-4/protoc-3.0.0-beta-4-linux-x86_64.exe
+    ;;
+  *)
+    echo "[ERROR]: Unknown version number: $1"
+    exit 1
+    ;;
+esac
+
+# Extract the latest protobuf version number.
+VERSION_NUMBER=`grep "^__version__ = '.*'" ../../google/protobuf/__init__.py | sed "s|__version__ = '\(.*\)'|\1|"`
+
+echo "Running compatibility tests between $VERSION_NUMBER and $OLD_VERSION"
+
+# Check protoc
+[ -f ../../../src/protoc ] || {
+  echo "[ERROR]: Please build protoc first."
+  exit 1
+}
+
+# Test source compatibility. In these tests we recompile everything against
+# the new runtime (including old version generated code).
+rm google -f -r
+mkdir -p google/protobuf/internal
+# Build and copy the new runtime
+cd ../../
+python setup.py build
+cp google/protobuf/*.py compatibility_tests/v2.5.0/google/protobuf/
+cp google/protobuf/internal/*.py compatibility_tests/v2.5.0/google/protobuf/internal/
+cd compatibility_tests/v2.5.0
+cp tests/google/protobuf/internal/test_util.py google/protobuf/internal/
+cp google/protobuf/__init__.py google/
+
+# Download old version protoc compiler (for linux)
+wget $OLD_VERSION_PROTOC -O old_protoc
+chmod +x old_protoc
+
+# Test A.1:
+#   proto set 1: use old version
+#   proto set 2 which may import protos in set 1: use old version
+cp old_protoc protoc_1
+cp old_protoc protoc_2
+python setup.py build
+python setup.py test
+
+# Test A.2:
+#   proto set 1: use new version
+#   proto set 2 which may import protos in set 1: use old version
+cp ../../../src/protoc protoc_1
+cp old_protoc protoc_2
+python setup.py build
+python setup.py test
+
+# Test A.3:
+#   proto set 1: use old version
+#   proto set 2 which may import protos in set 1: use new version
+cp old_protoc protoc_1
+cp ../../../src/protoc protoc_2
+python setup.py build
+python setup.py test
+
+rm google -r -f
+rm build -r -f
+rm protoc_1
+rm protoc_2
+rm old_protoc
diff --git a/python/compatibility_tests/v2.5.0/tests/__init__.py b/python/compatibility_tests/v2.5.0/tests/__init__.py
new file mode 100644
index 0000000..5585614
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/__init__.py
@@ -0,0 +1,4 @@
+try:
+  __import__('pkg_resources').declare_namespace(__name__)
+except ImportError:
+  __path__ = __import__('pkgutil').extend_path(__path__, __name__)
diff --git a/python/compatibility_tests/v2.5.0/tests/google/__init__.py b/python/compatibility_tests/v2.5.0/tests/google/__init__.py
new file mode 100644
index 0000000..5585614
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/__init__.py
@@ -0,0 +1,4 @@
+try:
+  __import__('pkg_resources').declare_namespace(__name__)
+except ImportError:
+  __path__ = __import__('pkgutil').extend_path(__path__, __name__)
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/__init__.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/__init__.py
new file mode 100644
index 0000000..5585614
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/__init__.py
@@ -0,0 +1,4 @@
+try:
+  __import__('pkg_resources').declare_namespace(__name__)
+except ImportError:
+  __path__ = __import__('pkgutil').extend_path(__path__, __name__)
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/__init__.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/__init__.py
new file mode 100644
index 0000000..64c6956
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/__init__.py
@@ -0,0 +1,37 @@
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# https://developers.google.com/protocol-buffers/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+# Copyright 2007 Google Inc. All Rights Reserved.
+
+if __name__ != '__main__':
+  try:
+    __import__('pkg_resources').declare_namespace(__name__)
+  except ImportError:
+    __path__ = __import__('pkgutil').extend_path(__path__, __name__)
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/descriptor_test.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/descriptor_test.py
new file mode 100755
index 0000000..c74f882
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/descriptor_test.py
@@ -0,0 +1,613 @@
+#! /usr/bin/python
+#
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# http://code.google.com/p/protobuf/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""Unittest for google.protobuf.internal.descriptor."""
+
+__author__ = 'robinson@google.com (Will Robinson)'
+
+import unittest
+from google.protobuf import unittest_custom_options_pb2
+from google.protobuf import unittest_import_pb2
+from google.protobuf import unittest_pb2
+from google.protobuf import descriptor_pb2
+from google.protobuf import descriptor
+from google.protobuf import text_format
+
+
+TEST_EMPTY_MESSAGE_DESCRIPTOR_ASCII = """
+name: 'TestEmptyMessage'
+"""
+
+
+class DescriptorTest(unittest.TestCase):
+
+  def setUp(self):
+    self.my_file = descriptor.FileDescriptor(
+        name='some/filename/some.proto',
+        package='protobuf_unittest'
+        )
+    self.my_enum = descriptor.EnumDescriptor(
+        name='ForeignEnum',
+        full_name='protobuf_unittest.ForeignEnum',
+        filename=None,
+        file=self.my_file,
+        values=[
+          descriptor.EnumValueDescriptor(name='FOREIGN_FOO', index=0, number=4),
+          descriptor.EnumValueDescriptor(name='FOREIGN_BAR', index=1, number=5),
+          descriptor.EnumValueDescriptor(name='FOREIGN_BAZ', index=2, number=6),
+        ])
+    self.my_message = descriptor.Descriptor(
+        name='NestedMessage',
+        full_name='protobuf_unittest.TestAllTypes.NestedMessage',
+        filename=None,
+        file=self.my_file,
+        containing_type=None,
+        fields=[
+          descriptor.FieldDescriptor(
+            name='bb',
+            full_name='protobuf_unittest.TestAllTypes.NestedMessage.bb',
+            index=0, number=1,
+            type=5, cpp_type=1, label=1,
+            has_default_value=False, default_value=0,
+            message_type=None, enum_type=None, containing_type=None,
+            is_extension=False, extension_scope=None),
+        ],
+        nested_types=[],
+        enum_types=[
+          self.my_enum,
+        ],
+        extensions=[])
+    self.my_method = descriptor.MethodDescriptor(
+        name='Bar',
+        full_name='protobuf_unittest.TestService.Bar',
+        index=0,
+        containing_service=None,
+        input_type=None,
+        output_type=None)
+    self.my_service = descriptor.ServiceDescriptor(
+        name='TestServiceWithOptions',
+        full_name='protobuf_unittest.TestServiceWithOptions',
+        file=self.my_file,
+        index=0,
+        methods=[
+            self.my_method
+        ])
+
+  def testEnumValueName(self):
+    self.assertEqual(self.my_message.EnumValueName('ForeignEnum', 4),
+                     'FOREIGN_FOO')
+
+    self.assertEqual(
+        self.my_message.enum_types_by_name[
+            'ForeignEnum'].values_by_number[4].name,
+        self.my_message.EnumValueName('ForeignEnum', 4))
+
+  def testEnumFixups(self):
+    self.assertEqual(self.my_enum, self.my_enum.values[0].type)
+
+  def testContainingTypeFixups(self):
+    self.assertEqual(self.my_message, self.my_message.fields[0].containing_type)
+    self.assertEqual(self.my_message, self.my_enum.containing_type)
+
+  def testContainingServiceFixups(self):
+    self.assertEqual(self.my_service, self.my_method.containing_service)
+
+  def testGetOptions(self):
+    self.assertEqual(self.my_enum.GetOptions(),
+                     descriptor_pb2.EnumOptions())
+    self.assertEqual(self.my_enum.values[0].GetOptions(),
+                     descriptor_pb2.EnumValueOptions())
+    self.assertEqual(self.my_message.GetOptions(),
+                     descriptor_pb2.MessageOptions())
+    self.assertEqual(self.my_message.fields[0].GetOptions(),
+                     descriptor_pb2.FieldOptions())
+    self.assertEqual(self.my_method.GetOptions(),
+                     descriptor_pb2.MethodOptions())
+    self.assertEqual(self.my_service.GetOptions(),
+                     descriptor_pb2.ServiceOptions())
+
+  def testSimpleCustomOptions(self):
+    file_descriptor = unittest_custom_options_pb2.DESCRIPTOR
+    message_descriptor =\
+        unittest_custom_options_pb2.TestMessageWithCustomOptions.DESCRIPTOR
+    field_descriptor = message_descriptor.fields_by_name["field1"]
+    enum_descriptor = message_descriptor.enum_types_by_name["AnEnum"]
+    enum_value_descriptor =\
+        message_descriptor.enum_values_by_name["ANENUM_VAL2"]
+    service_descriptor =\
+        unittest_custom_options_pb2.TestServiceWithCustomOptions.DESCRIPTOR
+    method_descriptor = service_descriptor.FindMethodByName("Foo")
+
+    file_options = file_descriptor.GetOptions()
+    file_opt1 = unittest_custom_options_pb2.file_opt1
+    self.assertEqual(9876543210, file_options.Extensions[file_opt1])
+    message_options = message_descriptor.GetOptions()
+    message_opt1 = unittest_custom_options_pb2.message_opt1
+    self.assertEqual(-56, message_options.Extensions[message_opt1])
+    field_options = field_descriptor.GetOptions()
+    field_opt1 = unittest_custom_options_pb2.field_opt1
+    self.assertEqual(8765432109, field_options.Extensions[field_opt1])
+    field_opt2 = unittest_custom_options_pb2.field_opt2
+    self.assertEqual(42, field_options.Extensions[field_opt2])
+    enum_options = enum_descriptor.GetOptions()
+    enum_opt1 = unittest_custom_options_pb2.enum_opt1
+    self.assertEqual(-789, enum_options.Extensions[enum_opt1])
+    enum_value_options = enum_value_descriptor.GetOptions()
+    enum_value_opt1 = unittest_custom_options_pb2.enum_value_opt1
+    self.assertEqual(123, enum_value_options.Extensions[enum_value_opt1])
+
+    service_options = service_descriptor.GetOptions()
+    service_opt1 = unittest_custom_options_pb2.service_opt1
+    self.assertEqual(-9876543210, service_options.Extensions[service_opt1])
+    method_options = method_descriptor.GetOptions()
+    method_opt1 = unittest_custom_options_pb2.method_opt1
+    self.assertEqual(unittest_custom_options_pb2.METHODOPT1_VAL2,
+                     method_options.Extensions[method_opt1])
+
+  def testDifferentCustomOptionTypes(self):
+    kint32min = -2**31
+    kint64min = -2**63
+    kint32max = 2**31 - 1
+    kint64max = 2**63 - 1
+    kuint32max = 2**32 - 1
+    kuint64max = 2**64 - 1
+
+    message_descriptor =\
+        unittest_custom_options_pb2.CustomOptionMinIntegerValues.DESCRIPTOR
+    message_options = message_descriptor.GetOptions()
+    self.assertEqual(False, message_options.Extensions[
+        unittest_custom_options_pb2.bool_opt])
+    self.assertEqual(kint32min, message_options.Extensions[
+        unittest_custom_options_pb2.int32_opt])
+    self.assertEqual(kint64min, message_options.Extensions[
+        unittest_custom_options_pb2.int64_opt])
+    self.assertEqual(0, message_options.Extensions[
+        unittest_custom_options_pb2.uint32_opt])
+    self.assertEqual(0, message_options.Extensions[
+        unittest_custom_options_pb2.uint64_opt])
+    self.assertEqual(kint32min, message_options.Extensions[
+        unittest_custom_options_pb2.sint32_opt])
+    self.assertEqual(kint64min, message_options.Extensions[
+        unittest_custom_options_pb2.sint64_opt])
+    self.assertEqual(0, message_options.Extensions[
+        unittest_custom_options_pb2.fixed32_opt])
+    self.assertEqual(0, message_options.Extensions[
+        unittest_custom_options_pb2.fixed64_opt])
+    self.assertEqual(kint32min, message_options.Extensions[
+        unittest_custom_options_pb2.sfixed32_opt])
+    self.assertEqual(kint64min, message_options.Extensions[
+        unittest_custom_options_pb2.sfixed64_opt])
+
+    message_descriptor =\
+        unittest_custom_options_pb2.CustomOptionMaxIntegerValues.DESCRIPTOR
+    message_options = message_descriptor.GetOptions()
+    self.assertEqual(True, message_options.Extensions[
+        unittest_custom_options_pb2.bool_opt])
+    self.assertEqual(kint32max, message_options.Extensions[
+        unittest_custom_options_pb2.int32_opt])
+    self.assertEqual(kint64max, message_options.Extensions[
+        unittest_custom_options_pb2.int64_opt])
+    self.assertEqual(kuint32max, message_options.Extensions[
+        unittest_custom_options_pb2.uint32_opt])
+    self.assertEqual(kuint64max, message_options.Extensions[
+        unittest_custom_options_pb2.uint64_opt])
+    self.assertEqual(kint32max, message_options.Extensions[
+        unittest_custom_options_pb2.sint32_opt])
+    self.assertEqual(kint64max, message_options.Extensions[
+        unittest_custom_options_pb2.sint64_opt])
+    self.assertEqual(kuint32max, message_options.Extensions[
+        unittest_custom_options_pb2.fixed32_opt])
+    self.assertEqual(kuint64max, message_options.Extensions[
+        unittest_custom_options_pb2.fixed64_opt])
+    self.assertEqual(kint32max, message_options.Extensions[
+        unittest_custom_options_pb2.sfixed32_opt])
+    self.assertEqual(kint64max, message_options.Extensions[
+        unittest_custom_options_pb2.sfixed64_opt])
+
+    message_descriptor =\
+        unittest_custom_options_pb2.CustomOptionOtherValues.DESCRIPTOR
+    message_options = message_descriptor.GetOptions()
+    self.assertEqual(-100, message_options.Extensions[
+        unittest_custom_options_pb2.int32_opt])
+    self.assertAlmostEqual(12.3456789, message_options.Extensions[
+        unittest_custom_options_pb2.float_opt], 6)
+    self.assertAlmostEqual(1.234567890123456789, message_options.Extensions[
+        unittest_custom_options_pb2.double_opt])
+    self.assertEqual("Hello, \"World\"", message_options.Extensions[
+        unittest_custom_options_pb2.string_opt])
+    self.assertEqual("Hello\0World", message_options.Extensions[
+        unittest_custom_options_pb2.bytes_opt])
+    dummy_enum = unittest_custom_options_pb2.DummyMessageContainingEnum
+    self.assertEqual(
+        dummy_enum.TEST_OPTION_ENUM_TYPE2,
+        message_options.Extensions[unittest_custom_options_pb2.enum_opt])
+
+    message_descriptor =\
+        unittest_custom_options_pb2.SettingRealsFromPositiveInts.DESCRIPTOR
+    message_options = message_descriptor.GetOptions()
+    self.assertAlmostEqual(12, message_options.Extensions[
+        unittest_custom_options_pb2.float_opt], 6)
+    self.assertAlmostEqual(154, message_options.Extensions[
+        unittest_custom_options_pb2.double_opt])
+
+    message_descriptor =\
+        unittest_custom_options_pb2.SettingRealsFromNegativeInts.DESCRIPTOR
+    message_options = message_descriptor.GetOptions()
+    self.assertAlmostEqual(-12, message_options.Extensions[
+        unittest_custom_options_pb2.float_opt], 6)
+    self.assertAlmostEqual(-154, message_options.Extensions[
+        unittest_custom_options_pb2.double_opt])
+
+  def testComplexExtensionOptions(self):
+    descriptor =\
+        unittest_custom_options_pb2.VariousComplexOptions.DESCRIPTOR
+    options = descriptor.GetOptions()
+    self.assertEqual(42, options.Extensions[
+        unittest_custom_options_pb2.complex_opt1].foo)
+    self.assertEqual(324, options.Extensions[
+        unittest_custom_options_pb2.complex_opt1].Extensions[
+            unittest_custom_options_pb2.quux])
+    self.assertEqual(876, options.Extensions[
+        unittest_custom_options_pb2.complex_opt1].Extensions[
+            unittest_custom_options_pb2.corge].qux)
+    self.assertEqual(987, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].baz)
+    self.assertEqual(654, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].Extensions[
+            unittest_custom_options_pb2.grault])
+    self.assertEqual(743, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].bar.foo)
+    self.assertEqual(1999, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].bar.Extensions[
+            unittest_custom_options_pb2.quux])
+    self.assertEqual(2008, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].bar.Extensions[
+            unittest_custom_options_pb2.corge].qux)
+    self.assertEqual(741, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].Extensions[
+            unittest_custom_options_pb2.garply].foo)
+    self.assertEqual(1998, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].Extensions[
+            unittest_custom_options_pb2.garply].Extensions[
+                unittest_custom_options_pb2.quux])
+    self.assertEqual(2121, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].Extensions[
+            unittest_custom_options_pb2.garply].Extensions[
+                unittest_custom_options_pb2.corge].qux)
+    self.assertEqual(1971, options.Extensions[
+        unittest_custom_options_pb2.ComplexOptionType2
+        .ComplexOptionType4.complex_opt4].waldo)
+    self.assertEqual(321, options.Extensions[
+        unittest_custom_options_pb2.complex_opt2].fred.waldo)
+    self.assertEqual(9, options.Extensions[
+        unittest_custom_options_pb2.complex_opt3].qux)
+    self.assertEqual(22, options.Extensions[
+        unittest_custom_options_pb2.complex_opt3].complexoptiontype5.plugh)
+    self.assertEqual(24, options.Extensions[
+        unittest_custom_options_pb2.complexopt6].xyzzy)
+
+  # Check that aggregate options were parsed and saved correctly in
+  # the appropriate descriptors.
+  def testAggregateOptions(self):
+    file_descriptor = unittest_custom_options_pb2.DESCRIPTOR
+    message_descriptor =\
+        unittest_custom_options_pb2.AggregateMessage.DESCRIPTOR
+    field_descriptor = message_descriptor.fields_by_name["fieldname"]
+    enum_descriptor = unittest_custom_options_pb2.AggregateEnum.DESCRIPTOR
+    enum_value_descriptor = enum_descriptor.values_by_name["VALUE"]
+    service_descriptor =\
+        unittest_custom_options_pb2.AggregateService.DESCRIPTOR
+    method_descriptor = service_descriptor.FindMethodByName("Method")
+
+    # Tests for the different types of data embedded in fileopt
+    file_options = file_descriptor.GetOptions().Extensions[
+        unittest_custom_options_pb2.fileopt]
+    self.assertEqual(100, file_options.i)
+    self.assertEqual("FileAnnotation", file_options.s)
+    self.assertEqual("NestedFileAnnotation", file_options.sub.s)
+    self.assertEqual("FileExtensionAnnotation", file_options.file.Extensions[
+        unittest_custom_options_pb2.fileopt].s)
+    self.assertEqual("EmbeddedMessageSetElement", file_options.mset.Extensions[
+        unittest_custom_options_pb2.AggregateMessageSetElement
+        .message_set_extension].s)
+
+    # Simple tests for all the other types of annotations
+    self.assertEqual(
+        "MessageAnnotation",
+        message_descriptor.GetOptions().Extensions[
+            unittest_custom_options_pb2.msgopt].s)
+    self.assertEqual(
+        "FieldAnnotation",
+        field_descriptor.GetOptions().Extensions[
+            unittest_custom_options_pb2.fieldopt].s)
+    self.assertEqual(
+        "EnumAnnotation",
+        enum_descriptor.GetOptions().Extensions[
+            unittest_custom_options_pb2.enumopt].s)
+    self.assertEqual(
+        "EnumValueAnnotation",
+        enum_value_descriptor.GetOptions().Extensions[
+            unittest_custom_options_pb2.enumvalopt].s)
+    self.assertEqual(
+        "ServiceAnnotation",
+        service_descriptor.GetOptions().Extensions[
+            unittest_custom_options_pb2.serviceopt].s)
+    self.assertEqual(
+        "MethodAnnotation",
+        method_descriptor.GetOptions().Extensions[
+            unittest_custom_options_pb2.methodopt].s)
+
+  def testNestedOptions(self):
+    nested_message =\
+        unittest_custom_options_pb2.NestedOptionType.NestedMessage.DESCRIPTOR
+    self.assertEqual(1001, nested_message.GetOptions().Extensions[
+        unittest_custom_options_pb2.message_opt1])
+    nested_field = nested_message.fields_by_name["nested_field"]
+    self.assertEqual(1002, nested_field.GetOptions().Extensions[
+        unittest_custom_options_pb2.field_opt1])
+    outer_message =\
+        unittest_custom_options_pb2.NestedOptionType.DESCRIPTOR
+    nested_enum = outer_message.enum_types_by_name["NestedEnum"]
+    self.assertEqual(1003, nested_enum.GetOptions().Extensions[
+        unittest_custom_options_pb2.enum_opt1])
+    nested_enum_value = outer_message.enum_values_by_name["NESTED_ENUM_VALUE"]
+    self.assertEqual(1004, nested_enum_value.GetOptions().Extensions[
+        unittest_custom_options_pb2.enum_value_opt1])
+    nested_extension = outer_message.extensions_by_name["nested_extension"]
+    self.assertEqual(1005, nested_extension.GetOptions().Extensions[
+        unittest_custom_options_pb2.field_opt2])
+
+  def testFileDescriptorReferences(self):
+    self.assertEqual(self.my_enum.file, self.my_file)
+    self.assertEqual(self.my_message.file, self.my_file)
+
+  def testFileDescriptor(self):
+    self.assertEqual(self.my_file.name, 'some/filename/some.proto')
+    self.assertEqual(self.my_file.package, 'protobuf_unittest')
+
+
+class DescriptorCopyToProtoTest(unittest.TestCase):
+  """Tests for CopyTo functions of Descriptor."""
+
+  def _AssertProtoEqual(self, actual_proto, expected_class, expected_ascii):
+    expected_proto = expected_class()
+    text_format.Merge(expected_ascii, expected_proto)
+
+    self.assertEqual(
+        actual_proto, expected_proto,
+        'Not equal,\nActual:\n%s\nExpected:\n%s\n'
+        % (str(actual_proto), str(expected_proto)))
+
+  def _InternalTestCopyToProto(self, desc, expected_proto_class,
+                               expected_proto_ascii):
+    actual = expected_proto_class()
+    desc.CopyToProto(actual)
+    self._AssertProtoEqual(
+        actual, expected_proto_class, expected_proto_ascii)
+
+  def testCopyToProto_EmptyMessage(self):
+    self._InternalTestCopyToProto(
+        unittest_pb2.TestEmptyMessage.DESCRIPTOR,
+        descriptor_pb2.DescriptorProto,
+        TEST_EMPTY_MESSAGE_DESCRIPTOR_ASCII)
+
+  def testCopyToProto_NestedMessage(self):
+    TEST_NESTED_MESSAGE_ASCII = """
+      name: 'NestedMessage'
+      field: <
+        name: 'bb'
+        number: 1
+        label: 1  # Optional
+        type: 5  # TYPE_INT32
+      >
+      """
+
+    self._InternalTestCopyToProto(
+        unittest_pb2.TestAllTypes.NestedMessage.DESCRIPTOR,
+        descriptor_pb2.DescriptorProto,
+        TEST_NESTED_MESSAGE_ASCII)
+
+  def testCopyToProto_ForeignNestedMessage(self):
+    TEST_FOREIGN_NESTED_ASCII = """
+      name: 'TestForeignNested'
+      field: <
+        name: 'foreign_nested'
+        number: 1
+        label: 1  # Optional
+        type: 11  # TYPE_MESSAGE
+        type_name: '.protobuf_unittest.TestAllTypes.NestedMessage'
+      >
+      """
+
+    self._InternalTestCopyToProto(
+        unittest_pb2.TestForeignNested.DESCRIPTOR,
+        descriptor_pb2.DescriptorProto,
+        TEST_FOREIGN_NESTED_ASCII)
+
+  def testCopyToProto_ForeignEnum(self):
+    TEST_FOREIGN_ENUM_ASCII = """
+      name: 'ForeignEnum'
+      value: <
+        name: 'FOREIGN_FOO'
+        number: 4
+      >
+      value: <
+        name: 'FOREIGN_BAR'
+        number: 5
+      >
+      value: <
+        name: 'FOREIGN_BAZ'
+        number: 6
+      >
+      """
+
+    self._InternalTestCopyToProto(
+        unittest_pb2._FOREIGNENUM,
+        descriptor_pb2.EnumDescriptorProto,
+        TEST_FOREIGN_ENUM_ASCII)
+
+  def testCopyToProto_Options(self):
+    TEST_DEPRECATED_FIELDS_ASCII = """
+      name: 'TestDeprecatedFields'
+      field: <
+        name: 'deprecated_int32'
+        number: 1
+        label: 1  # Optional
+        type: 5  # TYPE_INT32
+        options: <
+          deprecated: true
+        >
+      >
+      """
+
+    self._InternalTestCopyToProto(
+        unittest_pb2.TestDeprecatedFields.DESCRIPTOR,
+        descriptor_pb2.DescriptorProto,
+        TEST_DEPRECATED_FIELDS_ASCII)
+
+  def testCopyToProto_AllExtensions(self):
+    TEST_EMPTY_MESSAGE_WITH_EXTENSIONS_ASCII = """
+      name: 'TestEmptyMessageWithExtensions'
+      extension_range: <
+        start: 1
+        end: 536870912
+      >
+      """
+
+    self._InternalTestCopyToProto(
+        unittest_pb2.TestEmptyMessageWithExtensions.DESCRIPTOR,
+        descriptor_pb2.DescriptorProto,
+        TEST_EMPTY_MESSAGE_WITH_EXTENSIONS_ASCII)
+
+  def testCopyToProto_SeveralExtensions(self):
+    TEST_MESSAGE_WITH_SEVERAL_EXTENSIONS_ASCII = """
+      name: 'TestMultipleExtensionRanges'
+      extension_range: <
+        start: 42
+        end: 43
+      >
+      extension_range: <
+        start: 4143
+        end: 4244
+      >
+      extension_range: <
+        start: 65536
+        end: 536870912
+      >
+      """
+
+    self._InternalTestCopyToProto(
+        unittest_pb2.TestMultipleExtensionRanges.DESCRIPTOR,
+        descriptor_pb2.DescriptorProto,
+        TEST_MESSAGE_WITH_SEVERAL_EXTENSIONS_ASCII)
+
+  def testCopyToProto_FileDescriptor(self):
+    UNITTEST_IMPORT_FILE_DESCRIPTOR_ASCII = ("""
+      name: 'google/protobuf/unittest_import.proto'
+      package: 'protobuf_unittest_import'
+      dependency: 'google/protobuf/unittest_import_public.proto'
+      message_type: <
+        name: 'ImportMessage'
+        field: <
+          name: 'd'
+          number: 1
+          label: 1  # Optional
+          type: 5  # TYPE_INT32
+        >
+      >
+      """ +
+      """enum_type: <
+        name: 'ImportEnum'
+        value: <
+          name: 'IMPORT_FOO'
+          number: 7
+        >
+        value: <
+          name: 'IMPORT_BAR'
+          number: 8
+        >
+        value: <
+          name: 'IMPORT_BAZ'
+          number: 9
+        >
+      >
+      options: <
+        java_package: 'com.google.protobuf.test'
+        optimize_for: 1  # SPEED
+      >
+      public_dependency: 0
+      """)
+
+    self._InternalTestCopyToProto(
+        unittest_import_pb2.DESCRIPTOR,
+        descriptor_pb2.FileDescriptorProto,
+        UNITTEST_IMPORT_FILE_DESCRIPTOR_ASCII)
+
+  def testCopyToProto_ServiceDescriptor(self):
+    TEST_SERVICE_ASCII = """
+      name: 'TestService'
+      method: <
+        name: 'Foo'
+        input_type: '.protobuf_unittest.FooRequest'
+        output_type: '.protobuf_unittest.FooResponse'
+      >
+      method: <
+        name: 'Bar'
+        input_type: '.protobuf_unittest.BarRequest'
+        output_type: '.protobuf_unittest.BarResponse'
+      >
+      """
+
+    self._InternalTestCopyToProto(
+        unittest_pb2.TestService.DESCRIPTOR,
+        descriptor_pb2.ServiceDescriptorProto,
+        TEST_SERVICE_ASCII)
+
+
+class MakeDescriptorTest(unittest.TestCase):
+  def testMakeDescriptorWithUnsignedIntField(self):
+    file_descriptor_proto = descriptor_pb2.FileDescriptorProto()
+    file_descriptor_proto.name = 'Foo'
+    message_type = file_descriptor_proto.message_type.add()
+    message_type.name = file_descriptor_proto.name
+    field = message_type.field.add()
+    field.number = 1
+    field.name = 'uint64_field'
+    field.label = descriptor.FieldDescriptor.LABEL_REQUIRED
+    field.type = descriptor.FieldDescriptor.TYPE_UINT64
+    result = descriptor.MakeDescriptor(message_type)
+    self.assertEqual(result.fields[0].cpp_type,
+                     descriptor.FieldDescriptor.CPPTYPE_UINT64)
+
+
+if __name__ == '__main__':
+  unittest.main()
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/generator_test.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/generator_test.py
new file mode 100755
index 0000000..8343aba
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/generator_test.py
@@ -0,0 +1,269 @@
+#! /usr/bin/python
+#
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# http://code.google.com/p/protobuf/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+# TODO(robinson): Flesh this out considerably.  We focused on reflection_test.py
+# first, since it's testing the subtler code, and since it provides decent
+# indirect testing of the protocol compiler output.
+
+"""Unittest that directly tests the output of the pure-Python protocol
+compiler.  See //google/protobuf/reflection_test.py for a test which
+further ensures that we can use Python protocol message objects as we expect.
+"""
+
+__author__ = 'robinson@google.com (Will Robinson)'
+
+import unittest
+from google.protobuf.internal import test_bad_identifiers_pb2
+from google.protobuf import unittest_custom_options_pb2
+from google.protobuf import unittest_import_pb2
+from google.protobuf import unittest_import_public_pb2
+from google.protobuf import unittest_mset_pb2
+from google.protobuf import unittest_pb2
+from google.protobuf import unittest_no_generic_services_pb2
+from google.protobuf import service
+
+MAX_EXTENSION = 536870912
+
+
+class GeneratorTest(unittest.TestCase):
+
+  def testNestedMessageDescriptor(self):
+    field_name = 'optional_nested_message'
+    proto_type = unittest_pb2.TestAllTypes
+    self.assertEqual(
+        proto_type.NestedMessage.DESCRIPTOR,
+        proto_type.DESCRIPTOR.fields_by_name[field_name].message_type)
+
+  def testEnums(self):
+    # We test only module-level enums here.
+    # TODO(robinson): Examine descriptors directly to check
+    # enum descriptor output.
+    self.assertEqual(4, unittest_pb2.FOREIGN_FOO)
+    self.assertEqual(5, unittest_pb2.FOREIGN_BAR)
+    self.assertEqual(6, unittest_pb2.FOREIGN_BAZ)
+
+    proto = unittest_pb2.TestAllTypes()
+    self.assertEqual(1, proto.FOO)
+    self.assertEqual(1, unittest_pb2.TestAllTypes.FOO)
+    self.assertEqual(2, proto.BAR)
+    self.assertEqual(2, unittest_pb2.TestAllTypes.BAR)
+    self.assertEqual(3, proto.BAZ)
+    self.assertEqual(3, unittest_pb2.TestAllTypes.BAZ)
+
+  def testExtremeDefaultValues(self):
+    message = unittest_pb2.TestExtremeDefaultValues()
+
+    # Python pre-2.6 does not have isinf() or isnan() functions, so we have
+    # to provide our own.
+    def isnan(val):
+      # NaN is never equal to itself.
+      return val != val
+    def isinf(val):
+      # Infinity times zero equals NaN.
+      return not isnan(val) and isnan(val * 0)
+
+    self.assertTrue(isinf(message.inf_double))
+    self.assertTrue(message.inf_double > 0)
+    self.assertTrue(isinf(message.neg_inf_double))
+    self.assertTrue(message.neg_inf_double < 0)
+    self.assertTrue(isnan(message.nan_double))
+
+    self.assertTrue(isinf(message.inf_float))
+    self.assertTrue(message.inf_float > 0)
+    self.assertTrue(isinf(message.neg_inf_float))
+    self.assertTrue(message.neg_inf_float < 0)
+    self.assertTrue(isnan(message.nan_float))
+    self.assertEqual("? ? ?? ?? ??? ??/ ??-", message.cpp_trigraph)
+
+  def testHasDefaultValues(self):
+    desc = unittest_pb2.TestAllTypes.DESCRIPTOR
+
+    expected_has_default_by_name = {
+        'optional_int32': False,
+        'repeated_int32': False,
+        'optional_nested_message': False,
+        'default_int32': True,
+    }
+
+    has_default_by_name = dict(
+        [(f.name, f.has_default_value)
+         for f in desc.fields
+         if f.name in expected_has_default_by_name])
+    self.assertEqual(expected_has_default_by_name, has_default_by_name)
+
+  def testContainingTypeBehaviorForExtensions(self):
+    self.assertEqual(unittest_pb2.optional_int32_extension.containing_type,
+                     unittest_pb2.TestAllExtensions.DESCRIPTOR)
+    self.assertEqual(unittest_pb2.TestRequired.single.containing_type,
+                     unittest_pb2.TestAllExtensions.DESCRIPTOR)
+
+  def testExtensionScope(self):
+    self.assertEqual(unittest_pb2.optional_int32_extension.extension_scope,
+                     None)
+    self.assertEqual(unittest_pb2.TestRequired.single.extension_scope,
+                     unittest_pb2.TestRequired.DESCRIPTOR)
+
+  def testIsExtension(self):
+    self.assertTrue(unittest_pb2.optional_int32_extension.is_extension)
+    self.assertTrue(unittest_pb2.TestRequired.single.is_extension)
+
+    message_descriptor = unittest_pb2.TestRequired.DESCRIPTOR
+    non_extension_descriptor = message_descriptor.fields_by_name['a']
+    self.assertTrue(not non_extension_descriptor.is_extension)
+
+  def testOptions(self):
+    proto = unittest_mset_pb2.TestMessageSet()
+    self.assertTrue(proto.DESCRIPTOR.GetOptions().message_set_wire_format)
+
+  def testMessageWithCustomOptions(self):
+    proto = unittest_custom_options_pb2.TestMessageWithCustomOptions()
+    enum_options = proto.DESCRIPTOR.enum_types_by_name['AnEnum'].GetOptions()
+    self.assertTrue(enum_options is not None)
+    # TODO(gps): We really should test for the presence of the enum_opt1
+    # extension and for its value to be set to -789.
+
+  def testNestedTypes(self):
+    self.assertEqual(
+        set(unittest_pb2.TestAllTypes.DESCRIPTOR.nested_types),
+        set([
+            unittest_pb2.TestAllTypes.NestedMessage.DESCRIPTOR,
+            unittest_pb2.TestAllTypes.OptionalGroup.DESCRIPTOR,
+            unittest_pb2.TestAllTypes.RepeatedGroup.DESCRIPTOR,
+        ]))
+    self.assertEqual(unittest_pb2.TestEmptyMessage.DESCRIPTOR.nested_types, [])
+    self.assertEqual(
+        unittest_pb2.TestAllTypes.NestedMessage.DESCRIPTOR.nested_types, [])
+
+  def testContainingType(self):
+    self.assertTrue(
+        unittest_pb2.TestEmptyMessage.DESCRIPTOR.containing_type is None)
+    self.assertTrue(
+        unittest_pb2.TestAllTypes.DESCRIPTOR.containing_type is None)
+    self.assertEqual(
+        unittest_pb2.TestAllTypes.NestedMessage.DESCRIPTOR.containing_type,
+        unittest_pb2.TestAllTypes.DESCRIPTOR)
+    self.assertEqual(
+        unittest_pb2.TestAllTypes.NestedMessage.DESCRIPTOR.containing_type,
+        unittest_pb2.TestAllTypes.DESCRIPTOR)
+    self.assertEqual(
+        unittest_pb2.TestAllTypes.RepeatedGroup.DESCRIPTOR.containing_type,
+        unittest_pb2.TestAllTypes.DESCRIPTOR)
+
+  def testContainingTypeInEnumDescriptor(self):
+    self.assertTrue(unittest_pb2._FOREIGNENUM.containing_type is None)
+    self.assertEqual(unittest_pb2._TESTALLTYPES_NESTEDENUM.containing_type,
+                     unittest_pb2.TestAllTypes.DESCRIPTOR)
+
+  def testPackage(self):
+    self.assertEqual(
+        unittest_pb2.TestAllTypes.DESCRIPTOR.file.package,
+        'protobuf_unittest')
+    desc = unittest_pb2.TestAllTypes.NestedMessage.DESCRIPTOR
+    self.assertEqual(desc.file.package, 'protobuf_unittest')
+    self.assertEqual(
+        unittest_import_pb2.ImportMessage.DESCRIPTOR.file.package,
+        'protobuf_unittest_import')
+
+    self.assertEqual(
+        unittest_pb2._FOREIGNENUM.file.package, 'protobuf_unittest')
+    self.assertEqual(
+        unittest_pb2._TESTALLTYPES_NESTEDENUM.file.package,
+        'protobuf_unittest')
+    self.assertEqual(
+        unittest_import_pb2._IMPORTENUM.file.package,
+        'protobuf_unittest_import')
+
+  def testExtensionRange(self):
+    self.assertEqual(
+        unittest_pb2.TestAllTypes.DESCRIPTOR.extension_ranges, [])
+    self.assertEqual(
+        unittest_pb2.TestAllExtensions.DESCRIPTOR.extension_ranges,
+        [(1, MAX_EXTENSION)])
+    self.assertEqual(
+        unittest_pb2.TestMultipleExtensionRanges.DESCRIPTOR.extension_ranges,
+        [(42, 43), (4143, 4244), (65536, MAX_EXTENSION)])
+
+  def testFileDescriptor(self):
+    self.assertEqual(unittest_pb2.DESCRIPTOR.name,
+                     'google/protobuf/unittest.proto')
+    self.assertEqual(unittest_pb2.DESCRIPTOR.package, 'protobuf_unittest')
+    self.assertFalse(unittest_pb2.DESCRIPTOR.serialized_pb is None)
+
+  def testNoGenericServices(self):
+    self.assertTrue(hasattr(unittest_no_generic_services_pb2, "TestMessage"))
+    self.assertTrue(hasattr(unittest_no_generic_services_pb2, "FOO"))
+    self.assertTrue(hasattr(unittest_no_generic_services_pb2, "test_extension"))
+
+    # Make sure unittest_no_generic_services_pb2 has no services subclassing
+    # Proto2 Service class.
+    if hasattr(unittest_no_generic_services_pb2, "TestService"):
+      self.assertFalse(issubclass(unittest_no_generic_services_pb2.TestService,
+                                  service.Service))
+
+  def testMessageTypesByName(self):
+    file_type = unittest_pb2.DESCRIPTOR
+    self.assertEqual(
+        unittest_pb2._TESTALLTYPES,
+        file_type.message_types_by_name[unittest_pb2._TESTALLTYPES.name])
+
+    # Nested messages shouldn't be included in the message_types_by_name
+    # dictionary (like in the C++ API).
+    self.assertFalse(
+        unittest_pb2._TESTALLTYPES_NESTEDMESSAGE.name in
+        file_type.message_types_by_name)
+
+  def testPublicImports(self):
+    # Test public imports as embedded message.
+    all_type_proto = unittest_pb2.TestAllTypes()
+    self.assertEqual(0, all_type_proto.optional_public_import_message.e)
+
+    # PublicImportMessage is actually defined in the unittest_import_public_pb2
+    # module, and is publicly imported by the unittest_import_pb2 module.
+    public_import_proto = unittest_import_pb2.PublicImportMessage()
+    self.assertEqual(0, public_import_proto.e)
+    self.assertTrue(unittest_import_public_pb2.PublicImportMessage is
+                    unittest_import_pb2.PublicImportMessage)
+
+  def testBadIdentifiers(self):
+    # We're just testing that the code was imported without problems.
+    message = test_bad_identifiers_pb2.TestBadIdentifiers()
+    self.assertEqual(message.Extensions[test_bad_identifiers_pb2.message],
+                     "foo")
+    self.assertEqual(message.Extensions[test_bad_identifiers_pb2.descriptor],
+                     "bar")
+    self.assertEqual(message.Extensions[test_bad_identifiers_pb2.reflection],
+                     "baz")
+    self.assertEqual(message.Extensions[test_bad_identifiers_pb2.service],
+                     "qux")
+
+if __name__ == '__main__':
+  unittest.main()
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/golden_message b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/golden_message
new file mode 100644
index 0000000..4dd62cd
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/golden_message
Binary files differ
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/golden_packed_fields_message b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/golden_packed_fields_message
new file mode 100644
index 0000000..ee28d38
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/golden_packed_fields_message
Binary files differ
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/message_test.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/message_test.py
new file mode 100755
index 0000000..e71b295
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/message_test.py
@@ -0,0 +1,499 @@
+#! /usr/bin/python
+#
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# http://code.google.com/p/protobuf/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""Tests python protocol buffers against the golden message.
+
+Note that the golden messages exercise every known field type, thus this
+test ends up exercising and verifying nearly all of the parsing and
+serialization code in the whole library.
+
+TODO(kenton):  Merge with wire_format_test?  It doesn't make a whole lot of
+sense to call this a test of the "message" module, which only declares an
+abstract interface.
+"""
+
+__author__ = 'gps@google.com (Gregory P. Smith)'
+
+import copy
+import math
+import operator
+import pickle
+
+import unittest
+from google.protobuf import unittest_import_pb2
+from google.protobuf import unittest_pb2
+from google.protobuf.internal import api_implementation
+from google.protobuf.internal import test_util
+from google.protobuf import message
+
+try:
+  cmp                                   # Python 2
+except NameError:
+  cmp = lambda x, y: (x > y) - (x < y)  # Python 3
+
+# Python pre-2.6 does not have isinf() or isnan() functions, so we have
+# to provide our own.
+def isnan(val):
+  # NaN is never equal to itself.
+  return val != val
+def isinf(val):
+  # Infinity times zero equals NaN.
+  return not isnan(val) and isnan(val * 0)
+def IsPosInf(val):
+  return isinf(val) and (val > 0)
+def IsNegInf(val):
+  return isinf(val) and (val < 0)
+
+class MessageTest(unittest.TestCase):
+
+  def testGoldenMessage(self):
+    golden_data = test_util.GoldenFile('golden_message').read()
+    golden_message = unittest_pb2.TestAllTypes()
+    golden_message.ParseFromString(golden_data)
+    test_util.ExpectAllFieldsSet(self, golden_message)
+    self.assertEqual(golden_data, golden_message.SerializeToString())
+    golden_copy = copy.deepcopy(golden_message)
+    self.assertEqual(golden_data, golden_copy.SerializeToString())
+
+  def testGoldenExtensions(self):
+    golden_data = test_util.GoldenFile('golden_message').read()
+    golden_message = unittest_pb2.TestAllExtensions()
+    golden_message.ParseFromString(golden_data)
+    all_set = unittest_pb2.TestAllExtensions()
+    test_util.SetAllExtensions(all_set)
+    self.assertEquals(all_set, golden_message)
+    self.assertEqual(golden_data, golden_message.SerializeToString())
+    golden_copy = copy.deepcopy(golden_message)
+    self.assertEqual(golden_data, golden_copy.SerializeToString())
+
+  def testGoldenPackedMessage(self):
+    golden_data = test_util.GoldenFile('golden_packed_fields_message').read()
+    golden_message = unittest_pb2.TestPackedTypes()
+    golden_message.ParseFromString(golden_data)
+    all_set = unittest_pb2.TestPackedTypes()
+    test_util.SetAllPackedFields(all_set)
+    self.assertEquals(all_set, golden_message)
+    self.assertEqual(golden_data, all_set.SerializeToString())
+    golden_copy = copy.deepcopy(golden_message)
+    self.assertEqual(golden_data, golden_copy.SerializeToString())
+
+  def testGoldenPackedExtensions(self):
+    golden_data = test_util.GoldenFile('golden_packed_fields_message').read()
+    golden_message = unittest_pb2.TestPackedExtensions()
+    golden_message.ParseFromString(golden_data)
+    all_set = unittest_pb2.TestPackedExtensions()
+    test_util.SetAllPackedExtensions(all_set)
+    self.assertEquals(all_set, golden_message)
+    self.assertEqual(golden_data, all_set.SerializeToString())
+    golden_copy = copy.deepcopy(golden_message)
+    self.assertEqual(golden_data, golden_copy.SerializeToString())
+
+  def testPickleSupport(self):
+    golden_data = test_util.GoldenFile('golden_message').read()
+    golden_message = unittest_pb2.TestAllTypes()
+    golden_message.ParseFromString(golden_data)
+    pickled_message = pickle.dumps(golden_message)
+
+    unpickled_message = pickle.loads(pickled_message)
+    self.assertEquals(unpickled_message, golden_message)
+
+  def testPickleIncompleteProto(self):
+    golden_message = unittest_pb2.TestRequired(a=1)
+    pickled_message = pickle.dumps(golden_message)
+
+    unpickled_message = pickle.loads(pickled_message)
+    self.assertEquals(unpickled_message, golden_message)
+    self.assertEquals(unpickled_message.a, 1)
+    # This is still an incomplete proto - so serializing should fail
+    self.assertRaises(message.EncodeError, unpickled_message.SerializeToString)
+
+  def testPositiveInfinity(self):
+    golden_data = ('\x5D\x00\x00\x80\x7F'
+                   '\x61\x00\x00\x00\x00\x00\x00\xF0\x7F'
+                   '\xCD\x02\x00\x00\x80\x7F'
+                   '\xD1\x02\x00\x00\x00\x00\x00\x00\xF0\x7F')
+    golden_message = unittest_pb2.TestAllTypes()
+    golden_message.ParseFromString(golden_data)
+    self.assertTrue(IsPosInf(golden_message.optional_float))
+    self.assertTrue(IsPosInf(golden_message.optional_double))
+    self.assertTrue(IsPosInf(golden_message.repeated_float[0]))
+    self.assertTrue(IsPosInf(golden_message.repeated_double[0]))
+    self.assertEqual(golden_data, golden_message.SerializeToString())
+
+  def testNegativeInfinity(self):
+    golden_data = ('\x5D\x00\x00\x80\xFF'
+                   '\x61\x00\x00\x00\x00\x00\x00\xF0\xFF'
+                   '\xCD\x02\x00\x00\x80\xFF'
+                   '\xD1\x02\x00\x00\x00\x00\x00\x00\xF0\xFF')
+    golden_message = unittest_pb2.TestAllTypes()
+    golden_message.ParseFromString(golden_data)
+    self.assertTrue(IsNegInf(golden_message.optional_float))
+    self.assertTrue(IsNegInf(golden_message.optional_double))
+    self.assertTrue(IsNegInf(golden_message.repeated_float[0]))
+    self.assertTrue(IsNegInf(golden_message.repeated_double[0]))
+    self.assertEqual(golden_data, golden_message.SerializeToString())
+
+  def testNotANumber(self):
+    golden_data = ('\x5D\x00\x00\xC0\x7F'
+                   '\x61\x00\x00\x00\x00\x00\x00\xF8\x7F'
+                   '\xCD\x02\x00\x00\xC0\x7F'
+                   '\xD1\x02\x00\x00\x00\x00\x00\x00\xF8\x7F')
+    golden_message = unittest_pb2.TestAllTypes()
+    golden_message.ParseFromString(golden_data)
+    self.assertTrue(isnan(golden_message.optional_float))
+    self.assertTrue(isnan(golden_message.optional_double))
+    self.assertTrue(isnan(golden_message.repeated_float[0]))
+    self.assertTrue(isnan(golden_message.repeated_double[0]))
+
+    # The protocol buffer may serialize to any one of multiple different
+    # representations of a NaN.  Rather than verify a specific representation,
+    # verify the serialized string can be converted into a correctly
+    # behaving protocol buffer.
+    serialized = golden_message.SerializeToString()
+    message = unittest_pb2.TestAllTypes()
+    message.ParseFromString(serialized)
+    self.assertTrue(isnan(message.optional_float))
+    self.assertTrue(isnan(message.optional_double))
+    self.assertTrue(isnan(message.repeated_float[0]))
+    self.assertTrue(isnan(message.repeated_double[0]))
+
+  def testPositiveInfinityPacked(self):
+    golden_data = ('\xA2\x06\x04\x00\x00\x80\x7F'
+                   '\xAA\x06\x08\x00\x00\x00\x00\x00\x00\xF0\x7F')
+    golden_message = unittest_pb2.TestPackedTypes()
+    golden_message.ParseFromString(golden_data)
+    self.assertTrue(IsPosInf(golden_message.packed_float[0]))
+    self.assertTrue(IsPosInf(golden_message.packed_double[0]))
+    self.assertEqual(golden_data, golden_message.SerializeToString())
+
+  def testNegativeInfinityPacked(self):
+    golden_data = ('\xA2\x06\x04\x00\x00\x80\xFF'
+                   '\xAA\x06\x08\x00\x00\x00\x00\x00\x00\xF0\xFF')
+    golden_message = unittest_pb2.TestPackedTypes()
+    golden_message.ParseFromString(golden_data)
+    self.assertTrue(IsNegInf(golden_message.packed_float[0]))
+    self.assertTrue(IsNegInf(golden_message.packed_double[0]))
+    self.assertEqual(golden_data, golden_message.SerializeToString())
+
+  def testNotANumberPacked(self):
+    golden_data = ('\xA2\x06\x04\x00\x00\xC0\x7F'
+                   '\xAA\x06\x08\x00\x00\x00\x00\x00\x00\xF8\x7F')
+    golden_message = unittest_pb2.TestPackedTypes()
+    golden_message.ParseFromString(golden_data)
+    self.assertTrue(isnan(golden_message.packed_float[0]))
+    self.assertTrue(isnan(golden_message.packed_double[0]))
+
+    serialized = golden_message.SerializeToString()
+    message = unittest_pb2.TestPackedTypes()
+    message.ParseFromString(serialized)
+    self.assertTrue(isnan(message.packed_float[0]))
+    self.assertTrue(isnan(message.packed_double[0]))
+
+  def testExtremeFloatValues(self):
+    message = unittest_pb2.TestAllTypes()
+
+    # Most positive exponent, no significand bits set.
+    kMostPosExponentNoSigBits = math.pow(2, 127)
+    message.optional_float = kMostPosExponentNoSigBits
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_float == kMostPosExponentNoSigBits)
+
+    # Most positive exponent, one significand bit set.
+    kMostPosExponentOneSigBit = 1.5 * math.pow(2, 127)
+    message.optional_float = kMostPosExponentOneSigBit
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_float == kMostPosExponentOneSigBit)
+
+    # Repeat last two cases with values of same magnitude, but negative.
+    message.optional_float = -kMostPosExponentNoSigBits
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_float == -kMostPosExponentNoSigBits)
+
+    message.optional_float = -kMostPosExponentOneSigBit
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_float == -kMostPosExponentOneSigBit)
+
+    # Most negative exponent, no significand bits set.
+    kMostNegExponentNoSigBits = math.pow(2, -127)
+    message.optional_float = kMostNegExponentNoSigBits
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_float == kMostNegExponentNoSigBits)
+
+    # Most negative exponent, one significand bit set.
+    kMostNegExponentOneSigBit = 1.5 * math.pow(2, -127)
+    message.optional_float = kMostNegExponentOneSigBit
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_float == kMostNegExponentOneSigBit)
+
+    # Repeat last two cases with values of the same magnitude, but negative.
+    message.optional_float = -kMostNegExponentNoSigBits
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_float == -kMostNegExponentNoSigBits)
+
+    message.optional_float = -kMostNegExponentOneSigBit
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_float == -kMostNegExponentOneSigBit)
+
+  def testExtremeDoubleValues(self):
+    message = unittest_pb2.TestAllTypes()
+
+    # Most positive exponent, no significand bits set.
+    kMostPosExponentNoSigBits = math.pow(2, 1023)
+    message.optional_double = kMostPosExponentNoSigBits
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_double == kMostPosExponentNoSigBits)
+
+    # Most positive exponent, one significand bit set.
+    kMostPosExponentOneSigBit = 1.5 * math.pow(2, 1023)
+    message.optional_double = kMostPosExponentOneSigBit
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_double == kMostPosExponentOneSigBit)
+
+    # Repeat last two cases with values of same magnitude, but negative.
+    message.optional_double = -kMostPosExponentNoSigBits
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_double == -kMostPosExponentNoSigBits)
+
+    message.optional_double = -kMostPosExponentOneSigBit
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_double == -kMostPosExponentOneSigBit)
+
+    # Most negative exponent, no significand bits set.
+    kMostNegExponentNoSigBits = math.pow(2, -1023)
+    message.optional_double = kMostNegExponentNoSigBits
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_double == kMostNegExponentNoSigBits)
+
+    # Most negative exponent, one significand bit set.
+    kMostNegExponentOneSigBit = 1.5 * math.pow(2, -1023)
+    message.optional_double = kMostNegExponentOneSigBit
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_double == kMostNegExponentOneSigBit)
+
+    # Repeat last two cases with values of the same magnitude, but negative.
+    message.optional_double = -kMostNegExponentNoSigBits
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_double == -kMostNegExponentNoSigBits)
+
+    message.optional_double = -kMostNegExponentOneSigBit
+    message.ParseFromString(message.SerializeToString())
+    self.assertTrue(message.optional_double == -kMostNegExponentOneSigBit)
+
+  def testSortingRepeatedScalarFieldsDefaultComparator(self):
+    """Check some different types with the default comparator."""
+    message = unittest_pb2.TestAllTypes()
+
+    # TODO(mattp): would testing more scalar types strengthen test?
+    message.repeated_int32.append(1)
+    message.repeated_int32.append(3)
+    message.repeated_int32.append(2)
+    message.repeated_int32.sort()
+    self.assertEqual(message.repeated_int32[0], 1)
+    self.assertEqual(message.repeated_int32[1], 2)
+    self.assertEqual(message.repeated_int32[2], 3)
+
+    message.repeated_float.append(1.1)
+    message.repeated_float.append(1.3)
+    message.repeated_float.append(1.2)
+    message.repeated_float.sort()
+    self.assertAlmostEqual(message.repeated_float[0], 1.1)
+    self.assertAlmostEqual(message.repeated_float[1], 1.2)
+    self.assertAlmostEqual(message.repeated_float[2], 1.3)
+
+    message.repeated_string.append('a')
+    message.repeated_string.append('c')
+    message.repeated_string.append('b')
+    message.repeated_string.sort()
+    self.assertEqual(message.repeated_string[0], 'a')
+    self.assertEqual(message.repeated_string[1], 'b')
+    self.assertEqual(message.repeated_string[2], 'c')
+
+    message.repeated_bytes.append('a')
+    message.repeated_bytes.append('c')
+    message.repeated_bytes.append('b')
+    message.repeated_bytes.sort()
+    self.assertEqual(message.repeated_bytes[0], 'a')
+    self.assertEqual(message.repeated_bytes[1], 'b')
+    self.assertEqual(message.repeated_bytes[2], 'c')
+
+  def testSortingRepeatedScalarFieldsCustomComparator(self):
+    """Check some different types with custom comparator."""
+    message = unittest_pb2.TestAllTypes()
+
+    message.repeated_int32.append(-3)
+    message.repeated_int32.append(-2)
+    message.repeated_int32.append(-1)
+    message.repeated_int32.sort(lambda x,y: cmp(abs(x), abs(y)))
+    self.assertEqual(message.repeated_int32[0], -1)
+    self.assertEqual(message.repeated_int32[1], -2)
+    self.assertEqual(message.repeated_int32[2], -3)
+
+    message.repeated_string.append('aaa')
+    message.repeated_string.append('bb')
+    message.repeated_string.append('c')
+    message.repeated_string.sort(lambda x,y: cmp(len(x), len(y)))
+    self.assertEqual(message.repeated_string[0], 'c')
+    self.assertEqual(message.repeated_string[1], 'bb')
+    self.assertEqual(message.repeated_string[2], 'aaa')
+
+  def testSortingRepeatedCompositeFieldsCustomComparator(self):
+    """Check passing a custom comparator to sort a repeated composite field."""
+    message = unittest_pb2.TestAllTypes()
+
+    message.repeated_nested_message.add().bb = 1
+    message.repeated_nested_message.add().bb = 3
+    message.repeated_nested_message.add().bb = 2
+    message.repeated_nested_message.add().bb = 6
+    message.repeated_nested_message.add().bb = 5
+    message.repeated_nested_message.add().bb = 4
+    message.repeated_nested_message.sort(lambda x,y: cmp(x.bb, y.bb))
+    self.assertEqual(message.repeated_nested_message[0].bb, 1)
+    self.assertEqual(message.repeated_nested_message[1].bb, 2)
+    self.assertEqual(message.repeated_nested_message[2].bb, 3)
+    self.assertEqual(message.repeated_nested_message[3].bb, 4)
+    self.assertEqual(message.repeated_nested_message[4].bb, 5)
+    self.assertEqual(message.repeated_nested_message[5].bb, 6)
+
+  def testRepeatedCompositeFieldSortArguments(self):
+    """Check sorting a repeated composite field using list.sort() arguments."""
+    message = unittest_pb2.TestAllTypes()
+
+    get_bb = operator.attrgetter('bb')
+    cmp_bb = lambda a, b: cmp(a.bb, b.bb)
+    message.repeated_nested_message.add().bb = 1
+    message.repeated_nested_message.add().bb = 3
+    message.repeated_nested_message.add().bb = 2
+    message.repeated_nested_message.add().bb = 6
+    message.repeated_nested_message.add().bb = 5
+    message.repeated_nested_message.add().bb = 4
+    message.repeated_nested_message.sort(key=get_bb)
+    self.assertEqual([k.bb for k in message.repeated_nested_message],
+                     [1, 2, 3, 4, 5, 6])
+    message.repeated_nested_message.sort(key=get_bb, reverse=True)
+    self.assertEqual([k.bb for k in message.repeated_nested_message],
+                     [6, 5, 4, 3, 2, 1])
+    message.repeated_nested_message.sort(sort_function=cmp_bb)
+    self.assertEqual([k.bb for k in message.repeated_nested_message],
+                     [1, 2, 3, 4, 5, 6])
+    message.repeated_nested_message.sort(cmp=cmp_bb, reverse=True)
+    self.assertEqual([k.bb for k in message.repeated_nested_message],
+                     [6, 5, 4, 3, 2, 1])
+
+  def testRepeatedScalarFieldSortArguments(self):
+    """Check sorting a scalar field using list.sort() arguments."""
+    message = unittest_pb2.TestAllTypes()
+
+    abs_cmp = lambda a, b: cmp(abs(a), abs(b))
+    message.repeated_int32.append(-3)
+    message.repeated_int32.append(-2)
+    message.repeated_int32.append(-1)
+    message.repeated_int32.sort(key=abs)
+    self.assertEqual(list(message.repeated_int32), [-1, -2, -3])
+    message.repeated_int32.sort(key=abs, reverse=True)
+    self.assertEqual(list(message.repeated_int32), [-3, -2, -1])
+    message.repeated_int32.sort(sort_function=abs_cmp)
+    self.assertEqual(list(message.repeated_int32), [-1, -2, -3])
+    message.repeated_int32.sort(cmp=abs_cmp, reverse=True)
+    self.assertEqual(list(message.repeated_int32), [-3, -2, -1])
+
+    len_cmp = lambda a, b: cmp(len(a), len(b))
+    message.repeated_string.append('aaa')
+    message.repeated_string.append('bb')
+    message.repeated_string.append('c')
+    message.repeated_string.sort(key=len)
+    self.assertEqual(list(message.repeated_string), ['c', 'bb', 'aaa'])
+    message.repeated_string.sort(key=len, reverse=True)
+    self.assertEqual(list(message.repeated_string), ['aaa', 'bb', 'c'])
+    message.repeated_string.sort(sort_function=len_cmp)
+    self.assertEqual(list(message.repeated_string), ['c', 'bb', 'aaa'])
+    message.repeated_string.sort(cmp=len_cmp, reverse=True)
+    self.assertEqual(list(message.repeated_string), ['aaa', 'bb', 'c'])
+
+  def testParsingMerge(self):
+    """Check the merge behavior when a required or optional field appears
+    multiple times in the input."""
+    messages = [
+        unittest_pb2.TestAllTypes(),
+        unittest_pb2.TestAllTypes(),
+        unittest_pb2.TestAllTypes() ]
+    messages[0].optional_int32 = 1
+    messages[1].optional_int64 = 2
+    messages[2].optional_int32 = 3
+    messages[2].optional_string = 'hello'
+
+    merged_message = unittest_pb2.TestAllTypes()
+    merged_message.optional_int32 = 3
+    merged_message.optional_int64 = 2
+    merged_message.optional_string = 'hello'
+
+    generator = unittest_pb2.TestParsingMerge.RepeatedFieldsGenerator()
+    generator.field1.extend(messages)
+    generator.field2.extend(messages)
+    generator.field3.extend(messages)
+    generator.ext1.extend(messages)
+    generator.ext2.extend(messages)
+    generator.group1.add().field1.MergeFrom(messages[0])
+    generator.group1.add().field1.MergeFrom(messages[1])
+    generator.group1.add().field1.MergeFrom(messages[2])
+    generator.group2.add().field1.MergeFrom(messages[0])
+    generator.group2.add().field1.MergeFrom(messages[1])
+    generator.group2.add().field1.MergeFrom(messages[2])
+
+    data = generator.SerializeToString()
+    parsing_merge = unittest_pb2.TestParsingMerge()
+    parsing_merge.ParseFromString(data)
+
+    # Required and optional fields should be merged.
+    self.assertEqual(parsing_merge.required_all_types, merged_message)
+    self.assertEqual(parsing_merge.optional_all_types, merged_message)
+    self.assertEqual(parsing_merge.optionalgroup.optional_group_all_types,
+                     merged_message)
+    self.assertEqual(parsing_merge.Extensions[
+                     unittest_pb2.TestParsingMerge.optional_ext],
+                     merged_message)
+
+    # Repeated fields should not be merged.
+    self.assertEqual(len(parsing_merge.repeated_all_types), 3)
+    self.assertEqual(len(parsing_merge.repeatedgroup), 3)
+    self.assertEqual(len(parsing_merge.Extensions[
+        unittest_pb2.TestParsingMerge.repeated_ext]), 3)
+
+
+  def testSortEmptyRepeatedCompositeContainer(self):
+    """Exercise a scenario that has led to segfaults in the past.
+    """
+    m = unittest_pb2.TestAllTypes()
+    m.repeated_nested_message.sort()
+
+
+if __name__ == '__main__':
+  unittest.main()
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/service_reflection_test.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/service_reflection_test.py
new file mode 100755
index 0000000..e04f825
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/service_reflection_test.py
@@ -0,0 +1,136 @@
+#! /usr/bin/python
+#
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# http://code.google.com/p/protobuf/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""Tests for google.protobuf.internal.service_reflection."""
+
+__author__ = 'petar@google.com (Petar Petrov)'
+
+import unittest
+from google.protobuf import unittest_pb2
+from google.protobuf import service_reflection
+from google.protobuf import service
+
+
+class FooUnitTest(unittest.TestCase):
+
+  def testService(self):
+    class MockRpcChannel(service.RpcChannel):
+      def CallMethod(self, method, controller, request, response, callback):
+        self.method = method
+        self.controller = controller
+        self.request = request
+        callback(response)
+
+    class MockRpcController(service.RpcController):
+      def SetFailed(self, msg):
+        self.failure_message = msg
+
+    self.callback_response = None
+
+    class MyService(unittest_pb2.TestService):
+      pass
+
+    self.callback_response = None
+
+    def MyCallback(response):
+      self.callback_response = response
+
+    rpc_controller = MockRpcController()
+    channel = MockRpcChannel()
+    srvc = MyService()
+    srvc.Foo(rpc_controller, unittest_pb2.FooRequest(), MyCallback)
+    self.assertEqual('Method Foo not implemented.',
+                     rpc_controller.failure_message)
+    self.assertEqual(None, self.callback_response)
+
+    rpc_controller.failure_message = None
+
+    service_descriptor = unittest_pb2.TestService.GetDescriptor()
+    srvc.CallMethod(service_descriptor.methods[1], rpc_controller,
+                    unittest_pb2.BarRequest(), MyCallback)
+    self.assertEqual('Method Bar not implemented.',
+                     rpc_controller.failure_message)
+    self.assertEqual(None, self.callback_response)
+
+    class MyServiceImpl(unittest_pb2.TestService):
+      def Foo(self, rpc_controller, request, done):
+        self.foo_called = True
+      def Bar(self, rpc_controller, request, done):
+        self.bar_called = True
+
+    srvc = MyServiceImpl()
+    rpc_controller.failure_message = None
+    srvc.Foo(rpc_controller, unittest_pb2.FooRequest(), MyCallback)
+    self.assertEqual(None, rpc_controller.failure_message)
+    self.assertEqual(True, srvc.foo_called)
+
+    rpc_controller.failure_message = None
+    srvc.CallMethod(service_descriptor.methods[1], rpc_controller,
+                    unittest_pb2.BarRequest(), MyCallback)
+    self.assertEqual(None, rpc_controller.failure_message)
+    self.assertEqual(True, srvc.bar_called)
+
+  def testServiceStub(self):
+    class MockRpcChannel(service.RpcChannel):
+      def CallMethod(self, method, controller, request,
+                     response_class, callback):
+        self.method = method
+        self.controller = controller
+        self.request = request
+        callback(response_class())
+
+    self.callback_response = None
+
+    def MyCallback(response):
+      self.callback_response = response
+
+    channel = MockRpcChannel()
+    stub = unittest_pb2.TestService_Stub(channel)
+    rpc_controller = 'controller'
+    request = 'request'
+
+    # GetDescriptor is now static, but still works as an instance method for
+    # compatibility.
+    self.assertEqual(unittest_pb2.TestService_Stub.GetDescriptor(),
+                     stub.GetDescriptor())
+
+    # Invoke method.
+    stub.Foo(rpc_controller, request, MyCallback)
+
+    self.assertTrue(isinstance(self.callback_response,
+                               unittest_pb2.FooResponse))
+    self.assertEqual(request, channel.request)
+    self.assertEqual(rpc_controller, channel.controller)
+    self.assertEqual(stub.GetDescriptor().methods[0], channel.method)
+
+
+if __name__ == '__main__':
+  unittest.main()
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/test_util.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/test_util.py
new file mode 100755
index 0000000..e2c9db0
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/test_util.py
@@ -0,0 +1,651 @@
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# http://code.google.com/p/protobuf/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""Utilities for Python proto2 tests.
+
+This is intentionally modeled on C++ code in
+//google/protobuf/test_util.*.
+"""
+
+__author__ = 'robinson@google.com (Will Robinson)'
+
+import os.path
+
+from google.protobuf import unittest_import_pb2
+from google.protobuf import unittest_pb2
+
+
+def SetAllNonLazyFields(message):
+  """Sets every non-lazy field in the message to a unique value.
+
+  Args:
+    message: A unittest_pb2.TestAllTypes instance.
+  """
+
+  #
+  # Optional fields.
+  #
+
+  message.optional_int32    = 101
+  message.optional_int64    = 102
+  message.optional_uint32   = 103
+  message.optional_uint64   = 104
+  message.optional_sint32   = 105
+  message.optional_sint64   = 106
+  message.optional_fixed32  = 107
+  message.optional_fixed64  = 108
+  message.optional_sfixed32 = 109
+  message.optional_sfixed64 = 110
+  message.optional_float    = 111
+  message.optional_double   = 112
+  message.optional_bool     = True
+  # TODO(robinson): Firmly spec out and test how
+  # protos interact with unicode.  One specific example:
+  # what happens if we change the literal below to
+  # u'115'?  What *should* happen?  Still some discussion
+  # to finish with Kenton about bytes vs. strings
+  # and forcing everything to be utf8. :-/
+  message.optional_string   = '115'
+  message.optional_bytes    = '116'
+
+  message.optionalgroup.a = 117
+  message.optional_nested_message.bb = 118
+  message.optional_foreign_message.c = 119
+  message.optional_import_message.d = 120
+  message.optional_public_import_message.e = 126
+
+  message.optional_nested_enum = unittest_pb2.TestAllTypes.BAZ
+  message.optional_foreign_enum = unittest_pb2.FOREIGN_BAZ
+  message.optional_import_enum = unittest_import_pb2.IMPORT_BAZ
+
+  message.optional_string_piece = '124'
+  message.optional_cord = '125'
+
+  #
+  # Repeated fields.
+  #
+
+  message.repeated_int32.append(201)
+  message.repeated_int64.append(202)
+  message.repeated_uint32.append(203)
+  message.repeated_uint64.append(204)
+  message.repeated_sint32.append(205)
+  message.repeated_sint64.append(206)
+  message.repeated_fixed32.append(207)
+  message.repeated_fixed64.append(208)
+  message.repeated_sfixed32.append(209)
+  message.repeated_sfixed64.append(210)
+  message.repeated_float.append(211)
+  message.repeated_double.append(212)
+  message.repeated_bool.append(True)
+  message.repeated_string.append('215')
+  message.repeated_bytes.append('216')
+
+  message.repeatedgroup.add().a = 217
+  message.repeated_nested_message.add().bb = 218
+  message.repeated_foreign_message.add().c = 219
+  message.repeated_import_message.add().d = 220
+  message.repeated_lazy_message.add().bb = 227
+
+  message.repeated_nested_enum.append(unittest_pb2.TestAllTypes.BAR)
+  message.repeated_foreign_enum.append(unittest_pb2.FOREIGN_BAR)
+  message.repeated_import_enum.append(unittest_import_pb2.IMPORT_BAR)
+
+  message.repeated_string_piece.append('224')
+  message.repeated_cord.append('225')
+
+  # Add a second one of each field.
+  message.repeated_int32.append(301)
+  message.repeated_int64.append(302)
+  message.repeated_uint32.append(303)
+  message.repeated_uint64.append(304)
+  message.repeated_sint32.append(305)
+  message.repeated_sint64.append(306)
+  message.repeated_fixed32.append(307)
+  message.repeated_fixed64.append(308)
+  message.repeated_sfixed32.append(309)
+  message.repeated_sfixed64.append(310)
+  message.repeated_float.append(311)
+  message.repeated_double.append(312)
+  message.repeated_bool.append(False)
+  message.repeated_string.append('315')
+  message.repeated_bytes.append('316')
+
+  message.repeatedgroup.add().a = 317
+  message.repeated_nested_message.add().bb = 318
+  message.repeated_foreign_message.add().c = 319
+  message.repeated_import_message.add().d = 320
+  message.repeated_lazy_message.add().bb = 327
+
+  message.repeated_nested_enum.append(unittest_pb2.TestAllTypes.BAZ)
+  message.repeated_foreign_enum.append(unittest_pb2.FOREIGN_BAZ)
+  message.repeated_import_enum.append(unittest_import_pb2.IMPORT_BAZ)
+
+  message.repeated_string_piece.append('324')
+  message.repeated_cord.append('325')
+
+  #
+  # Fields that have defaults.
+  #
+
+  message.default_int32 = 401
+  message.default_int64 = 402
+  message.default_uint32 = 403
+  message.default_uint64 = 404
+  message.default_sint32 = 405
+  message.default_sint64 = 406
+  message.default_fixed32 = 407
+  message.default_fixed64 = 408
+  message.default_sfixed32 = 409
+  message.default_sfixed64 = 410
+  message.default_float = 411
+  message.default_double = 412
+  message.default_bool = False
+  message.default_string = '415'
+  message.default_bytes = '416'
+
+  message.default_nested_enum = unittest_pb2.TestAllTypes.FOO
+  message.default_foreign_enum = unittest_pb2.FOREIGN_FOO
+  message.default_import_enum = unittest_import_pb2.IMPORT_FOO
+
+  message.default_string_piece = '424'
+  message.default_cord = '425'
+
+
+def SetAllFields(message):
+  SetAllNonLazyFields(message)
+  message.optional_lazy_message.bb = 127
+
+
+def SetAllExtensions(message):
+  """Sets every extension in the message to a unique value.
+
+  Args:
+    message: A unittest_pb2.TestAllExtensions instance.
+  """
+
+  extensions = message.Extensions
+  pb2 = unittest_pb2
+  import_pb2 = unittest_import_pb2
+
+  #
+  # Optional fields.
+  #
+
+  extensions[pb2.optional_int32_extension] = 101
+  extensions[pb2.optional_int64_extension] = 102
+  extensions[pb2.optional_uint32_extension] = 103
+  extensions[pb2.optional_uint64_extension] = 104
+  extensions[pb2.optional_sint32_extension] = 105
+  extensions[pb2.optional_sint64_extension] = 106
+  extensions[pb2.optional_fixed32_extension] = 107
+  extensions[pb2.optional_fixed64_extension] = 108
+  extensions[pb2.optional_sfixed32_extension] = 109
+  extensions[pb2.optional_sfixed64_extension] = 110
+  extensions[pb2.optional_float_extension] = 111
+  extensions[pb2.optional_double_extension] = 112
+  extensions[pb2.optional_bool_extension] = True
+  extensions[pb2.optional_string_extension] = '115'
+  extensions[pb2.optional_bytes_extension] = '116'
+
+  extensions[pb2.optionalgroup_extension].a = 117
+  extensions[pb2.optional_nested_message_extension].bb = 118
+  extensions[pb2.optional_foreign_message_extension].c = 119
+  extensions[pb2.optional_import_message_extension].d = 120
+  extensions[pb2.optional_public_import_message_extension].e = 126
+  extensions[pb2.optional_lazy_message_extension].bb = 127
+
+  extensions[pb2.optional_nested_enum_extension] = pb2.TestAllTypes.BAZ
+  extensions[pb2.optional_foreign_enum_extension] = pb2.FOREIGN_BAZ
+  extensions[pb2.optional_import_enum_extension] = import_pb2.IMPORT_BAZ
+
+  extensions[pb2.optional_string_piece_extension] = '124'
+  extensions[pb2.optional_cord_extension] = '125'
+
+  #
+  # Repeated fields.
+  #
+
+  extensions[pb2.repeated_int32_extension].append(201)
+  extensions[pb2.repeated_int64_extension].append(202)
+  extensions[pb2.repeated_uint32_extension].append(203)
+  extensions[pb2.repeated_uint64_extension].append(204)
+  extensions[pb2.repeated_sint32_extension].append(205)
+  extensions[pb2.repeated_sint64_extension].append(206)
+  extensions[pb2.repeated_fixed32_extension].append(207)
+  extensions[pb2.repeated_fixed64_extension].append(208)
+  extensions[pb2.repeated_sfixed32_extension].append(209)
+  extensions[pb2.repeated_sfixed64_extension].append(210)
+  extensions[pb2.repeated_float_extension].append(211)
+  extensions[pb2.repeated_double_extension].append(212)
+  extensions[pb2.repeated_bool_extension].append(True)
+  extensions[pb2.repeated_string_extension].append('215')
+  extensions[pb2.repeated_bytes_extension].append('216')
+
+  extensions[pb2.repeatedgroup_extension].add().a = 217
+  extensions[pb2.repeated_nested_message_extension].add().bb = 218
+  extensions[pb2.repeated_foreign_message_extension].add().c = 219
+  extensions[pb2.repeated_import_message_extension].add().d = 220
+  extensions[pb2.repeated_lazy_message_extension].add().bb = 227
+
+  extensions[pb2.repeated_nested_enum_extension].append(pb2.TestAllTypes.BAR)
+  extensions[pb2.repeated_foreign_enum_extension].append(pb2.FOREIGN_BAR)
+  extensions[pb2.repeated_import_enum_extension].append(import_pb2.IMPORT_BAR)
+
+  extensions[pb2.repeated_string_piece_extension].append('224')
+  extensions[pb2.repeated_cord_extension].append('225')
+
+  # Append a second one of each field.
+  extensions[pb2.repeated_int32_extension].append(301)
+  extensions[pb2.repeated_int64_extension].append(302)
+  extensions[pb2.repeated_uint32_extension].append(303)
+  extensions[pb2.repeated_uint64_extension].append(304)
+  extensions[pb2.repeated_sint32_extension].append(305)
+  extensions[pb2.repeated_sint64_extension].append(306)
+  extensions[pb2.repeated_fixed32_extension].append(307)
+  extensions[pb2.repeated_fixed64_extension].append(308)
+  extensions[pb2.repeated_sfixed32_extension].append(309)
+  extensions[pb2.repeated_sfixed64_extension].append(310)
+  extensions[pb2.repeated_float_extension].append(311)
+  extensions[pb2.repeated_double_extension].append(312)
+  extensions[pb2.repeated_bool_extension].append(False)
+  extensions[pb2.repeated_string_extension].append('315')
+  extensions[pb2.repeated_bytes_extension].append('316')
+
+  extensions[pb2.repeatedgroup_extension].add().a = 317
+  extensions[pb2.repeated_nested_message_extension].add().bb = 318
+  extensions[pb2.repeated_foreign_message_extension].add().c = 319
+  extensions[pb2.repeated_import_message_extension].add().d = 320
+  extensions[pb2.repeated_lazy_message_extension].add().bb = 327
+
+  extensions[pb2.repeated_nested_enum_extension].append(pb2.TestAllTypes.BAZ)
+  extensions[pb2.repeated_foreign_enum_extension].append(pb2.FOREIGN_BAZ)
+  extensions[pb2.repeated_import_enum_extension].append(import_pb2.IMPORT_BAZ)
+
+  extensions[pb2.repeated_string_piece_extension].append('324')
+  extensions[pb2.repeated_cord_extension].append('325')
+
+  #
+  # Fields with defaults.
+  #
+
+  extensions[pb2.default_int32_extension] = 401
+  extensions[pb2.default_int64_extension] = 402
+  extensions[pb2.default_uint32_extension] = 403
+  extensions[pb2.default_uint64_extension] = 404
+  extensions[pb2.default_sint32_extension] = 405
+  extensions[pb2.default_sint64_extension] = 406
+  extensions[pb2.default_fixed32_extension] = 407
+  extensions[pb2.default_fixed64_extension] = 408
+  extensions[pb2.default_sfixed32_extension] = 409
+  extensions[pb2.default_sfixed64_extension] = 410
+  extensions[pb2.default_float_extension] = 411
+  extensions[pb2.default_double_extension] = 412
+  extensions[pb2.default_bool_extension] = False
+  extensions[pb2.default_string_extension] = '415'
+  extensions[pb2.default_bytes_extension] = '416'
+
+  extensions[pb2.default_nested_enum_extension] = pb2.TestAllTypes.FOO
+  extensions[pb2.default_foreign_enum_extension] = pb2.FOREIGN_FOO
+  extensions[pb2.default_import_enum_extension] = import_pb2.IMPORT_FOO
+
+  extensions[pb2.default_string_piece_extension] = '424'
+  extensions[pb2.default_cord_extension] = '425'
+
+
+def SetAllFieldsAndExtensions(message):
+  """Sets every field and extension in the message to a unique value.
+
+  Args:
+    message: A unittest_pb2.TestAllExtensions message.
+  """
+  message.my_int = 1
+  message.my_string = 'foo'
+  message.my_float = 1.0
+  message.Extensions[unittest_pb2.my_extension_int] = 23
+  message.Extensions[unittest_pb2.my_extension_string] = 'bar'
+
+
+def ExpectAllFieldsAndExtensionsInOrder(serialized):
+  """Ensures that serialized is the serialization we expect for a message
+  filled with SetAllFieldsAndExtensions().  (Specifically, ensures that the
+  serialization is in canonical, tag-number order).
+  """
+  my_extension_int = unittest_pb2.my_extension_int
+  my_extension_string = unittest_pb2.my_extension_string
+  expected_strings = []
+  message = unittest_pb2.TestFieldOrderings()
+  message.my_int = 1  # Field 1.
+  expected_strings.append(message.SerializeToString())
+  message.Clear()
+  message.Extensions[my_extension_int] = 23  # Field 5.
+  expected_strings.append(message.SerializeToString())
+  message.Clear()
+  message.my_string = 'foo'  # Field 11.
+  expected_strings.append(message.SerializeToString())
+  message.Clear()
+  message.Extensions[my_extension_string] = 'bar'  # Field 50.
+  expected_strings.append(message.SerializeToString())
+  message.Clear()
+  message.my_float = 1.0
+  expected_strings.append(message.SerializeToString())
+  message.Clear()
+  expected = ''.join(expected_strings)
+
+  if expected != serialized:
+    raise ValueError('Expected %r, found %r' % (expected, serialized))
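The check above relies on a property of the wire format: serializing each field on its own and concatenating the results in tag-number order reproduces the canonical serialization, because each field is just a varint key followed by its payload. A hand-rolled sketch of that key encoding (for illustration only, not the protobuf library):

```python
def encode_varint(n):
    # Base-128 varint: 7 payload bits per byte, high bit set on all but
    # the last byte.
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def encode_tag(field_number, wire_type):
    # A field's key is (field_number << 3) | wire_type, itself varint-encoded.
    return encode_varint((field_number << 3) | wire_type)

# my_int is field 1 with wire type 0 (varint); value 1 encodes as b'\x08\x01'.
print(encode_tag(1, 0) + encode_varint(1))  # b'\x08\x01'
```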
+
+
+def ExpectAllFieldsSet(test_case, message):
+  """Checks that all fields have the expected values after Set*Fields() is called."""
+  test_case.assertTrue(message.HasField('optional_int32'))
+  test_case.assertTrue(message.HasField('optional_int64'))
+  test_case.assertTrue(message.HasField('optional_uint32'))
+  test_case.assertTrue(message.HasField('optional_uint64'))
+  test_case.assertTrue(message.HasField('optional_sint32'))
+  test_case.assertTrue(message.HasField('optional_sint64'))
+  test_case.assertTrue(message.HasField('optional_fixed32'))
+  test_case.assertTrue(message.HasField('optional_fixed64'))
+  test_case.assertTrue(message.HasField('optional_sfixed32'))
+  test_case.assertTrue(message.HasField('optional_sfixed64'))
+  test_case.assertTrue(message.HasField('optional_float'))
+  test_case.assertTrue(message.HasField('optional_double'))
+  test_case.assertTrue(message.HasField('optional_bool'))
+  test_case.assertTrue(message.HasField('optional_string'))
+  test_case.assertTrue(message.HasField('optional_bytes'))
+
+  test_case.assertTrue(message.HasField('optionalgroup'))
+  test_case.assertTrue(message.HasField('optional_nested_message'))
+  test_case.assertTrue(message.HasField('optional_foreign_message'))
+  test_case.assertTrue(message.HasField('optional_import_message'))
+
+  test_case.assertTrue(message.optionalgroup.HasField('a'))
+  test_case.assertTrue(message.optional_nested_message.HasField('bb'))
+  test_case.assertTrue(message.optional_foreign_message.HasField('c'))
+  test_case.assertTrue(message.optional_import_message.HasField('d'))
+
+  test_case.assertTrue(message.HasField('optional_nested_enum'))
+  test_case.assertTrue(message.HasField('optional_foreign_enum'))
+  test_case.assertTrue(message.HasField('optional_import_enum'))
+
+  test_case.assertTrue(message.HasField('optional_string_piece'))
+  test_case.assertTrue(message.HasField('optional_cord'))
+
+  test_case.assertEqual(101, message.optional_int32)
+  test_case.assertEqual(102, message.optional_int64)
+  test_case.assertEqual(103, message.optional_uint32)
+  test_case.assertEqual(104, message.optional_uint64)
+  test_case.assertEqual(105, message.optional_sint32)
+  test_case.assertEqual(106, message.optional_sint64)
+  test_case.assertEqual(107, message.optional_fixed32)
+  test_case.assertEqual(108, message.optional_fixed64)
+  test_case.assertEqual(109, message.optional_sfixed32)
+  test_case.assertEqual(110, message.optional_sfixed64)
+  test_case.assertEqual(111, message.optional_float)
+  test_case.assertEqual(112, message.optional_double)
+  test_case.assertEqual(True, message.optional_bool)
+  test_case.assertEqual('115', message.optional_string)
+  test_case.assertEqual('116', message.optional_bytes)
+
+  test_case.assertEqual(117, message.optionalgroup.a)
+  test_case.assertEqual(118, message.optional_nested_message.bb)
+  test_case.assertEqual(119, message.optional_foreign_message.c)
+  test_case.assertEqual(120, message.optional_import_message.d)
+  test_case.assertEqual(126, message.optional_public_import_message.e)
+  test_case.assertEqual(127, message.optional_lazy_message.bb)
+
+  test_case.assertEqual(unittest_pb2.TestAllTypes.BAZ,
+                        message.optional_nested_enum)
+  test_case.assertEqual(unittest_pb2.FOREIGN_BAZ,
+                        message.optional_foreign_enum)
+  test_case.assertEqual(unittest_import_pb2.IMPORT_BAZ,
+                        message.optional_import_enum)
+
+  # -----------------------------------------------------------------
+
+  test_case.assertEqual(2, len(message.repeated_int32))
+  test_case.assertEqual(2, len(message.repeated_int64))
+  test_case.assertEqual(2, len(message.repeated_uint32))
+  test_case.assertEqual(2, len(message.repeated_uint64))
+  test_case.assertEqual(2, len(message.repeated_sint32))
+  test_case.assertEqual(2, len(message.repeated_sint64))
+  test_case.assertEqual(2, len(message.repeated_fixed32))
+  test_case.assertEqual(2, len(message.repeated_fixed64))
+  test_case.assertEqual(2, len(message.repeated_sfixed32))
+  test_case.assertEqual(2, len(message.repeated_sfixed64))
+  test_case.assertEqual(2, len(message.repeated_float))
+  test_case.assertEqual(2, len(message.repeated_double))
+  test_case.assertEqual(2, len(message.repeated_bool))
+  test_case.assertEqual(2, len(message.repeated_string))
+  test_case.assertEqual(2, len(message.repeated_bytes))
+
+  test_case.assertEqual(2, len(message.repeatedgroup))
+  test_case.assertEqual(2, len(message.repeated_nested_message))
+  test_case.assertEqual(2, len(message.repeated_foreign_message))
+  test_case.assertEqual(2, len(message.repeated_import_message))
+  test_case.assertEqual(2, len(message.repeated_nested_enum))
+  test_case.assertEqual(2, len(message.repeated_foreign_enum))
+  test_case.assertEqual(2, len(message.repeated_import_enum))
+
+  test_case.assertEqual(2, len(message.repeated_string_piece))
+  test_case.assertEqual(2, len(message.repeated_cord))
+
+  test_case.assertEqual(201, message.repeated_int32[0])
+  test_case.assertEqual(202, message.repeated_int64[0])
+  test_case.assertEqual(203, message.repeated_uint32[0])
+  test_case.assertEqual(204, message.repeated_uint64[0])
+  test_case.assertEqual(205, message.repeated_sint32[0])
+  test_case.assertEqual(206, message.repeated_sint64[0])
+  test_case.assertEqual(207, message.repeated_fixed32[0])
+  test_case.assertEqual(208, message.repeated_fixed64[0])
+  test_case.assertEqual(209, message.repeated_sfixed32[0])
+  test_case.assertEqual(210, message.repeated_sfixed64[0])
+  test_case.assertEqual(211, message.repeated_float[0])
+  test_case.assertEqual(212, message.repeated_double[0])
+  test_case.assertEqual(True, message.repeated_bool[0])
+  test_case.assertEqual('215', message.repeated_string[0])
+  test_case.assertEqual('216', message.repeated_bytes[0])
+
+  test_case.assertEqual(217, message.repeatedgroup[0].a)
+  test_case.assertEqual(218, message.repeated_nested_message[0].bb)
+  test_case.assertEqual(219, message.repeated_foreign_message[0].c)
+  test_case.assertEqual(220, message.repeated_import_message[0].d)
+  test_case.assertEqual(227, message.repeated_lazy_message[0].bb)
+
+  test_case.assertEqual(unittest_pb2.TestAllTypes.BAR,
+                        message.repeated_nested_enum[0])
+  test_case.assertEqual(unittest_pb2.FOREIGN_BAR,
+                        message.repeated_foreign_enum[0])
+  test_case.assertEqual(unittest_import_pb2.IMPORT_BAR,
+                        message.repeated_import_enum[0])
+
+  test_case.assertEqual(301, message.repeated_int32[1])
+  test_case.assertEqual(302, message.repeated_int64[1])
+  test_case.assertEqual(303, message.repeated_uint32[1])
+  test_case.assertEqual(304, message.repeated_uint64[1])
+  test_case.assertEqual(305, message.repeated_sint32[1])
+  test_case.assertEqual(306, message.repeated_sint64[1])
+  test_case.assertEqual(307, message.repeated_fixed32[1])
+  test_case.assertEqual(308, message.repeated_fixed64[1])
+  test_case.assertEqual(309, message.repeated_sfixed32[1])
+  test_case.assertEqual(310, message.repeated_sfixed64[1])
+  test_case.assertEqual(311, message.repeated_float[1])
+  test_case.assertEqual(312, message.repeated_double[1])
+  test_case.assertEqual(False, message.repeated_bool[1])
+  test_case.assertEqual('315', message.repeated_string[1])
+  test_case.assertEqual('316', message.repeated_bytes[1])
+
+  test_case.assertEqual(317, message.repeatedgroup[1].a)
+  test_case.assertEqual(318, message.repeated_nested_message[1].bb)
+  test_case.assertEqual(319, message.repeated_foreign_message[1].c)
+  test_case.assertEqual(320, message.repeated_import_message[1].d)
+  test_case.assertEqual(327, message.repeated_lazy_message[1].bb)
+
+  test_case.assertEqual(unittest_pb2.TestAllTypes.BAZ,
+                        message.repeated_nested_enum[1])
+  test_case.assertEqual(unittest_pb2.FOREIGN_BAZ,
+                        message.repeated_foreign_enum[1])
+  test_case.assertEqual(unittest_import_pb2.IMPORT_BAZ,
+                        message.repeated_import_enum[1])
+
+  # -----------------------------------------------------------------
+
+  test_case.assertTrue(message.HasField('default_int32'))
+  test_case.assertTrue(message.HasField('default_int64'))
+  test_case.assertTrue(message.HasField('default_uint32'))
+  test_case.assertTrue(message.HasField('default_uint64'))
+  test_case.assertTrue(message.HasField('default_sint32'))
+  test_case.assertTrue(message.HasField('default_sint64'))
+  test_case.assertTrue(message.HasField('default_fixed32'))
+  test_case.assertTrue(message.HasField('default_fixed64'))
+  test_case.assertTrue(message.HasField('default_sfixed32'))
+  test_case.assertTrue(message.HasField('default_sfixed64'))
+  test_case.assertTrue(message.HasField('default_float'))
+  test_case.assertTrue(message.HasField('default_double'))
+  test_case.assertTrue(message.HasField('default_bool'))
+  test_case.assertTrue(message.HasField('default_string'))
+  test_case.assertTrue(message.HasField('default_bytes'))
+
+  test_case.assertTrue(message.HasField('default_nested_enum'))
+  test_case.assertTrue(message.HasField('default_foreign_enum'))
+  test_case.assertTrue(message.HasField('default_import_enum'))
+
+  test_case.assertEqual(401, message.default_int32)
+  test_case.assertEqual(402, message.default_int64)
+  test_case.assertEqual(403, message.default_uint32)
+  test_case.assertEqual(404, message.default_uint64)
+  test_case.assertEqual(405, message.default_sint32)
+  test_case.assertEqual(406, message.default_sint64)
+  test_case.assertEqual(407, message.default_fixed32)
+  test_case.assertEqual(408, message.default_fixed64)
+  test_case.assertEqual(409, message.default_sfixed32)
+  test_case.assertEqual(410, message.default_sfixed64)
+  test_case.assertEqual(411, message.default_float)
+  test_case.assertEqual(412, message.default_double)
+  test_case.assertEqual(False, message.default_bool)
+  test_case.assertEqual('415', message.default_string)
+  test_case.assertEqual('416', message.default_bytes)
+
+  test_case.assertEqual(unittest_pb2.TestAllTypes.FOO,
+                        message.default_nested_enum)
+  test_case.assertEqual(unittest_pb2.FOREIGN_FOO,
+                        message.default_foreign_enum)
+  test_case.assertEqual(unittest_import_pb2.IMPORT_FOO,
+                        message.default_import_enum)
+
+
+def GoldenFile(filename):
+  """Finds the given golden file and returns a file object representing it."""
+
+  # Search up the directory tree looking for the C++ protobuf source code.
+  path = '.'
+  while os.path.exists(path):
+    if os.path.exists(os.path.join(path, 'tests/google/protobuf/internal')):
+      # Found it.  Load the golden file from the testdata directory.
+      full_path = os.path.join(path, 'tests/google/protobuf/internal', filename)
+      return open(full_path, 'rb')
+    path = os.path.join(path, '..')
+
+  raise RuntimeError(
+    'Could not find golden files.  This test must be run from within the '
+    'protobuf source package so that it can read test data files from the '
+    'C++ source tree.')
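GoldenFile's upward directory search can be sketched as a standalone helper. This version adds an explicit filesystem-root check so the walk always terminates; the helper name and marker file are hypothetical, not part of the test suite:

```python
import os
import os.path
import tempfile

def find_upward(start, marker):
    # Walk from `start` toward the filesystem root, returning the first
    # directory that contains `marker`, or None if the root is reached.
    path = os.path.abspath(start)
    while True:
        if os.path.exists(os.path.join(path, marker)):
            return path
        parent = os.path.dirname(path)
        if parent == path:  # dirname of the root is the root itself
            return None
        path = parent

# Demonstration: create <tmp>/a/b plus a marker file in <tmp>/a, then
# search upward from <tmp>/a/b.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'a', 'b'))
open(os.path.join(root, 'a', 'MARKER'), 'w').close()
found = find_upward(os.path.join(root, 'a', 'b'), 'MARKER')
print(found == os.path.join(root, 'a'))  # True
```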
+
+
+def SetAllPackedFields(message):
+  """Sets every field in the message to a unique value.
+
+  Args:
+    message: A unittest_pb2.TestPackedTypes instance.
+  """
+  message.packed_int32.extend([601, 701])
+  message.packed_int64.extend([602, 702])
+  message.packed_uint32.extend([603, 703])
+  message.packed_uint64.extend([604, 704])
+  message.packed_sint32.extend([605, 705])
+  message.packed_sint64.extend([606, 706])
+  message.packed_fixed32.extend([607, 707])
+  message.packed_fixed64.extend([608, 708])
+  message.packed_sfixed32.extend([609, 709])
+  message.packed_sfixed64.extend([610, 710])
+  message.packed_float.extend([611.0, 711.0])
+  message.packed_double.extend([612.0, 712.0])
+  message.packed_bool.extend([True, False])
+  message.packed_enum.extend([unittest_pb2.FOREIGN_BAR,
+                              unittest_pb2.FOREIGN_BAZ])
+
+
+def SetAllPackedExtensions(message):
+  """Sets every extension in the message to a unique value.
+
+  Args:
+    message: A unittest_pb2.TestPackedExtensions instance.
+  """
+  extensions = message.Extensions
+  pb2 = unittest_pb2
+
+  extensions[pb2.packed_int32_extension].extend([601, 701])
+  extensions[pb2.packed_int64_extension].extend([602, 702])
+  extensions[pb2.packed_uint32_extension].extend([603, 703])
+  extensions[pb2.packed_uint64_extension].extend([604, 704])
+  extensions[pb2.packed_sint32_extension].extend([605, 705])
+  extensions[pb2.packed_sint64_extension].extend([606, 706])
+  extensions[pb2.packed_fixed32_extension].extend([607, 707])
+  extensions[pb2.packed_fixed64_extension].extend([608, 708])
+  extensions[pb2.packed_sfixed32_extension].extend([609, 709])
+  extensions[pb2.packed_sfixed64_extension].extend([610, 710])
+  extensions[pb2.packed_float_extension].extend([611.0, 711.0])
+  extensions[pb2.packed_double_extension].extend([612.0, 712.0])
+  extensions[pb2.packed_bool_extension].extend([True, False])
+  extensions[pb2.packed_enum_extension].extend([unittest_pb2.FOREIGN_BAR,
+                                                unittest_pb2.FOREIGN_BAZ])
+
+
+def SetAllUnpackedFields(message):
+  """Sets every field in the message to a unique value.
+
+  Args:
+    message: A unittest_pb2.TestUnpackedTypes instance.
+  """
+  message.unpacked_int32.extend([601, 701])
+  message.unpacked_int64.extend([602, 702])
+  message.unpacked_uint32.extend([603, 703])
+  message.unpacked_uint64.extend([604, 704])
+  message.unpacked_sint32.extend([605, 705])
+  message.unpacked_sint64.extend([606, 706])
+  message.unpacked_fixed32.extend([607, 707])
+  message.unpacked_fixed64.extend([608, 708])
+  message.unpacked_sfixed32.extend([609, 709])
+  message.unpacked_sfixed64.extend([610, 710])
+  message.unpacked_float.extend([611.0, 711.0])
+  message.unpacked_double.extend([612.0, 712.0])
+  message.unpacked_bool.extend([True, False])
+  message.unpacked_enum.extend([unittest_pb2.FOREIGN_BAR,
+                                unittest_pb2.FOREIGN_BAZ])
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_test.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_test.py
new file mode 100755
index 0000000..8267cd2
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_test.py
@@ -0,0 +1,619 @@
+#! /usr/bin/python
+#
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# http://code.google.com/p/protobuf/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""Test for google.protobuf.text_format."""
+
+__author__ = 'kenton@google.com (Kenton Varda)'
+
+import difflib
+import re
+
+import unittest
+from google.protobuf import text_format
+from google.protobuf.internal import test_util
+from google.protobuf import unittest_pb2
+from google.protobuf import unittest_mset_pb2
+
+
+class TextFormatTest(unittest.TestCase):
+  def ReadGolden(self, golden_filename):
+    f = test_util.GoldenFile(golden_filename)
+    golden_lines = f.readlines()
+    f.close()
+    return golden_lines
+
+  def CompareToGoldenFile(self, text, golden_filename):
+    golden_lines = self.ReadGolden(golden_filename)
+    self.CompareToGoldenLines(text, golden_lines)
+
+  def CompareToGoldenText(self, text, golden_text):
+    self.CompareToGoldenLines(text, golden_text.splitlines(True))
+
+  def CompareToGoldenLines(self, text, golden_lines):
+    actual_lines = text.splitlines(True)
+    self.assertEqual(golden_lines, actual_lines,
+      "Text doesn't match golden.  Diff:\n" +
+      ''.join(difflib.ndiff(golden_lines, actual_lines)))
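CompareToGoldenLines builds its failure message with `difflib.ndiff`, which marks removed lines with `-` and added lines with `+`. The same idea in a self-contained form (the helper name is illustrative):

```python
import difflib

def diff_message(golden_lines, actual_lines):
    # Mirror the failure-message format used by CompareToGoldenLines:
    # an ndiff listing of the golden vs. actual lines.
    return ("Text doesn't match golden.  Diff:\n" +
            ''.join(difflib.ndiff(golden_lines, actual_lines)))

golden = ['a: 1\n', 'b: 2\n']
actual = ['a: 1\n', 'b: 3\n']
msg = diff_message(golden, actual)
print('- b: 2' in msg and '+ b: 3' in msg)  # True
```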
+
+  def testPrintAllFields(self):
+    message = unittest_pb2.TestAllTypes()
+    test_util.SetAllFields(message)
+    self.CompareToGoldenFile(
+      self.RemoveRedundantZeros(text_format.MessageToString(message)),
+      'text_format_unittest_data.txt')
+
+  def testPrintAllExtensions(self):
+    message = unittest_pb2.TestAllExtensions()
+    test_util.SetAllExtensions(message)
+    self.CompareToGoldenFile(
+      self.RemoveRedundantZeros(text_format.MessageToString(message)),
+      'text_format_unittest_extensions_data.txt')
+
+  def testPrintMessageSet(self):
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    message.message_set.Extensions[ext1].i = 23
+    message.message_set.Extensions[ext2].str = 'foo'
+    self.CompareToGoldenText(text_format.MessageToString(message),
+      'message_set {\n'
+      '  [protobuf_unittest.TestMessageSetExtension1] {\n'
+      '    i: 23\n'
+      '  }\n'
+      '  [protobuf_unittest.TestMessageSetExtension2] {\n'
+      '    str: \"foo\"\n'
+      '  }\n'
+      '}\n')
+
+  """
+  def testPrintBadEnumValue(self):
+    message = unittest_pb2.TestAllTypes()
+    message.optional_nested_enum = 100
+    message.optional_foreign_enum = 101
+    message.optional_import_enum = 102
+    self.CompareToGoldenText(
+        text_format.MessageToString(message),
+        'optional_nested_enum: 100\n'
+        'optional_foreign_enum: 101\n'
+        'optional_import_enum: 102\n')
+
+  def testPrintBadEnumValueExtensions(self):
+    message = unittest_pb2.TestAllExtensions()
+    message.Extensions[unittest_pb2.optional_nested_enum_extension] = 100
+    message.Extensions[unittest_pb2.optional_foreign_enum_extension] = 101
+    message.Extensions[unittest_pb2.optional_import_enum_extension] = 102
+    self.CompareToGoldenText(
+        text_format.MessageToString(message),
+        '[protobuf_unittest.optional_nested_enum_extension]: 100\n'
+        '[protobuf_unittest.optional_foreign_enum_extension]: 101\n'
+        '[protobuf_unittest.optional_import_enum_extension]: 102\n')
+  """
+
+  def testPrintExotic(self):
+    message = unittest_pb2.TestAllTypes()
+    message.repeated_int64.append(-9223372036854775808)
+    message.repeated_uint64.append(18446744073709551615)
+    message.repeated_double.append(123.456)
+    message.repeated_double.append(1.23e22)
+    message.repeated_double.append(1.23e-18)
+    message.repeated_string.append('\000\001\a\b\f\n\r\t\v\\\'"')
+    message.repeated_string.append(u'\u00fc\ua71f')
+    self.CompareToGoldenText(
+      self.RemoveRedundantZeros(text_format.MessageToString(message)),
+      'repeated_int64: -9223372036854775808\n'
+      'repeated_uint64: 18446744073709551615\n'
+      'repeated_double: 123.456\n'
+      'repeated_double: 1.23e+22\n'
+      'repeated_double: 1.23e-18\n'
+      'repeated_string: '
+        '"\\000\\001\\007\\010\\014\\n\\r\\t\\013\\\\\\\'\\""\n'
+      'repeated_string: "\\303\\274\\352\\234\\237"\n')
+
+  def testPrintNestedMessageAsOneLine(self):
+    message = unittest_pb2.TestAllTypes()
+    msg = message.repeated_nested_message.add()
+    msg.bb = 42
+    self.CompareToGoldenText(
+        text_format.MessageToString(message, as_one_line=True),
+        'repeated_nested_message { bb: 42 }')
+
+  def testPrintRepeatedFieldsAsOneLine(self):
+    message = unittest_pb2.TestAllTypes()
+    message.repeated_int32.append(1)
+    message.repeated_int32.append(1)
+    message.repeated_int32.append(3)
+    message.repeated_string.append("Google")
+    message.repeated_string.append("Zurich")
+    self.CompareToGoldenText(
+        text_format.MessageToString(message, as_one_line=True),
+        'repeated_int32: 1 repeated_int32: 1 repeated_int32: 3 '
+        'repeated_string: "Google" repeated_string: "Zurich"')
+
+  def testPrintNestedNewLineInStringAsOneLine(self):
+    message = unittest_pb2.TestAllTypes()
+    message.optional_string = "a\nnew\nline"
+    self.CompareToGoldenText(
+        text_format.MessageToString(message, as_one_line=True),
+        'optional_string: "a\\nnew\\nline"')
+
+  def testPrintMessageSetAsOneLine(self):
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    message.message_set.Extensions[ext1].i = 23
+    message.message_set.Extensions[ext2].str = 'foo'
+    self.CompareToGoldenText(
+        text_format.MessageToString(message, as_one_line=True),
+        'message_set {'
+        ' [protobuf_unittest.TestMessageSetExtension1] {'
+        ' i: 23'
+        ' }'
+        ' [protobuf_unittest.TestMessageSetExtension2] {'
+        ' str: \"foo\"'
+        ' }'
+        ' }')
+
+  def testPrintExoticAsOneLine(self):
+    message = unittest_pb2.TestAllTypes()
+    message.repeated_int64.append(-9223372036854775808)
+    message.repeated_uint64.append(18446744073709551615)
+    message.repeated_double.append(123.456)
+    message.repeated_double.append(1.23e22)
+    message.repeated_double.append(1.23e-18)
+    message.repeated_string.append('\000\001\a\b\f\n\r\t\v\\\'"')
+    message.repeated_string.append(u'\u00fc\ua71f')
+    self.CompareToGoldenText(
+      self.RemoveRedundantZeros(
+          text_format.MessageToString(message, as_one_line=True)),
+      'repeated_int64: -9223372036854775808'
+      ' repeated_uint64: 18446744073709551615'
+      ' repeated_double: 123.456'
+      ' repeated_double: 1.23e+22'
+      ' repeated_double: 1.23e-18'
+      ' repeated_string: '
+      '"\\000\\001\\007\\010\\014\\n\\r\\t\\013\\\\\\\'\\""'
+      ' repeated_string: "\\303\\274\\352\\234\\237"')
+
+  def testRoundTripExoticAsOneLine(self):
+    message = unittest_pb2.TestAllTypes()
+    message.repeated_int64.append(-9223372036854775808)
+    message.repeated_uint64.append(18446744073709551615)
+    message.repeated_double.append(123.456)
+    message.repeated_double.append(1.23e22)
+    message.repeated_double.append(1.23e-18)
+    message.repeated_string.append('\000\001\a\b\f\n\r\t\v\\\'"')
+    message.repeated_string.append(u'\u00fc\ua71f')
+
+    # Test as_utf8 = False.
+    wire_text = text_format.MessageToString(
+        message, as_one_line=True, as_utf8=False)
+    parsed_message = unittest_pb2.TestAllTypes()
+    text_format.Merge(wire_text, parsed_message)
+    self.assertEqual(message, parsed_message)
+
+    # Test as_utf8 = True.
+    wire_text = text_format.MessageToString(
+        message, as_one_line=True, as_utf8=True)
+    parsed_message = unittest_pb2.TestAllTypes()
+    text_format.Merge(wire_text, parsed_message)
+    self.assertEqual(message, parsed_message)
+
+  def testPrintRawUtf8String(self):
+    message = unittest_pb2.TestAllTypes()
+    message.repeated_string.append(u'\u00fc\ua71f')
+    text = text_format.MessageToString(message, as_utf8=True)
+    self.CompareToGoldenText(text, 'repeated_string: "\303\274\352\234\237"\n')
+    parsed_message = unittest_pb2.TestAllTypes()
+    text_format.Merge(text, parsed_message)
+    self.assertEqual(message, parsed_message)
+
+  def testMessageToString(self):
+    message = unittest_pb2.ForeignMessage()
+    message.c = 123
+    self.assertEqual('c: 123\n', str(message))
+
+  def RemoveRedundantZeros(self, text):
+    # Some platforms print 1e+5 as 1e+005.  This is fine, but we need to remove
+    # these zeros in order to match the golden file.
+    text = text.replace('e+0','e+').replace('e+0','e+') \
+               .replace('e-0','e-').replace('e-0','e-')
+    # Floating point fields are printed with a .0 suffix even if they are
+    # actually integer numbers.
+    text = re.compile(r'\.0$', re.MULTILINE).sub('', text)
+    return text
+
+  def testMergeGolden(self):
+    golden_text = '\n'.join(self.ReadGolden('text_format_unittest_data.txt'))
+    parsed_message = unittest_pb2.TestAllTypes()
+    text_format.Merge(golden_text, parsed_message)
+
+    message = unittest_pb2.TestAllTypes()
+    test_util.SetAllFields(message)
+    self.assertEqual(message, parsed_message)
+
+  def testMergeGoldenExtensions(self):
+    golden_text = '\n'.join(self.ReadGolden(
+        'text_format_unittest_extensions_data.txt'))
+    parsed_message = unittest_pb2.TestAllExtensions()
+    text_format.Merge(golden_text, parsed_message)
+
+    message = unittest_pb2.TestAllExtensions()
+    test_util.SetAllExtensions(message)
+    self.assertEqual(message, parsed_message)
+
+  def testMergeAllFields(self):
+    message = unittest_pb2.TestAllTypes()
+    test_util.SetAllFields(message)
+    ascii_text = text_format.MessageToString(message)
+
+    parsed_message = unittest_pb2.TestAllTypes()
+    text_format.Merge(ascii_text, parsed_message)
+    self.assertEqual(message, parsed_message)
+    test_util.ExpectAllFieldsSet(self, message)
+
+  def testMergeAllExtensions(self):
+    message = unittest_pb2.TestAllExtensions()
+    test_util.SetAllExtensions(message)
+    ascii_text = text_format.MessageToString(message)
+
+    parsed_message = unittest_pb2.TestAllExtensions()
+    text_format.Merge(ascii_text, parsed_message)
+    self.assertEqual(message, parsed_message)
+
+  def testMergeMessageSet(self):
+    message = unittest_pb2.TestAllTypes()
+    text = ('repeated_uint64: 1\n'
+            'repeated_uint64: 2\n')
+    text_format.Merge(text, message)
+    self.assertEqual(1, message.repeated_uint64[0])
+    self.assertEqual(2, message.repeated_uint64[1])
+
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    text = ('message_set {\n'
+            '  [protobuf_unittest.TestMessageSetExtension1] {\n'
+            '    i: 23\n'
+            '  }\n'
+            '  [protobuf_unittest.TestMessageSetExtension2] {\n'
+            '    str: \"foo\"\n'
+            '  }\n'
+            '}\n')
+    text_format.Merge(text, message)
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    self.assertEqual(23, message.message_set.Extensions[ext1].i)
+    self.assertEqual('foo', message.message_set.Extensions[ext2].str)
+
+  def testMergeExotic(self):
+    message = unittest_pb2.TestAllTypes()
+    text = ('repeated_int64: -9223372036854775808\n'
+            'repeated_uint64: 18446744073709551615\n'
+            'repeated_double: 123.456\n'
+            'repeated_double: 1.23e+22\n'
+            'repeated_double: 1.23e-18\n'
+            'repeated_string: \n'
+            '"\\000\\001\\007\\010\\014\\n\\r\\t\\013\\\\\\\'\\""\n'
+            'repeated_string: "foo" \'corge\' "grault"\n'
+            'repeated_string: "\\303\\274\\352\\234\\237"\n'
+            'repeated_string: "\\xc3\\xbc"\n'
+            'repeated_string: "\xc3\xbc"\n')
+    text_format.Merge(text, message)
+
+    self.assertEqual(-9223372036854775808, message.repeated_int64[0])
+    self.assertEqual(18446744073709551615, message.repeated_uint64[0])
+    self.assertEqual(123.456, message.repeated_double[0])
+    self.assertEqual(1.23e22, message.repeated_double[1])
+    self.assertEqual(1.23e-18, message.repeated_double[2])
+    self.assertEqual(
+        '\000\001\a\b\f\n\r\t\v\\\'"', message.repeated_string[0])
+    self.assertEqual('foocorgegrault', message.repeated_string[1])
+    self.assertEqual(u'\u00fc\ua71f', message.repeated_string[2])
+    self.assertEqual(u'\u00fc', message.repeated_string[3])
+
+  def testMergeEmptyText(self):
+    message = unittest_pb2.TestAllTypes()
+    text = ''
+    text_format.Merge(text, message)
+    self.assertEqual(unittest_pb2.TestAllTypes(), message)
+
+  def testMergeInvalidUtf8(self):
+    message = unittest_pb2.TestAllTypes()
+    text = 'repeated_string: "\\xc3\\xc3"'
+    self.assertRaises(text_format.ParseError, text_format.Merge, text, message)
+
+  def testMergeSingleWord(self):
+    message = unittest_pb2.TestAllTypes()
+    text = 'foo'
+    self.assertRaisesWithMessage(
+        text_format.ParseError,
+        ('1:1 : Message type "protobuf_unittest.TestAllTypes" has no field named '
+         '"foo".'),
+        text_format.Merge, text, message)
+
+  def testMergeUnknownField(self):
+    message = unittest_pb2.TestAllTypes()
+    text = 'unknown_field: 8\n'
+    self.assertRaisesWithMessage(
+        text_format.ParseError,
+        ('1:1 : Message type "protobuf_unittest.TestAllTypes" has no field named '
+         '"unknown_field".'),
+        text_format.Merge, text, message)
+
+  def testMergeBadExtension(self):
+    message = unittest_pb2.TestAllExtensions()
+    text = '[unknown_extension]: 8\n'
+    self.assertRaises(
+        text_format.ParseError,
+        text_format.Merge, text, message)
+    message = unittest_pb2.TestAllTypes()
+    self.assertRaisesWithMessage(
+        text_format.ParseError,
+        ('1:2 : Message type "protobuf_unittest.TestAllTypes" does not have '
+         'extensions.'),
+        text_format.Merge, text, message)
+
+  def testMergeGroupNotClosed(self):
+    message = unittest_pb2.TestAllTypes()
+    text = 'RepeatedGroup: <'
+    self.assertRaisesWithMessage(
+        text_format.ParseError, '1:16 : Expected ">".',
+        text_format.Merge, text, message)
+
+    text = 'RepeatedGroup: {'
+    self.assertRaisesWithMessage(
+        text_format.ParseError, '1:16 : Expected "}".',
+        text_format.Merge, text, message)
+
+  def testMergeEmptyGroup(self):
+    message = unittest_pb2.TestAllTypes()
+    text = 'OptionalGroup: {}'
+    text_format.Merge(text, message)
+    self.assertTrue(message.HasField('optionalgroup'))
+
+    message.Clear()
+
+    message = unittest_pb2.TestAllTypes()
+    text = 'OptionalGroup: <>'
+    text_format.Merge(text, message)
+    self.assertTrue(message.HasField('optionalgroup'))
+
+  def testMergeBadEnumValue(self):
+    message = unittest_pb2.TestAllTypes()
+    text = 'optional_nested_enum: BARR'
+    self.assertRaisesWithMessage(
+        text_format.ParseError,
+        ('1:23 : Enum type "protobuf_unittest.TestAllTypes.NestedEnum" '
+         'has no value named BARR.'),
+        text_format.Merge, text, message)
+
+    message = unittest_pb2.TestAllTypes()
+    text = 'optional_nested_enum: 100'
+    self.assertRaisesWithMessage(
+        text_format.ParseError,
+        ('1:23 : Enum type "protobuf_unittest.TestAllTypes.NestedEnum" '
+         'has no value with number 100.'),
+        text_format.Merge, text, message)
+
+  def testMergeBadIntValue(self):
+    message = unittest_pb2.TestAllTypes()
+    text = 'optional_int32: bork'
+    self.assertRaisesWithMessage(
+        text_format.ParseError,
+        ('1:17 : Couldn\'t parse integer: bork'),
+        text_format.Merge, text, message)
+
+  def testMergeStringFieldUnescape(self):
+    message = unittest_pb2.TestAllTypes()
+    text = r'''repeated_string: "\xf\x62"
+               repeated_string: "\\xf\\x62"
+               repeated_string: "\\\xf\\\x62"
+               repeated_string: "\\\\xf\\\\x62"
+               repeated_string: "\\\\\xf\\\\\x62"
+               repeated_string: "\x5cx20"'''
+    text_format.Merge(text, message)
+
+    SLASH = '\\'
+    self.assertEqual('\x0fb', message.repeated_string[0])
+    self.assertEqual(SLASH + 'xf' + SLASH + 'x62', message.repeated_string[1])
+    self.assertEqual(SLASH + '\x0f' + SLASH + 'b', message.repeated_string[2])
+    self.assertEqual(SLASH + SLASH + 'xf' + SLASH + SLASH + 'x62',
+                     message.repeated_string[3])
+    self.assertEqual(SLASH + SLASH + '\x0f' + SLASH + SLASH + 'b',
+                     message.repeated_string[4])
+    self.assertEqual(SLASH + 'x20', message.repeated_string[5])
+
+  def assertRaisesWithMessage(self, e_class, e, func, *args, **kwargs):
+    """Same as assertRaises, but also compares the exception message."""
+    if hasattr(e_class, '__name__'):
+      exc_name = e_class.__name__
+    else:
+      exc_name = str(e_class)
+
+    try:
+      func(*args, **kwargs)
+    except e_class as expr:
+      if str(expr) != e:
+        msg = '%s raised, but with wrong message: "%s" instead of "%s"'
+        raise self.failureException(msg % (exc_name,
+                                           str(expr).encode('string_escape'),
+                                           e.encode('string_escape')))
+      return
+    else:
+      raise self.failureException('%s not raised' % exc_name)
+
+
+class TokenizerTest(unittest.TestCase):
+  """
+  def testSimpleTokenCases(self):
+    text = ('identifier1:"string1"\n     \n\n'
+            'identifier2 : \n \n123  \n  identifier3 :\'string\'\n'
+            'identifiER_4 : 1.1e+2 ID5:-0.23 ID6:\'aaaa\\\'bbbb\'\n'
+            'ID7 : "aa\\"bb"\n\n\n\n ID8: {A:inf B:-inf C:true D:false}\n'
+            'ID9: 22 ID10: -111111111111111111 ID11: -22\n'
+            'ID12: 2222222222222222222 ID13: 1.23456f ID14: 1.2e+2f '
+            'false_bool:  0 true_BOOL:t \n true_bool1:  1 false_BOOL1:f ' )
+    tokenizer = text_format._Tokenizer(text)
+    methods = [(tokenizer.ConsumeIdentifier, 'identifier1'),
+               ':',
+               (tokenizer.ConsumeString, 'string1'),
+               (tokenizer.ConsumeIdentifier, 'identifier2'),
+               ':',
+               (tokenizer.ConsumeInt32, 123),
+               (tokenizer.ConsumeIdentifier, 'identifier3'),
+               ':',
+               (tokenizer.ConsumeString, 'string'),
+               (tokenizer.ConsumeIdentifier, 'identifiER_4'),
+               ':',
+               (tokenizer.ConsumeFloat, 1.1e+2),
+               (tokenizer.ConsumeIdentifier, 'ID5'),
+               ':',
+               (tokenizer.ConsumeFloat, -0.23),
+               (tokenizer.ConsumeIdentifier, 'ID6'),
+               ':',
+               (tokenizer.ConsumeString, 'aaaa\'bbbb'),
+               (tokenizer.ConsumeIdentifier, 'ID7'),
+               ':',
+               (tokenizer.ConsumeString, 'aa\"bb'),
+               (tokenizer.ConsumeIdentifier, 'ID8'),
+               ':',
+               '{',
+               (tokenizer.ConsumeIdentifier, 'A'),
+               ':',
+               (tokenizer.ConsumeFloat, float('inf')),
+               (tokenizer.ConsumeIdentifier, 'B'),
+               ':',
+               (tokenizer.ConsumeFloat, -float('inf')),
+               (tokenizer.ConsumeIdentifier, 'C'),
+               ':',
+               (tokenizer.ConsumeBool, True),
+               (tokenizer.ConsumeIdentifier, 'D'),
+               ':',
+               (tokenizer.ConsumeBool, False),
+               '}',
+               (tokenizer.ConsumeIdentifier, 'ID9'),
+               ':',
+               (tokenizer.ConsumeUint32, 22),
+               (tokenizer.ConsumeIdentifier, 'ID10'),
+               ':',
+               (tokenizer.ConsumeInt64, -111111111111111111),
+               (tokenizer.ConsumeIdentifier, 'ID11'),
+               ':',
+               (tokenizer.ConsumeInt32, -22),
+               (tokenizer.ConsumeIdentifier, 'ID12'),
+               ':',
+               (tokenizer.ConsumeUint64, 2222222222222222222),
+               (tokenizer.ConsumeIdentifier, 'ID13'),
+               ':',
+               (tokenizer.ConsumeFloat, 1.23456),
+               (tokenizer.ConsumeIdentifier, 'ID14'),
+               ':',
+               (tokenizer.ConsumeFloat, 1.2e+2),
+               (tokenizer.ConsumeIdentifier, 'false_bool'),
+               ':',
+               (tokenizer.ConsumeBool, False),
+               (tokenizer.ConsumeIdentifier, 'true_BOOL'),
+               ':',
+               (tokenizer.ConsumeBool, True),
+               (tokenizer.ConsumeIdentifier, 'true_bool1'),
+               ':',
+               (tokenizer.ConsumeBool, True),
+               (tokenizer.ConsumeIdentifier, 'false_BOOL1'),
+               ':',
+               (tokenizer.ConsumeBool, False)]
+
+    i = 0
+    while not tokenizer.AtEnd():
+      m = methods[i]
+      if type(m) == str:
+        token = tokenizer.token
+        self.assertEqual(token, m)
+        tokenizer.NextToken()
+      else:
+        self.assertEqual(m[1], m[0]())
+      i += 1
+
+  def testConsumeIntegers(self):
+    # This test only tests the failures in the integer parsing methods as well
+    # as the '0' special cases.
+    int64_max = (1 << 63) - 1
+    uint32_max = (1 << 32) - 1
+    text = '-1 %d %d' % (uint32_max + 1, int64_max + 1)
+    tokenizer = text_format._Tokenizer(text)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeUint32)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeUint64)
+    self.assertEqual(-1, tokenizer.ConsumeInt32())
+
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeUint32)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeInt32)
+    self.assertEqual(uint32_max + 1, tokenizer.ConsumeInt64())
+
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeInt64)
+    self.assertEqual(int64_max + 1, tokenizer.ConsumeUint64())
+    self.assertTrue(tokenizer.AtEnd())
+
+    text = '-0 -0 0 0'
+    tokenizer = text_format._Tokenizer(text)
+    self.assertEqual(0, tokenizer.ConsumeUint32())
+    self.assertEqual(0, tokenizer.ConsumeUint64())
+    self.assertEqual(0, tokenizer.ConsumeUint32())
+    self.assertEqual(0, tokenizer.ConsumeUint64())
+    self.assertTrue(tokenizer.AtEnd())
+    """
+
+  def testConsumeByteString(self):
+    text = '"string1\''
+    tokenizer = text_format._Tokenizer(text)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
+
+    text = 'string1"'
+    tokenizer = text_format._Tokenizer(text)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
+
+    text = '\n"\\xt"'
+    tokenizer = text_format._Tokenizer(text)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
+
+    text = '\n"\\"'
+    tokenizer = text_format._Tokenizer(text)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
+
+    text = '\n"\\x"'
+    tokenizer = text_format._Tokenizer(text)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
+
+  def testConsumeBool(self):
+    text = 'not-a-bool'
+    tokenizer = text_format._Tokenizer(text)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeBool)
+
+
+if __name__ == '__main__':
+  unittest.main()
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_unittest_data.txt b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_unittest_data.txt
new file mode 100644
index 0000000..bbe5882
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_unittest_data.txt
@@ -0,0 +1,128 @@
+optional_int32: 101
+optional_int64: 102
+optional_uint32: 103
+optional_uint64: 104
+optional_sint32: 105
+optional_sint64: 106
+optional_fixed32: 107
+optional_fixed64: 108
+optional_sfixed32: 109
+optional_sfixed64: 110
+optional_float: 111
+optional_double: 112
+optional_bool: true
+optional_string: "115"
+optional_bytes: "116"
+OptionalGroup {
+  a: 117
+}
+optional_nested_message {
+  bb: 118
+}
+optional_foreign_message {
+  c: 119
+}
+optional_import_message {
+  d: 120
+}
+optional_nested_enum: BAZ
+optional_foreign_enum: FOREIGN_BAZ
+optional_import_enum: IMPORT_BAZ
+optional_string_piece: "124"
+optional_cord: "125"
+optional_public_import_message {
+  e: 126
+}
+optional_lazy_message {
+  bb: 127
+}
+repeated_int32: 201
+repeated_int32: 301
+repeated_int64: 202
+repeated_int64: 302
+repeated_uint32: 203
+repeated_uint32: 303
+repeated_uint64: 204
+repeated_uint64: 304
+repeated_sint32: 205
+repeated_sint32: 305
+repeated_sint64: 206
+repeated_sint64: 306
+repeated_fixed32: 207
+repeated_fixed32: 307
+repeated_fixed64: 208
+repeated_fixed64: 308
+repeated_sfixed32: 209
+repeated_sfixed32: 309
+repeated_sfixed64: 210
+repeated_sfixed64: 310
+repeated_float: 211
+repeated_float: 311
+repeated_double: 212
+repeated_double: 312
+repeated_bool: true
+repeated_bool: false
+repeated_string: "215"
+repeated_string: "315"
+repeated_bytes: "216"
+repeated_bytes: "316"
+RepeatedGroup {
+  a: 217
+}
+RepeatedGroup {
+  a: 317
+}
+repeated_nested_message {
+  bb: 218
+}
+repeated_nested_message {
+  bb: 318
+}
+repeated_foreign_message {
+  c: 219
+}
+repeated_foreign_message {
+  c: 319
+}
+repeated_import_message {
+  d: 220
+}
+repeated_import_message {
+  d: 320
+}
+repeated_nested_enum: BAR
+repeated_nested_enum: BAZ
+repeated_foreign_enum: FOREIGN_BAR
+repeated_foreign_enum: FOREIGN_BAZ
+repeated_import_enum: IMPORT_BAR
+repeated_import_enum: IMPORT_BAZ
+repeated_string_piece: "224"
+repeated_string_piece: "324"
+repeated_cord: "225"
+repeated_cord: "325"
+repeated_lazy_message {
+  bb: 227
+}
+repeated_lazy_message {
+  bb: 327
+}
+default_int32: 401
+default_int64: 402
+default_uint32: 403
+default_uint64: 404
+default_sint32: 405
+default_sint64: 406
+default_fixed32: 407
+default_fixed64: 408
+default_sfixed32: 409
+default_sfixed64: 410
+default_float: 411
+default_double: 412
+default_bool: false
+default_string: "415"
+default_bytes: "416"
+default_nested_enum: FOO
+default_foreign_enum: FOREIGN_FOO
+default_import_enum: IMPORT_FOO
+default_string_piece: "424"
+default_cord: "425"
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_unittest_extensions_data.txt b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_unittest_extensions_data.txt
new file mode 100644
index 0000000..0a217f0
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/text_format_unittest_extensions_data.txt
@@ -0,0 +1,128 @@
+[protobuf_unittest.optional_int32_extension]: 101
+[protobuf_unittest.optional_int64_extension]: 102
+[protobuf_unittest.optional_uint32_extension]: 103
+[protobuf_unittest.optional_uint64_extension]: 104
+[protobuf_unittest.optional_sint32_extension]: 105
+[protobuf_unittest.optional_sint64_extension]: 106
+[protobuf_unittest.optional_fixed32_extension]: 107
+[protobuf_unittest.optional_fixed64_extension]: 108
+[protobuf_unittest.optional_sfixed32_extension]: 109
+[protobuf_unittest.optional_sfixed64_extension]: 110
+[protobuf_unittest.optional_float_extension]: 111
+[protobuf_unittest.optional_double_extension]: 112
+[protobuf_unittest.optional_bool_extension]: true
+[protobuf_unittest.optional_string_extension]: "115"
+[protobuf_unittest.optional_bytes_extension]: "116"
+[protobuf_unittest.optionalgroup_extension] {
+  a: 117
+}
+[protobuf_unittest.optional_nested_message_extension] {
+  bb: 118
+}
+[protobuf_unittest.optional_foreign_message_extension] {
+  c: 119
+}
+[protobuf_unittest.optional_import_message_extension] {
+  d: 120
+}
+[protobuf_unittest.optional_nested_enum_extension]: BAZ
+[protobuf_unittest.optional_foreign_enum_extension]: FOREIGN_BAZ
+[protobuf_unittest.optional_import_enum_extension]: IMPORT_BAZ
+[protobuf_unittest.optional_string_piece_extension]: "124"
+[protobuf_unittest.optional_cord_extension]: "125"
+[protobuf_unittest.optional_public_import_message_extension] {
+  e: 126
+}
+[protobuf_unittest.optional_lazy_message_extension] {
+  bb: 127
+}
+[protobuf_unittest.repeated_int32_extension]: 201
+[protobuf_unittest.repeated_int32_extension]: 301
+[protobuf_unittest.repeated_int64_extension]: 202
+[protobuf_unittest.repeated_int64_extension]: 302
+[protobuf_unittest.repeated_uint32_extension]: 203
+[protobuf_unittest.repeated_uint32_extension]: 303
+[protobuf_unittest.repeated_uint64_extension]: 204
+[protobuf_unittest.repeated_uint64_extension]: 304
+[protobuf_unittest.repeated_sint32_extension]: 205
+[protobuf_unittest.repeated_sint32_extension]: 305
+[protobuf_unittest.repeated_sint64_extension]: 206
+[protobuf_unittest.repeated_sint64_extension]: 306
+[protobuf_unittest.repeated_fixed32_extension]: 207
+[protobuf_unittest.repeated_fixed32_extension]: 307
+[protobuf_unittest.repeated_fixed64_extension]: 208
+[protobuf_unittest.repeated_fixed64_extension]: 308
+[protobuf_unittest.repeated_sfixed32_extension]: 209
+[protobuf_unittest.repeated_sfixed32_extension]: 309
+[protobuf_unittest.repeated_sfixed64_extension]: 210
+[protobuf_unittest.repeated_sfixed64_extension]: 310
+[protobuf_unittest.repeated_float_extension]: 211
+[protobuf_unittest.repeated_float_extension]: 311
+[protobuf_unittest.repeated_double_extension]: 212
+[protobuf_unittest.repeated_double_extension]: 312
+[protobuf_unittest.repeated_bool_extension]: true
+[protobuf_unittest.repeated_bool_extension]: false
+[protobuf_unittest.repeated_string_extension]: "215"
+[protobuf_unittest.repeated_string_extension]: "315"
+[protobuf_unittest.repeated_bytes_extension]: "216"
+[protobuf_unittest.repeated_bytes_extension]: "316"
+[protobuf_unittest.repeatedgroup_extension] {
+  a: 217
+}
+[protobuf_unittest.repeatedgroup_extension] {
+  a: 317
+}
+[protobuf_unittest.repeated_nested_message_extension] {
+  bb: 218
+}
+[protobuf_unittest.repeated_nested_message_extension] {
+  bb: 318
+}
+[protobuf_unittest.repeated_foreign_message_extension] {
+  c: 219
+}
+[protobuf_unittest.repeated_foreign_message_extension] {
+  c: 319
+}
+[protobuf_unittest.repeated_import_message_extension] {
+  d: 220
+}
+[protobuf_unittest.repeated_import_message_extension] {
+  d: 320
+}
+[protobuf_unittest.repeated_nested_enum_extension]: BAR
+[protobuf_unittest.repeated_nested_enum_extension]: BAZ
+[protobuf_unittest.repeated_foreign_enum_extension]: FOREIGN_BAR
+[protobuf_unittest.repeated_foreign_enum_extension]: FOREIGN_BAZ
+[protobuf_unittest.repeated_import_enum_extension]: IMPORT_BAR
+[protobuf_unittest.repeated_import_enum_extension]: IMPORT_BAZ
+[protobuf_unittest.repeated_string_piece_extension]: "224"
+[protobuf_unittest.repeated_string_piece_extension]: "324"
+[protobuf_unittest.repeated_cord_extension]: "225"
+[protobuf_unittest.repeated_cord_extension]: "325"
+[protobuf_unittest.repeated_lazy_message_extension] {
+  bb: 227
+}
+[protobuf_unittest.repeated_lazy_message_extension] {
+  bb: 327
+}
+[protobuf_unittest.default_int32_extension]: 401
+[protobuf_unittest.default_int64_extension]: 402
+[protobuf_unittest.default_uint32_extension]: 403
+[protobuf_unittest.default_uint64_extension]: 404
+[protobuf_unittest.default_sint32_extension]: 405
+[protobuf_unittest.default_sint64_extension]: 406
+[protobuf_unittest.default_fixed32_extension]: 407
+[protobuf_unittest.default_fixed64_extension]: 408
+[protobuf_unittest.default_sfixed32_extension]: 409
+[protobuf_unittest.default_sfixed64_extension]: 410
+[protobuf_unittest.default_float_extension]: 411
+[protobuf_unittest.default_double_extension]: 412
+[protobuf_unittest.default_bool_extension]: false
+[protobuf_unittest.default_string_extension]: "415"
+[protobuf_unittest.default_bytes_extension]: "416"
+[protobuf_unittest.default_nested_enum_extension]: FOO
+[protobuf_unittest.default_foreign_enum_extension]: FOREIGN_FOO
+[protobuf_unittest.default_import_enum_extension]: IMPORT_FOO
+[protobuf_unittest.default_string_piece_extension]: "424"
+[protobuf_unittest.default_cord_extension]: "425"
diff --git a/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/wire_format_test.py b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/wire_format_test.py
new file mode 100755
index 0000000..7600778
--- /dev/null
+++ b/python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/wire_format_test.py
@@ -0,0 +1,253 @@
+#! /usr/bin/python
+#
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# http://code.google.com/p/protobuf/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""Test for google.protobuf.internal.wire_format."""
+
+__author__ = 'robinson@google.com (Will Robinson)'
+
+import unittest
+from google.protobuf import message
+from google.protobuf.internal import wire_format
+
+
+class WireFormatTest(unittest.TestCase):
+
+  def testPackTag(self):
+    field_number = 0xabc
+    tag_type = 2
+    self.assertEqual((field_number << 3) | tag_type,
+                     wire_format.PackTag(field_number, tag_type))
+    PackTag = wire_format.PackTag
+    # Number too high.
+    self.assertRaises(message.EncodeError, PackTag, field_number, 6)
+    # Number too low.
+    self.assertRaises(message.EncodeError, PackTag, field_number, -1)
+
+  def testUnpackTag(self):
+    # Test field numbers that will require various varint sizes.
+    for expected_field_number in (1, 15, 16, 2047, 2048):
+      for expected_wire_type in range(6):  # Highest-numbered wiretype is 5.
+        field_number, wire_type = wire_format.UnpackTag(
+            wire_format.PackTag(expected_field_number, expected_wire_type))
+        self.assertEqual(expected_field_number, field_number)
+        self.assertEqual(expected_wire_type, wire_type)
+
+    self.assertRaises(TypeError, wire_format.UnpackTag, None)
+    self.assertRaises(TypeError, wire_format.UnpackTag, 'abc')
+    self.assertRaises(TypeError, wire_format.UnpackTag, 0.0)
+    self.assertRaises(TypeError, wire_format.UnpackTag, object())
+
+  def testZigZagEncode(self):
+    Z = wire_format.ZigZagEncode
+    self.assertEqual(0, Z(0))
+    self.assertEqual(1, Z(-1))
+    self.assertEqual(2, Z(1))
+    self.assertEqual(3, Z(-2))
+    self.assertEqual(4, Z(2))
+    self.assertEqual(0xfffffffe, Z(0x7fffffff))
+    self.assertEqual(0xffffffff, Z(-0x80000000))
+    self.assertEqual(0xfffffffffffffffe, Z(0x7fffffffffffffff))
+    self.assertEqual(0xffffffffffffffff, Z(-0x8000000000000000))
+
+    self.assertRaises(TypeError, Z, None)
+    self.assertRaises(TypeError, Z, 'abcd')
+    self.assertRaises(TypeError, Z, 0.0)
+    self.assertRaises(TypeError, Z, object())
+
+  def testZigZagDecode(self):
+    Z = wire_format.ZigZagDecode
+    self.assertEqual(0, Z(0))
+    self.assertEqual(-1, Z(1))
+    self.assertEqual(1, Z(2))
+    self.assertEqual(-2, Z(3))
+    self.assertEqual(2, Z(4))
+    self.assertEqual(0x7fffffff, Z(0xfffffffe))
+    self.assertEqual(-0x80000000, Z(0xffffffff))
+    self.assertEqual(0x7fffffffffffffff, Z(0xfffffffffffffffe))
+    self.assertEqual(-0x8000000000000000, Z(0xffffffffffffffff))
+
+    self.assertRaises(TypeError, Z, None)
+    self.assertRaises(TypeError, Z, 'abcd')
+    self.assertRaises(TypeError, Z, 0.0)
+    self.assertRaises(TypeError, Z, object())
+
+  def NumericByteSizeTestHelper(self, byte_size_fn, value, expected_value_size):
+    # Use field numbers that cause various byte sizes for the tag information.
+    for field_number, tag_bytes in ((15, 1), (16, 2), (2047, 2), (2048, 3)):
+      expected_size = expected_value_size + tag_bytes
+      actual_size = byte_size_fn(field_number, value)
+      self.assertEqual(expected_size, actual_size,
+                       'byte_size_fn: %s, field_number: %d, value: %r\n'
+                       'Expected: %d, Actual: %d'% (
+          byte_size_fn, field_number, value, expected_size, actual_size))
+
+  def testByteSizeFunctions(self):
+    # Test all numeric *ByteSize() functions.
+    NUMERIC_ARGS = [
+        # Int32ByteSize().
+        [wire_format.Int32ByteSize, 0, 1],
+        [wire_format.Int32ByteSize, 127, 1],
+        [wire_format.Int32ByteSize, 128, 2],
+        [wire_format.Int32ByteSize, -1, 10],
+        # Int64ByteSize().
+        [wire_format.Int64ByteSize, 0, 1],
+        [wire_format.Int64ByteSize, 127, 1],
+        [wire_format.Int64ByteSize, 128, 2],
+        [wire_format.Int64ByteSize, -1, 10],
+        # UInt32ByteSize().
+        [wire_format.UInt32ByteSize, 0, 1],
+        [wire_format.UInt32ByteSize, 127, 1],
+        [wire_format.UInt32ByteSize, 128, 2],
+        [wire_format.UInt32ByteSize, wire_format.UINT32_MAX, 5],
+        # UInt64ByteSize().
+        [wire_format.UInt64ByteSize, 0, 1],
+        [wire_format.UInt64ByteSize, 127, 1],
+        [wire_format.UInt64ByteSize, 128, 2],
+        [wire_format.UInt64ByteSize, wire_format.UINT64_MAX, 10],
+        # SInt32ByteSize().
+        [wire_format.SInt32ByteSize, 0, 1],
+        [wire_format.SInt32ByteSize, -1, 1],
+        [wire_format.SInt32ByteSize, 1, 1],
+        [wire_format.SInt32ByteSize, -63, 1],
+        [wire_format.SInt32ByteSize, 63, 1],
+        [wire_format.SInt32ByteSize, -64, 1],
+        [wire_format.SInt32ByteSize, 64, 2],
+        # SInt64ByteSize().
+        [wire_format.SInt64ByteSize, 0, 1],
+        [wire_format.SInt64ByteSize, -1, 1],
+        [wire_format.SInt64ByteSize, 1, 1],
+        [wire_format.SInt64ByteSize, -63, 1],
+        [wire_format.SInt64ByteSize, 63, 1],
+        [wire_format.SInt64ByteSize, -64, 1],
+        [wire_format.SInt64ByteSize, 64, 2],
+        # Fixed32ByteSize().
+        [wire_format.Fixed32ByteSize, 0, 4],
+        [wire_format.Fixed32ByteSize, wire_format.UINT32_MAX, 4],
+        # Fixed64ByteSize().
+        [wire_format.Fixed64ByteSize, 0, 8],
+        [wire_format.Fixed64ByteSize, wire_format.UINT64_MAX, 8],
+        # SFixed32ByteSize().
+        [wire_format.SFixed32ByteSize, 0, 4],
+        [wire_format.SFixed32ByteSize, wire_format.INT32_MIN, 4],
+        [wire_format.SFixed32ByteSize, wire_format.INT32_MAX, 4],
+        # SFixed64ByteSize().
+        [wire_format.SFixed64ByteSize, 0, 8],
+        [wire_format.SFixed64ByteSize, wire_format.INT64_MIN, 8],
+        [wire_format.SFixed64ByteSize, wire_format.INT64_MAX, 8],
+        # FloatByteSize().
+        [wire_format.FloatByteSize, 0.0, 4],
+        [wire_format.FloatByteSize, 1000000000.0, 4],
+        [wire_format.FloatByteSize, -1000000000.0, 4],
+        # DoubleByteSize().
+        [wire_format.DoubleByteSize, 0.0, 8],
+        [wire_format.DoubleByteSize, 1000000000.0, 8],
+        [wire_format.DoubleByteSize, -1000000000.0, 8],
+        # BoolByteSize().
+        [wire_format.BoolByteSize, False, 1],
+        [wire_format.BoolByteSize, True, 1],
+        # EnumByteSize().
+        [wire_format.EnumByteSize, 0, 1],
+        [wire_format.EnumByteSize, 127, 1],
+        [wire_format.EnumByteSize, 128, 2],
+        [wire_format.EnumByteSize, wire_format.UINT32_MAX, 5],
+        ]
+    for args in NUMERIC_ARGS:
+      self.NumericByteSizeTestHelper(*args)
+
+    # Test strings and bytes.
+    for byte_size_fn in (wire_format.StringByteSize, wire_format.BytesByteSize):
+      # 1 byte for tag, 1 byte for length, 3 bytes for contents.
+      self.assertEqual(5, byte_size_fn(10, 'abc'))
+      # 2 bytes for tag, 1 byte for length, 3 bytes for contents.
+      self.assertEqual(6, byte_size_fn(16, 'abc'))
+      # 2 bytes for tag, 2 bytes for length, 128 bytes for contents.
+      self.assertEqual(132, byte_size_fn(16, 'a' * 128))
+
+    # Test UTF-8 string byte size calculation.
+    # 1 byte for tag, 1 byte for length, 8 bytes for content.
+    self.assertEqual(10, wire_format.StringByteSize(
+        5, unicode('\xd0\xa2\xd0\xb5\xd1\x81\xd1\x82', 'utf-8')))
+
+    class MockMessage(object):
+      def __init__(self, byte_size):
+        self.byte_size = byte_size
+      def ByteSize(self):
+        return self.byte_size
+
+    message_byte_size = 10
+    mock_message = MockMessage(byte_size=message_byte_size)
+    # Test groups.
+    # (2 * 1) bytes for begin and end tags, plus message_byte_size.
+    self.assertEqual(2 + message_byte_size,
+                     wire_format.GroupByteSize(1, mock_message))
+    # (2 * 2) bytes for begin and end tags, plus message_byte_size.
+    self.assertEqual(4 + message_byte_size,
+                     wire_format.GroupByteSize(16, mock_message))
+
+    # Test messages.
+    # 1 byte for tag, plus 1 byte for length, plus contents.
+    self.assertEqual(2 + mock_message.byte_size,
+                     wire_format.MessageByteSize(1, mock_message))
+    # 2 bytes for tag, plus 1 byte for length, plus contents.
+    self.assertEqual(3 + mock_message.byte_size,
+                     wire_format.MessageByteSize(16, mock_message))
+    # 2 bytes for tag, plus 2 bytes for length, plus contents.
+    mock_message.byte_size = 128
+    self.assertEqual(4 + mock_message.byte_size,
+                     wire_format.MessageByteSize(16, mock_message))
+
+
+    # Test message set item byte size.
+    # 4 bytes for tags, plus 1 byte for length, plus 1 byte for type_id,
+    # plus contents.
+    mock_message.byte_size = 10
+    self.assertEqual(mock_message.byte_size + 6,
+                     wire_format.MessageSetItemByteSize(1, mock_message))
+
+    # 4 bytes for tags, plus 2 bytes for length, plus 1 byte for type_id,
+    # plus contents.
+    mock_message.byte_size = 128
+    self.assertEqual(mock_message.byte_size + 7,
+                     wire_format.MessageSetItemByteSize(1, mock_message))
+
+    # 4 bytes for tags, plus 2 bytes for length, plus 2 byte for type_id,
+    # plus contents.
+    self.assertEqual(mock_message.byte_size + 8,
+                     wire_format.MessageSetItemByteSize(128, mock_message))
+
+    # Too-long varint.
+    self.assertRaises(message.EncodeError,
+                      wire_format.UInt64ByteSize, 1, 1 << 128)
+
+
+if __name__ == '__main__':
+  unittest.main()
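The test file above exercises protobuf's tag packing and ZigZag mapping. A minimal standalone sketch of those two primitives (not the protobuf implementation itself, just the arithmetic the assertions check):

```python
def zigzag_encode(n, bits=64):
    """Map a signed int to an unsigned one so small magnitudes stay small.

    Relies on Python's arithmetic right shift for negatives; `bits` masks
    the result to the target width.
    """
    mask = (1 << bits) - 1
    return ((n << 1) ^ (n >> (bits - 1))) & mask


def zigzag_decode(z):
    """Inverse mapping: even z -> z // 2, odd z -> -(z + 1) // 2."""
    return (z >> 1) ^ -(z & 1)


def pack_tag(field_number, wire_type):
    """A wire-format tag is the field number shifted left 3 bits OR'd with the wire type."""
    return (field_number << 3) | wire_type
```

These reproduce the constants asserted in `testZigZagEncode`/`testPackTag` above, e.g. `zigzag_encode(-1) == 1` and `zigzag_encode(0x7fffffffffffffff) == 0xfffffffffffffffe`.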
diff --git a/python/google/__init__.py b/python/google/__init__.py
index de40ea7..5585614 100755
--- a/python/google/__init__.py
+++ b/python/google/__init__.py
@@ -1 +1,4 @@
-__import__('pkg_resources').declare_namespace(__name__)
+try:
+  __import__('pkg_resources').declare_namespace(__name__)
+except ImportError:
+  __path__ = __import__('pkgutil').extend_path(__path__, __name__)
diff --git a/python/google/protobuf/__init__.py b/python/google/protobuf/__init__.py
index 533821c..249e18a 100755
--- a/python/google/protobuf/__init__.py
+++ b/python/google/protobuf/__init__.py
@@ -30,4 +30,10 @@
 
 # Copyright 2007 Google Inc. All Rights Reserved.
 
-__version__ = '3.0.0b2'
+__version__ = '3.6.1'
+
+if __name__ != '__main__':
+  try:
+    __import__('pkg_resources').declare_namespace(__name__)
+  except ImportError:
+    __path__ = __import__('pkgutil').extend_path(__path__, __name__)
diff --git a/python/google/protobuf/compiler/__init__.py b/python/google/protobuf/compiler/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/python/google/protobuf/compiler/__init__.py
diff --git a/python/google/protobuf/descriptor.py b/python/google/protobuf/descriptor.py
index 5f613c8..8a9ba3d 100755
--- a/python/google/protobuf/descriptor.py
+++ b/python/google/protobuf/descriptor.py
@@ -34,6 +34,7 @@
 
 __author__ = 'robinson@google.com (Will Robinson)'
 
+import threading
 import six
 
 from google.protobuf.internal import api_implementation
@@ -41,8 +42,8 @@
 _USE_C_DESCRIPTORS = False
 if api_implementation.Type() == 'cpp':
   # Used by MakeDescriptor in cpp mode
+  import binascii
   import os
-  import uuid
   from google.protobuf.pyext import _message
   _USE_C_DESCRIPTORS = getattr(_message, '_USE_C_DESCRIPTORS', False)
 
@@ -72,6 +73,24 @@
   DescriptorMetaclass = type
 
 
+class _Lock(object):
+  """Wrapper class of threading.Lock(), usable with the 'with' statement."""
+
+  def __new__(cls):
+    self = object.__new__(cls)
+    self._lock = threading.Lock()  # pylint: disable=protected-access
+    return self
+
+  def __enter__(self):
+    self._lock.acquire()
+
+  def __exit__(self, exc_type, exc_value, exc_tb):
+    self._lock.release()
+
+
+_lock = threading.Lock()
+
+
 class DescriptorBase(six.with_metaclass(DescriptorMetaclass)):
 
   """Descriptors base class.
@@ -92,16 +111,17 @@
     # subclasses" of this descriptor class.
     _C_DESCRIPTOR_CLASS = ()
 
-  def __init__(self, options, options_class_name):
+  def __init__(self, options, serialized_options, options_class_name):
     """Initialize the descriptor given its options message and the name of the
     class of the options message. The name of the class is required in case
     the options message is None and has to be created.
     """
     self._options = options
     self._options_class_name = options_class_name
+    self._serialized_options = serialized_options
 
     # Does this descriptor have non-default options?
-    self.has_options = options is not None
+    self.has_options = (options is not None) or (serialized_options is not None)
 
   def _SetOptions(self, options, options_class_name):
     """Sets the descriptor's options
@@ -123,14 +143,23 @@
     """
     if self._options:
       return self._options
+
     from google.protobuf import descriptor_pb2
     try:
-      options_class = getattr(descriptor_pb2, self._options_class_name)
+      options_class = getattr(descriptor_pb2,
+                              self._options_class_name)
     except AttributeError:
       raise RuntimeError('Unknown options class name %s!' %
                          (self._options_class_name))
-    self._options = options_class()
-    return self._options
+
+    with _lock:
+      if self._serialized_options is None:
+        self._options = options_class()
+      else:
+        self._options = _ParseOptions(options_class(),
+                                      self._serialized_options)
+
+      return self._options
 
 
 class _NestedDescriptorBase(DescriptorBase):
@@ -138,7 +167,7 @@
 
   def __init__(self, options, options_class_name, name, full_name,
                file, containing_type, serialized_start=None,
-               serialized_end=None):
+               serialized_end=None, serialized_options=None):
     """Constructor.
 
     Args:
@@ -157,9 +186,10 @@
         file.serialized_pb that describes this descriptor.
       serialized_end: The end index (exclusive) in block in the
         file.serialized_pb that describes this descriptor.
+      serialized_options: Serialized protocol message options, or None.
     """
     super(_NestedDescriptorBase, self).__init__(
-        options, options_class_name)
+        options, serialized_options, options_class_name)
 
     self.name = name
     # TODO(falk): Add function to calculate full_name instead of having it in
@@ -171,13 +201,6 @@
     self._serialized_start = serialized_start
     self._serialized_end = serialized_end
 
-  def GetTopLevelContainingType(self):
-    """Returns the root if this is a nested type, or itself if its the root."""
-    desc = self
-    while desc.containing_type is not None:
-      desc = desc.containing_type
-    return desc
-
   def CopyToProto(self, proto):
     """Copies this to the matching proto in descriptor_pb2.
 
@@ -257,8 +280,9 @@
 
     def __new__(cls, name, full_name, filename, containing_type, fields,
                 nested_types, enum_types, extensions, options=None,
+                serialized_options=None,
                 is_extendable=True, extension_ranges=None, oneofs=None,
-                file=None, serialized_start=None, serialized_end=None,
+                file=None, serialized_start=None, serialized_end=None,  # pylint: disable=redefined-builtin
                 syntax=None):
       _message.Message._CheckCalledFromGeneratedFile()
       return _message.default_pool.FindMessageTypeByName(full_name)
@@ -268,9 +292,10 @@
   # name of the argument.
   def __init__(self, name, full_name, filename, containing_type, fields,
                nested_types, enum_types, extensions, options=None,
+               serialized_options=None,
                is_extendable=True, extension_ranges=None, oneofs=None,
-               file=None, serialized_start=None, serialized_end=None,
-               syntax=None):  # pylint:disable=redefined-builtin
+               file=None, serialized_start=None, serialized_end=None,  # pylint: disable=redefined-builtin
+               syntax=None):
     """Arguments to __init__() are as described in the description
     of Descriptor fields above.
 
@@ -280,7 +305,7 @@
     super(Descriptor, self).__init__(
         options, 'MessageOptions', name, full_name, file,
         containing_type, serialized_start=serialized_start,
-        serialized_end=serialized_end)
+        serialized_end=serialized_end, serialized_options=serialized_options)
 
     # We have fields in addition to fields_by_name and fields_by_number,
     # so that:
@@ -349,7 +374,7 @@
     Args:
       proto: An empty descriptor_pb2.DescriptorProto.
     """
-    # This function is overriden to give a better doc comment.
+    # This function is overridden to give a better doc comment.
     super(Descriptor, self).CopyToProto(proto)
 
 
@@ -413,6 +438,8 @@
 
     containing_oneof: (OneofDescriptor) If the field is a member of a oneof
       union, contains its descriptor. Otherwise, None.
+
+    file: (FileDescriptor) Reference to file descriptor.
   """
 
   # Must be consistent with C++ FieldDescriptor::Type enum in
@@ -497,7 +524,9 @@
     def __new__(cls, name, full_name, index, number, type, cpp_type, label,
                 default_value, message_type, enum_type, containing_type,
                 is_extension, extension_scope, options=None,
-                has_default_value=True, containing_oneof=None):
+                serialized_options=None,
+                has_default_value=True, containing_oneof=None, json_name=None,
+                file=None):  # pylint: disable=redefined-builtin
       _message.Message._CheckCalledFromGeneratedFile()
       if is_extension:
         return _message.default_pool.FindExtensionByName(full_name)
@@ -507,7 +536,9 @@
   def __init__(self, name, full_name, index, number, type, cpp_type, label,
                default_value, message_type, enum_type, containing_type,
                is_extension, extension_scope, options=None,
-               has_default_value=True, containing_oneof=None):
+               serialized_options=None,
+               has_default_value=True, containing_oneof=None, json_name=None,
+               file=None):  # pylint: disable=redefined-builtin
     """The arguments are as described in the description of FieldDescriptor
     attributes above.
 
@@ -515,10 +546,16 @@
     (to deal with circular references between message types, for example).
     Likewise for extension_scope.
     """
-    super(FieldDescriptor, self).__init__(options, 'FieldOptions')
+    super(FieldDescriptor, self).__init__(
+        options, serialized_options, 'FieldOptions')
     self.name = name
     self.full_name = full_name
+    self.file = file
     self._camelcase_name = None
+    if json_name is None:
+      self.json_name = _ToJsonName(name)
+    else:
+      self.json_name = json_name
     self.index = index
     self.number = number
     self.type = type
@@ -596,13 +633,15 @@
     _C_DESCRIPTOR_CLASS = _message.EnumDescriptor
 
     def __new__(cls, name, full_name, filename, values,
-                containing_type=None, options=None, file=None,
+                containing_type=None, options=None,
+                serialized_options=None, file=None,  # pylint: disable=redefined-builtin
                 serialized_start=None, serialized_end=None):
       _message.Message._CheckCalledFromGeneratedFile()
       return _message.default_pool.FindEnumTypeByName(full_name)
 
   def __init__(self, name, full_name, filename, values,
-               containing_type=None, options=None, file=None,
+               containing_type=None, options=None,
+               serialized_options=None, file=None,  # pylint: disable=redefined-builtin
                serialized_start=None, serialized_end=None):
     """Arguments are as described in the attribute description above.
 
@@ -612,7 +651,7 @@
     super(EnumDescriptor, self).__init__(
         options, 'EnumOptions', name, full_name, file,
         containing_type, serialized_start=serialized_start,
-        serialized_end=serialized_end)
+        serialized_end=serialized_end, serialized_options=serialized_options)
 
     self.values = values
     for value in self.values:
@@ -626,7 +665,7 @@
     Args:
       proto: An empty descriptor_pb2.EnumDescriptorProto.
     """
-    # This function is overriden to give a better doc comment.
+    # This function is overridden to give a better doc comment.
     super(EnumDescriptor, self).CopyToProto(proto)
 
 
@@ -648,7 +687,9 @@
   if _USE_C_DESCRIPTORS:
     _C_DESCRIPTOR_CLASS = _message.EnumValueDescriptor
 
-    def __new__(cls, name, index, number, type=None, options=None):
+    def __new__(cls, name, index, number,
+                type=None,  # pylint: disable=redefined-builtin
+                options=None, serialized_options=None):
       _message.Message._CheckCalledFromGeneratedFile()
       # There is no way we can build a complete EnumValueDescriptor with the
       # given parameters (the name of the Enum is not known, for example).
@@ -656,16 +697,19 @@
       # constructor, which will ignore it, so returning None is good enough.
       return None
 
-  def __init__(self, name, index, number, type=None, options=None):
+  def __init__(self, name, index, number,
+               type=None,  # pylint: disable=redefined-builtin
+               options=None, serialized_options=None):
     """Arguments are as described in the attribute description above."""
-    super(EnumValueDescriptor, self).__init__(options, 'EnumValueOptions')
+    super(EnumValueDescriptor, self).__init__(
+        options, serialized_options, 'EnumValueOptions')
     self.name = name
     self.index = index
     self.number = number
     self.type = type
 
 
-class OneofDescriptor(object):
+class OneofDescriptor(DescriptorBase):
   """Descriptor for a oneof field.
 
     name: (str) Name of the oneof field.
@@ -682,12 +726,18 @@
   if _USE_C_DESCRIPTORS:
     _C_DESCRIPTOR_CLASS = _message.OneofDescriptor
 
-    def __new__(cls, name, full_name, index, containing_type, fields):
+    def __new__(
+        cls, name, full_name, index, containing_type, fields, options=None,
+        serialized_options=None):
       _message.Message._CheckCalledFromGeneratedFile()
       return _message.default_pool.FindOneofByName(full_name)
 
-  def __init__(self, name, full_name, index, containing_type, fields):
+  def __init__(
+      self, name, full_name, index, containing_type, fields, options=None,
+      serialized_options=None):
     """Arguments are as described in the attribute description above."""
+    super(OneofDescriptor, self).__init__(
+        options, serialized_options, 'OneofOptions')
     self.name = name
     self.full_name = full_name
     self.index = index
@@ -705,29 +755,40 @@
       definition appears withing the .proto file.
     methods: (list of MethodDescriptor) List of methods provided by this
       service.
+    methods_by_name: (dict str -> MethodDescriptor) Same MethodDescriptor
+      objects as in |methods|, but indexed by "name" attribute in each
+      MethodDescriptor.
     options: (descriptor_pb2.ServiceOptions) Service options message or
       None to use default service options.
     file: (FileDescriptor) Reference to file info.
   """
 
-  def __init__(self, name, full_name, index, methods, options=None, file=None,
+  if _USE_C_DESCRIPTORS:
+    _C_DESCRIPTOR_CLASS = _message.ServiceDescriptor
+
+    def __new__(cls, name, full_name, index, methods, options=None,
+                serialized_options=None, file=None,  # pylint: disable=redefined-builtin
+                serialized_start=None, serialized_end=None):
+      _message.Message._CheckCalledFromGeneratedFile()  # pylint: disable=protected-access
+      return _message.default_pool.FindServiceByName(full_name)
+
+  def __init__(self, name, full_name, index, methods, options=None,
+               serialized_options=None, file=None,  # pylint: disable=redefined-builtin
                serialized_start=None, serialized_end=None):
     super(ServiceDescriptor, self).__init__(
         options, 'ServiceOptions', name, full_name, file,
         None, serialized_start=serialized_start,
-        serialized_end=serialized_end)
+        serialized_end=serialized_end, serialized_options=serialized_options)
     self.index = index
     self.methods = methods
+    self.methods_by_name = dict((m.name, m) for m in methods)
     # Set the containing service for each method in this service.
     for method in self.methods:
       method.containing_service = self
 
   def FindMethodByName(self, name):
     """Searches for the specified method, and returns its descriptor."""
-    for method in self.methods:
-      if name == method.name:
-        return method
-    return None
+    return self.methods_by_name.get(name, None)
 
   def CopyToProto(self, proto):
     """Copies this to a descriptor_pb2.ServiceDescriptorProto.
@@ -735,7 +796,7 @@
     Args:
       proto: An empty descriptor_pb2.ServiceDescriptorProto.
     """
-    # This function is overriden to give a better doc comment.
+    # This function is overridden to give a better doc comment.
     super(ServiceDescriptor, self).CopyToProto(proto)
 
 
@@ -754,14 +815,23 @@
     None to use default method options.
   """
 
+  if _USE_C_DESCRIPTORS:
+    _C_DESCRIPTOR_CLASS = _message.MethodDescriptor
+
+    def __new__(cls, name, full_name, index, containing_service,
+                input_type, output_type, options=None, serialized_options=None):
+      _message.Message._CheckCalledFromGeneratedFile()  # pylint: disable=protected-access
+      return _message.default_pool.FindMethodByName(full_name)
+
   def __init__(self, name, full_name, index, containing_service,
-               input_type, output_type, options=None):
+               input_type, output_type, options=None, serialized_options=None):
     """The arguments are as described in the description of MethodDescriptor
     attributes above.
 
     Note that containing_service may be None, and may be set later if necessary.
     """
-    super(MethodDescriptor, self).__init__(options, 'MethodOptions')
+    super(MethodDescriptor, self).__init__(
+        options, serialized_options, 'MethodOptions')
     self.name = name
     self.full_name = full_name
     self.index = index
@@ -783,9 +853,12 @@
   serialized_pb: (str) Byte string of serialized
     descriptor_pb2.FileDescriptorProto.
   dependencies: List of other FileDescriptors this FileDescriptor depends on.
+  public_dependencies: A list of FileDescriptors, subset of the dependencies
+    above, which were declared as "public".
   message_types_by_name: Dict of message names of their descriptors.
   enum_types_by_name: Dict of enum names and their descriptors.
   extensions_by_name: Dict of extension names and their descriptors.
+  services_by_name: Dict of service names and their descriptors.
   pool: the DescriptorPool this descriptor belongs to.  When not passed to the
     constructor, the global default pool is used.
   """
@@ -793,8 +866,10 @@
   if _USE_C_DESCRIPTORS:
     _C_DESCRIPTOR_CLASS = _message.FileDescriptor
 
-    def __new__(cls, name, package, options=None, serialized_pb=None,
-                dependencies=None, syntax=None, pool=None):
+    def __new__(cls, name, package, options=None,
+                serialized_options=None, serialized_pb=None,
+                dependencies=None, public_dependencies=None,
+                syntax=None, pool=None):
       # FileDescriptor() is called from various places, not only from generated
       # files, to register dynamic proto files and messages.
       if serialized_pb:
@@ -804,10 +879,13 @@
       else:
         return super(FileDescriptor, cls).__new__(cls)
 
-  def __init__(self, name, package, options=None, serialized_pb=None,
-               dependencies=None, syntax=None, pool=None):
+  def __init__(self, name, package, options=None,
+               serialized_options=None, serialized_pb=None,
+               dependencies=None, public_dependencies=None,
+               syntax=None, pool=None):
     """Constructor."""
-    super(FileDescriptor, self).__init__(options, 'FileOptions')
+    super(FileDescriptor, self).__init__(
+        options, serialized_options, 'FileOptions')
 
     if pool is None:
       from google.protobuf import descriptor_pool
@@ -821,7 +899,9 @@
 
     self.enum_types_by_name = {}
     self.extensions_by_name = {}
+    self.services_by_name = {}
     self.dependencies = (dependencies or [])
+    self.public_dependencies = (public_dependencies or [])
 
     if (api_implementation.Type() == 'cpp' and
         self.serialized_pb is not None):
@@ -867,6 +947,31 @@
   return ''.join(result)
 
 
+def _OptionsOrNone(descriptor_proto):
+  """Returns the value of the field `options`, or None if it is not set."""
+  if descriptor_proto.HasField('options'):
+    return descriptor_proto.options
+  else:
+    return None
+
+
+def _ToJsonName(name):
+  """Converts name to Json name and returns it."""
+  capitalize_next = False
+  result = []
+
+  for c in name:
+    if c == '_':
+      capitalize_next = True
+    elif capitalize_next:
+      result.append(c.upper())
+      capitalize_next = False
+    else:
+      result += c
+
+  return ''.join(result)
+
+
 def MakeDescriptor(desc_proto, package='', build_file_if_cpp=True,
                    syntax=None):
   """Make a protobuf Descriptor given a DescriptorProto protobuf.
@@ -898,7 +1003,7 @@
     # imported ones. We need to specify a file name so the descriptor pool
     # accepts our FileDescriptorProto, but it is not important what that file
     # name is actually set to.
-    proto_name = str(uuid.uuid4())
+    proto_name = binascii.hexlify(os.urandom(16)).decode('ascii')
 
     if package:
       file_descriptor_proto.name = os.path.join(package.replace('.', '/'),
@@ -943,6 +1048,10 @@
     full_name = '.'.join(full_message_name + [field_proto.name])
     enum_desc = None
     nested_desc = None
+    if field_proto.json_name:
+      json_name = field_proto.json_name
+    else:
+      json_name = None
     if field_proto.HasField('type_name'):
       type_name = field_proto.type_name
       full_type_name = '.'.join(full_message_name +
@@ -957,10 +1066,11 @@
         field_proto.number, field_proto.type,
         FieldDescriptor.ProtoTypeToCppProtoType(field_proto.type),
         field_proto.label, None, nested_desc, enum_desc, None, False, None,
-        options=field_proto.options, has_default_value=False)
+        options=_OptionsOrNone(field_proto), has_default_value=False,
+        json_name=json_name)
     fields.append(field)
 
   desc_name = '.'.join(full_message_name)
   return Descriptor(desc_proto.name, desc_name, None, None, fields,
                     list(nested_types.values()), list(enum_types.values()), [],
-                    options=desc_proto.options)
+                    options=_OptionsOrNone(desc_proto))
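The `_ToJsonName` helper added to descriptor.py converts a proto field name to its JSON name (snake_case to lowerCamelCase). A self-contained sketch of the same loop, to show the conversion it performs:

```python
def to_json_name(name):
    """snake_case -> lowerCamelCase, mirroring the diff's _ToJsonName loop."""
    capitalize_next = False
    result = []
    for c in name:
        if c == '_':
            # Underscores are dropped; the next character is capitalized.
            capitalize_next = True
        elif capitalize_next:
            result.append(c.upper())
            capitalize_next = False
        else:
            result.append(c)
    return ''.join(result)
```

For example, `to_json_name('foo_bar_baz')` yields `'fooBarBaz'`; a field with an explicit `json_name` in its FieldDescriptorProto bypasses this conversion, as the `MakeDescriptor` hunk above shows.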
diff --git a/python/google/protobuf/descriptor_database.py b/python/google/protobuf/descriptor_database.py
index 1333f99..8b7715c 100644
--- a/python/google/protobuf/descriptor_database.py
+++ b/python/google/protobuf/descriptor_database.py
@@ -32,6 +32,8 @@
 
 __author__ = 'matthewtoia@google.com (Matt Toia)'
 
+import warnings
+
 
 class Error(Exception):
   pass
@@ -54,9 +56,9 @@
     Args:
       file_desc_proto: The FileDescriptorProto to add.
     Raises:
-      DescriptorDatabaseException: if an attempt is made to add a proto
-        with the same name but different definition than an exisiting
-        proto in the database.
+      DescriptorDatabaseConflictingDefinitionError: if an attempt is made to
+        add a proto with the same name but different definition than an
+        existing proto in the database.
     """
     proto_name = file_desc_proto.name
     if proto_name not in self._file_desc_protos_by_file:
@@ -64,18 +66,20 @@
     elif self._file_desc_protos_by_file[proto_name] != file_desc_proto:
       raise DescriptorDatabaseConflictingDefinitionError(
           '%s already added, but with different descriptor.' % proto_name)
+    else:
+      return
 
-    # Add the top-level Message, Enum and Extension descriptors to the index.
+    # Add all the top-level descriptors to the index.
     package = file_desc_proto.package
     for message in file_desc_proto.message_type:
-      self._file_desc_protos_by_symbol.update(
-          (name, file_desc_proto) for name in _ExtractSymbols(message, package))
+      for name in _ExtractSymbols(message, package):
+        self._AddSymbol(name, file_desc_proto)
     for enum in file_desc_proto.enum_type:
-      self._file_desc_protos_by_symbol[
-          '.'.join((package, enum.name))] = file_desc_proto
+      self._AddSymbol(('.'.join((package, enum.name))), file_desc_proto)
     for extension in file_desc_proto.extension:
-      self._file_desc_protos_by_symbol[
-          '.'.join((package, extension.name))] = file_desc_proto
+      self._AddSymbol(('.'.join((package, extension.name))), file_desc_proto)
+    for service in file_desc_proto.service:
+      self._AddSymbol(('.'.join((package, service.name))), file_desc_proto)
 
   def FindFileByName(self, name):
     """Finds the file descriptor proto by file name.
@@ -104,6 +108,7 @@
 
     'some.package.name.Message'
     'some.package.name.Message.NestedEnum'
+    'some.package.name.Message.some_field'
 
     The file descriptor proto containing the specified symbol must be added to
     this database using the Add method or else an error will be raised.
@@ -117,8 +122,25 @@
     Raises:
       KeyError if no file contains the specified symbol.
     """
+    try:
+      return self._file_desc_protos_by_symbol[symbol]
+    except KeyError:
+      # Fields, enum values, and nested extensions are not in
+      # _file_desc_protos_by_symbol. Try to find the top level
+      # descriptor. Non-existent nested symbol under a valid top level
+      # descriptor can also be found. The behavior is the same with
+      # protobuf C++.
+      top_level, _, _ = symbol.rpartition('.')
+      return self._file_desc_protos_by_symbol[top_level]
 
-    return self._file_desc_protos_by_symbol[symbol]
+  def _AddSymbol(self, name, file_desc_proto):
+    if name in self._file_desc_protos_by_symbol:
+      warn_msg = ('Conflict register for file "' + file_desc_proto.name +
+                  '": ' + name +
+                  ' is already defined in file "' +
+                  self._file_desc_protos_by_symbol[name].name + '"')
+      warnings.warn(warn_msg, RuntimeWarning)
+    self._file_desc_protos_by_symbol[name] = file_desc_proto
 
 
 def _ExtractSymbols(desc_proto, package):
@@ -131,8 +153,7 @@
   Yields:
     The fully qualified name found in the descriptor.
   """
-
-  message_name = '.'.join((package, desc_proto.name))
+  message_name = package + '.' + desc_proto.name if package else desc_proto.name
   yield message_name
   for nested_type in desc_proto.nested_type:
     for symbol in _ExtractSymbols(nested_type, message_name):
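The new `_AddSymbol` warning and the `FindFileContainingSymbol` fallback can be sketched independently of the descriptor types. The class and method names below are illustrative, not the real `DescriptorDatabase` API:

```python
import warnings


class SymbolIndex:
    """Sketch of the symbol index behavior added in the hunk above."""

    def __init__(self):
        self._by_symbol = {}

    def add_symbol(self, name, file_name):
        # Mirror _AddSymbol: warn on a conflicting registration, then overwrite.
        if name in self._by_symbol:
            warnings.warn(
                '%s is already defined in file "%s"' % (name, self._by_symbol[name]),
                RuntimeWarning)
        self._by_symbol[name] = file_name

    def find_file_containing_symbol(self, symbol):
        try:
            return self._by_symbol[symbol]
        except KeyError:
            # Fields, enum values, and nested extensions are not indexed
            # directly; fall back to the enclosing top-level symbol, matching
            # the C++ runtime's behavior.
            top_level, _, _ = symbol.rpartition('.')
            return self._by_symbol[top_level]


idx = SymbolIndex()
idx.add_symbol('some.package.Message', 'foo.proto')
print(idx.find_file_containing_symbol('some.package.Message.some_field'))  # foo.proto
```

Note the fallback also succeeds for a non-existent nested name under a valid top-level symbol, which the comment in the hunk above calls out as intentional.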
diff --git a/python/google/protobuf/descriptor_pool.py b/python/google/protobuf/descriptor_pool.py
index 3e80795..8983f76 100644
--- a/python/google/protobuf/descriptor_pool.py
+++ b/python/google/protobuf/descriptor_pool.py
@@ -57,12 +57,15 @@
 
 __author__ = 'matthewtoia@google.com (Matt Toia)'
 
+import collections
+import warnings
+
 from google.protobuf import descriptor
 from google.protobuf import descriptor_database
 from google.protobuf import text_encoding
 
 
-_USE_C_DESCRIPTORS = descriptor._USE_C_DESCRIPTORS
+_USE_C_DESCRIPTORS = descriptor._USE_C_DESCRIPTORS  # pylint: disable=protected-access
 
 
 def _NormalizeFullyQualifiedName(name):
@@ -80,6 +83,22 @@
   return name.lstrip('.')
 
 
+def _OptionsOrNone(descriptor_proto):
+  """Returns the value of the field `options`, or None if it is not set."""
+  if descriptor_proto.HasField('options'):
+    return descriptor_proto.options
+  else:
+    return None
+
+
+def _IsMessageSetExtension(field):
+  return (field.is_extension and
+          field.containing_type.has_options and
+          field.containing_type.GetOptions().message_set_wire_format and
+          field.type == descriptor.FieldDescriptor.TYPE_MESSAGE and
+          field.label == descriptor.FieldDescriptor.LABEL_OPTIONAL)
+
+
 class DescriptorPool(object):
   """A collection of protobufs dynamically constructed by descriptor protos."""
 
@@ -106,7 +125,40 @@
     self._descriptor_db = descriptor_db
     self._descriptors = {}
     self._enum_descriptors = {}
+    self._service_descriptors = {}
     self._file_descriptors = {}
+    self._toplevel_extensions = {}
+    # TODO(jieluo): Remove _file_desc_by_toplevel_extension after
+    # maybe 2020; it is kept only for compatibility with 3.4.1.
+    self._file_desc_by_toplevel_extension = {}
+    # We store extensions in two two-level mappings: The first key is the
+    # descriptor of the message being extended, the second key is the extension
+    # full name or its tag number.
+    self._extensions_by_name = collections.defaultdict(dict)
+    self._extensions_by_number = collections.defaultdict(dict)
+
+  def _CheckConflictRegister(self, desc):
+    """Check if the descriptor name conflicts with another of the same name.
+
+    Args:
+      desc: Descriptor of a message, enum, service or extension.
+    """
+    desc_name = desc.full_name
+    for register, descriptor_type in [
+        (self._descriptors, descriptor.Descriptor),
+        (self._enum_descriptors, descriptor.EnumDescriptor),
+        (self._service_descriptors, descriptor.ServiceDescriptor),
+        (self._toplevel_extensions, descriptor.FieldDescriptor)]:
+      if desc_name in register:
+        file_name = register[desc_name].file.name
+        if not isinstance(desc, descriptor_type) or (
+            file_name != desc.file.name):
+          warn_msg = ('Conflict register for file "' + desc.file.name +
+                      '": ' + desc_name +
+                      ' is already defined in file "' +
+                      file_name + '"')
+          warnings.warn(warn_msg, RuntimeWarning)
+        return
 
   def Add(self, file_desc_proto):
     """Adds the FileDescriptorProto and its types to this pool.
@@ -144,13 +196,15 @@
     if not isinstance(desc, descriptor.Descriptor):
       raise TypeError('Expected instance of descriptor.Descriptor.')
 
+    self._CheckConflictRegister(desc)
+
     self._descriptors[desc.full_name] = desc
-    self.AddFileDescriptor(desc.file)
+    self._AddFileDescriptor(desc.file)
 
   def AddEnumDescriptor(self, enum_desc):
     """Adds an EnumDescriptor to the pool.
 
-    This method also registers the FileDescriptor associated with the message.
+    This method also registers the FileDescriptor associated with the enum.
 
     Args:
       enum_desc: An EnumDescriptor.
@@ -159,8 +213,65 @@
     if not isinstance(enum_desc, descriptor.EnumDescriptor):
       raise TypeError('Expected instance of descriptor.EnumDescriptor.')
 
+    self._CheckConflictRegister(enum_desc)
     self._enum_descriptors[enum_desc.full_name] = enum_desc
-    self.AddFileDescriptor(enum_desc.file)
+    self._AddFileDescriptor(enum_desc.file)
+
+  def AddServiceDescriptor(self, service_desc):
+    """Adds a ServiceDescriptor to the pool.
+
+    Args:
+      service_desc: A ServiceDescriptor.
+    """
+
+    if not isinstance(service_desc, descriptor.ServiceDescriptor):
+      raise TypeError('Expected instance of descriptor.ServiceDescriptor.')
+
+    self._CheckConflictRegister(service_desc)
+    self._service_descriptors[service_desc.full_name] = service_desc
+
+  def AddExtensionDescriptor(self, extension):
+    """Adds a FieldDescriptor describing an extension to the pool.
+
+    Args:
+      extension: A FieldDescriptor.
+
+    Raises:
+      AssertionError: when another extension with the same number extends the
+        same message.
+      TypeError: when the specified extension is not a
+        descriptor.FieldDescriptor.
+    """
+    if not (isinstance(extension, descriptor.FieldDescriptor) and
+            extension.is_extension):
+      raise TypeError('Expected an extension descriptor.')
+
+    if extension.extension_scope is None:
+      self._CheckConflictRegister(extension)
+      self._toplevel_extensions[extension.full_name] = extension
+
+    try:
+      existing_desc = self._extensions_by_number[
+          extension.containing_type][extension.number]
+    except KeyError:
+      pass
+    else:
+      if extension is not existing_desc:
+        raise AssertionError(
+            'Extensions "%s" and "%s" both try to extend message type "%s" '
+            'with field number %d.' %
+            (extension.full_name, existing_desc.full_name,
+             extension.containing_type.full_name, extension.number))
+
+    self._extensions_by_number[extension.containing_type][
+        extension.number] = extension
+    self._extensions_by_name[extension.containing_type][
+        extension.full_name] = extension
+
+    # Also register MessageSet extensions with the type name.
+    if _IsMessageSetExtension(extension):
+      self._extensions_by_name[extension.containing_type][
+          extension.message_type.full_name] = extension
 
   def AddFileDescriptor(self, file_desc):
     """Adds a FileDescriptor to the pool, non-recursively.
@@ -172,6 +283,24 @@
       file_desc: A FileDescriptor.
     """
 
+    self._AddFileDescriptor(file_desc)
+    # TODO(jieluo): This is a temporary solution for FieldDescriptor.file.
+    # FieldDescriptor.file is added in code gen. Remove this solution after
+    # maybe 2020; it is kept only for compatibility with 3.4.1.
+    for extension in file_desc.extensions_by_name.values():
+      self._file_desc_by_toplevel_extension[
+          extension.full_name] = file_desc
+
+  def _AddFileDescriptor(self, file_desc):
+    """Adds a FileDescriptor to the pool, non-recursively.
+
+    If the FileDescriptor contains messages or enums, the caller must explicitly
+    register them.
+
+    Args:
+      file_desc: A FileDescriptor.
+    """
+
     if not isinstance(file_desc, descriptor.FileDescriptor):
       raise TypeError('Expected instance of descriptor.FileDescriptor.')
     self._file_descriptors[file_desc.name] = file_desc
@@ -186,7 +315,7 @@
       A FileDescriptor for the named file.
 
     Raises:
-      KeyError: if the file can not be found in the pool.
+      KeyError: if the file cannot be found in the pool.
     """
 
     try:
@@ -215,7 +344,7 @@
       A FileDescriptor that contains the specified symbol.
 
     Raises:
-      KeyError: if the file can not be found in the pool.
+      KeyError: if the file cannot be found in the pool.
     """
 
     symbol = _NormalizeFullyQualifiedName(symbol)
@@ -230,15 +359,28 @@
       pass
 
     try:
-      file_proto = self._internal_db.FindFileContainingSymbol(symbol)
-    except KeyError as error:
-      if self._descriptor_db:
-        file_proto = self._descriptor_db.FindFileContainingSymbol(symbol)
-      else:
-        raise error
-    if not file_proto:
+      return self._service_descriptors[symbol].file
+    except KeyError:
+      pass
+
+    try:
+      return self._FindFileContainingSymbolInDb(symbol)
+    except KeyError:
+      pass
+
+    try:
+      return self._file_desc_by_toplevel_extension[symbol]
+    except KeyError:
+      pass
+
+    # Try nested extensions inside a message.
+    message_name, _, extension_name = symbol.rpartition('.')
+    try:
+      message = self.FindMessageTypeByName(message_name)
+      assert message.extensions_by_name[extension_name]
+      return message.file
+    except KeyError:
       raise KeyError('Cannot find a file containing %s' % symbol)
-    return self._ConvertFileProtoToFileDescriptor(file_proto)
 
   def FindMessageTypeByName(self, full_name):
     """Loads the named descriptor from the pool.
@@ -248,11 +390,14 @@
 
     Returns:
       The descriptor for the named type.
+
+    Raises:
+      KeyError: if the message cannot be found in the pool.
     """
 
     full_name = _NormalizeFullyQualifiedName(full_name)
     if full_name not in self._descriptors:
-      self.FindFileContainingSymbol(full_name)
+      self._FindFileContainingSymbolInDb(full_name)
     return self._descriptors[full_name]
 
   def FindEnumTypeByName(self, full_name):
@@ -263,11 +408,14 @@
 
     Returns:
       The enum descriptor for the named type.
+
+    Raises:
+      KeyError: if the enum cannot be found in the pool.
     """
 
     full_name = _NormalizeFullyQualifiedName(full_name)
     if full_name not in self._enum_descriptors:
-      self.FindFileContainingSymbol(full_name)
+      self._FindFileContainingSymbolInDb(full_name)
     return self._enum_descriptors[full_name]
 
   def FindFieldByName(self, full_name):
@@ -278,12 +426,32 @@
 
     Returns:
       The field descriptor for the named field.
+
+    Raises:
+      KeyError: if the field cannot be found in the pool.
     """
     full_name = _NormalizeFullyQualifiedName(full_name)
     message_name, _, field_name = full_name.rpartition('.')
     message_descriptor = self.FindMessageTypeByName(message_name)
     return message_descriptor.fields_by_name[field_name]
 
+  def FindOneofByName(self, full_name):
+    """Loads the named oneof descriptor from the pool.
+
+    Args:
+      full_name: The full name of the oneof descriptor to load.
+
+    Returns:
+      The oneof descriptor for the named oneof.
+
+    Raises:
+      KeyError: if the oneof cannot be found in the pool.
+    """
+    full_name = _NormalizeFullyQualifiedName(full_name)
+    message_name, _, oneof_name = full_name.rpartition('.')
+    message_descriptor = self.FindMessageTypeByName(message_name)
+    return message_descriptor.oneofs_by_name[oneof_name]
+
   def FindExtensionByName(self, full_name):
     """Loads the named extension descriptor from the pool.
 
@@ -292,17 +460,101 @@
 
     Returns:
       A FieldDescriptor, describing the named extension.
+
+    Raises:
+      KeyError: if the extension cannot be found in the pool.
     """
     full_name = _NormalizeFullyQualifiedName(full_name)
+    try:
+      # The proto compiler does not give any link between the FileDescriptor
+      # and top-level extensions unless the FileDescriptorProto is added to
+      # the DescriptorDatabase, but this can impact memory usage.
+      # So we register these extensions by name explicitly.
+      return self._toplevel_extensions[full_name]
+    except KeyError:
+      pass
     message_name, _, extension_name = full_name.rpartition('.')
     try:
       # Most extensions are nested inside a message.
       scope = self.FindMessageTypeByName(message_name)
     except KeyError:
       # Some extensions are defined at file scope.
-      scope = self.FindFileContainingSymbol(full_name)
+      scope = self._FindFileContainingSymbolInDb(full_name)
     return scope.extensions_by_name[extension_name]
 
+  def FindExtensionByNumber(self, message_descriptor, number):
+    """Gets the extension of the specified message with the specified number.
+
+    Extensions have to be registered to this pool by calling
+    AddExtensionDescriptor.
+
+    Args:
+      message_descriptor: descriptor of the extended message.
+      number: integer, number of the extension field.
+
+    Returns:
+      A FieldDescriptor describing the extension.
+
+    Raises:
+      KeyError: when no extension with the given number is known for the
+        specified message.
+    """
+    return self._extensions_by_number[message_descriptor][number]
+
+  def FindAllExtensions(self, message_descriptor):
+    """Gets all the known extension of a given message.
+
+    Extensions have to be registered to this pool by calling
+    AddExtensionDescriptor.
+
+    Args:
+      message_descriptor: descriptor of the extended message.
+
+    Returns:
+      A list of FieldDescriptor describing the extensions.
+    """
+    return list(self._extensions_by_number[message_descriptor].values())
+
+  def FindServiceByName(self, full_name):
+    """Loads the named service descriptor from the pool.
+
+    Args:
+      full_name: The full name of the service descriptor to load.
+
+    Returns:
+      The service descriptor for the named service.
+
+    Raises:
+      KeyError: if the service cannot be found in the pool.
+    """
+    full_name = _NormalizeFullyQualifiedName(full_name)
+    if full_name not in self._service_descriptors:
+      self._FindFileContainingSymbolInDb(full_name)
+    return self._service_descriptors[full_name]
+
+  def _FindFileContainingSymbolInDb(self, symbol):
+    """Finds the file in descriptor DB containing the specified symbol.
+
+    Args:
+      symbol: The name of the symbol to search for.
+
+    Returns:
+      A FileDescriptor that contains the specified symbol.
+
+    Raises:
+      KeyError: if the file cannot be found in the descriptor database.
+    """
+    try:
+      file_proto = self._internal_db.FindFileContainingSymbol(symbol)
+    except KeyError as error:
+      if self._descriptor_db:
+        file_proto = self._descriptor_db.FindFileContainingSymbol(symbol)
+      else:
+        raise error
+    if not file_proto:
+      raise KeyError('Cannot find a file containing %s' % symbol)
+    return self._ConvertFileProtoToFileDescriptor(file_proto)
+
   def _ConvertFileProtoToFileDescriptor(self, file_proto):
     """Creates a FileDescriptor from a proto or returns a cached copy.
 
@@ -319,78 +571,69 @@
     if file_proto.name not in self._file_descriptors:
       built_deps = list(self._GetDeps(file_proto.dependency))
       direct_deps = [self.FindFileByName(n) for n in file_proto.dependency]
+      public_deps = [direct_deps[i] for i in file_proto.public_dependency]
 
       file_descriptor = descriptor.FileDescriptor(
           pool=self,
           name=file_proto.name,
           package=file_proto.package,
           syntax=file_proto.syntax,
-          options=file_proto.options,
+          options=_OptionsOrNone(file_proto),
           serialized_pb=file_proto.SerializeToString(),
-          dependencies=direct_deps)
-      if _USE_C_DESCRIPTORS:
-        # When using C++ descriptors, all objects defined in the file were added
-        # to the C++ database when the FileDescriptor was built above.
-        # Just add them to this descriptor pool.
-        def _AddMessageDescriptor(message_desc):
-          self._descriptors[message_desc.full_name] = message_desc
-          for nested in message_desc.nested_types:
-            _AddMessageDescriptor(nested)
-          for enum_type in message_desc.enum_types:
-            _AddEnumDescriptor(enum_type)
-        def _AddEnumDescriptor(enum_desc):
-          self._enum_descriptors[enum_desc.full_name] = enum_desc
-        for message_type in file_descriptor.message_types_by_name.values():
-          _AddMessageDescriptor(message_type)
-        for enum_type in file_descriptor.enum_types_by_name.values():
-          _AddEnumDescriptor(enum_type)
+          dependencies=direct_deps,
+          public_dependencies=public_deps)
+      scope = {}
+
+      # This loop extracts all the message and enum types from all the
+      # dependencies of the file_proto. This is necessary to create the
+      # scope of available message types when defining the passed in
+      # file proto.
+      for dependency in built_deps:
+        scope.update(self._ExtractSymbols(
+            dependency.message_types_by_name.values()))
+        scope.update((_PrefixWithDot(enum.full_name), enum)
+                     for enum in dependency.enum_types_by_name.values())
+
+      for message_type in file_proto.message_type:
+        message_desc = self._ConvertMessageDescriptor(
+            message_type, file_proto.package, file_descriptor, scope,
+            file_proto.syntax)
+        file_descriptor.message_types_by_name[message_desc.name] = (
+            message_desc)
+
+      for enum_type in file_proto.enum_type:
+        file_descriptor.enum_types_by_name[enum_type.name] = (
+            self._ConvertEnumDescriptor(enum_type, file_proto.package,
+                                        file_descriptor, None, scope))
+
+      for index, extension_proto in enumerate(file_proto.extension):
+        extension_desc = self._MakeFieldDescriptor(
+            extension_proto, file_proto.package, index, file_descriptor,
+            is_extension=True)
+        extension_desc.containing_type = self._GetTypeFromScope(
+            file_descriptor.package, extension_proto.extendee, scope)
+        self._SetFieldType(extension_proto, extension_desc,
+                           file_descriptor.package, scope)
+        file_descriptor.extensions_by_name[extension_desc.name] = (
+            extension_desc)
+
+      for desc_proto in file_proto.message_type:
+        self._SetAllFieldTypes(file_proto.package, desc_proto, scope)
+
+      if file_proto.package:
+        desc_proto_prefix = _PrefixWithDot(file_proto.package)
       else:
-        scope = {}
+        desc_proto_prefix = ''
 
-        # This loop extracts all the message and enum types from all the
-        # dependencies of the file_proto. This is necessary to create the
-        # scope of available message types when defining the passed in
-        # file proto.
-        for dependency in built_deps:
-          scope.update(self._ExtractSymbols(
-              dependency.message_types_by_name.values()))
-          scope.update((_PrefixWithDot(enum.full_name), enum)
-                       for enum in dependency.enum_types_by_name.values())
+      for desc_proto in file_proto.message_type:
+        desc = self._GetTypeFromScope(
+            desc_proto_prefix, desc_proto.name, scope)
+        file_descriptor.message_types_by_name[desc_proto.name] = desc
 
-        for message_type in file_proto.message_type:
-          message_desc = self._ConvertMessageDescriptor(
-              message_type, file_proto.package, file_descriptor, scope,
-              file_proto.syntax)
-          file_descriptor.message_types_by_name[message_desc.name] = (
-              message_desc)
-
-        for enum_type in file_proto.enum_type:
-          file_descriptor.enum_types_by_name[enum_type.name] = (
-              self._ConvertEnumDescriptor(enum_type, file_proto.package,
-                                          file_descriptor, None, scope))
-
-        for index, extension_proto in enumerate(file_proto.extension):
-          extension_desc = self._MakeFieldDescriptor(
-              extension_proto, file_proto.package, index, is_extension=True)
-          extension_desc.containing_type = self._GetTypeFromScope(
-              file_descriptor.package, extension_proto.extendee, scope)
-          self._SetFieldType(extension_proto, extension_desc,
-                            file_descriptor.package, scope)
-          file_descriptor.extensions_by_name[extension_desc.name] = (
-              extension_desc)
-
-        for desc_proto in file_proto.message_type:
-          self._SetAllFieldTypes(file_proto.package, desc_proto, scope)
-
-        if file_proto.package:
-          desc_proto_prefix = _PrefixWithDot(file_proto.package)
-        else:
-          desc_proto_prefix = ''
-
-        for desc_proto in file_proto.message_type:
-          desc = self._GetTypeFromScope(
-              desc_proto_prefix, desc_proto.name, scope)
-          file_descriptor.message_types_by_name[desc_proto.name] = desc
+      for index, service_proto in enumerate(file_proto.service):
+        file_descriptor.services_by_name[service_proto.name] = (
+            self._MakeServiceDescriptor(service_proto, index, scope,
+                                        file_proto.package, file_descriptor))
 
       self.Add(file_proto)
       self._file_descriptors[file_proto.name] = file_descriptor
@@ -406,6 +649,7 @@
       package: The package the proto should be located in.
       file_desc: The file containing this message.
       scope: Dict mapping short and full symbols to message and enum types.
+      syntax: string indicating syntax of the file ("proto2" or "proto3")
 
     Returns:
       The added descriptor.
@@ -431,15 +675,15 @@
     enums = [
         self._ConvertEnumDescriptor(enum, desc_name, file_desc, None, scope)
         for enum in desc_proto.enum_type]
-    fields = [self._MakeFieldDescriptor(field, desc_name, index)
+    fields = [self._MakeFieldDescriptor(field, desc_name, index, file_desc)
               for index, field in enumerate(desc_proto.field)]
     extensions = [
-        self._MakeFieldDescriptor(extension, desc_name, index,
+        self._MakeFieldDescriptor(extension, desc_name, index, file_desc,
                                   is_extension=True)
         for index, extension in enumerate(desc_proto.extension)]
     oneofs = [
         descriptor.OneofDescriptor(desc.name, '.'.join((desc_name, desc.name)),
-                                   index, None, [])
+                                   index, None, [], desc.options)
         for index, desc in enumerate(desc_proto.oneof_decl)]
     extension_ranges = [(r.start, r.end) for r in desc_proto.extension_range]
     if extension_ranges:
@@ -456,7 +700,7 @@
         nested_types=nested,
         enum_types=enums,
         extensions=extensions,
-        options=desc_proto.options,
+        options=_OptionsOrNone(desc_proto),
         is_extendable=is_extendable,
         extension_ranges=extension_ranges,
         file=file_desc,
@@ -474,6 +718,7 @@
         fields[field_index].containing_oneof = oneofs[oneof_index]
 
     scope[_PrefixWithDot(desc_name)] = desc
+    self._CheckConflictRegister(desc)
     self._descriptors[desc_name] = desc
     return desc
 
@@ -510,13 +755,14 @@
                                      file=file_desc,
                                      values=values,
                                      containing_type=containing_type,
-                                     options=enum_proto.options)
+                                     options=_OptionsOrNone(enum_proto))
     scope['.%s' % enum_name] = desc
+    self._CheckConflictRegister(desc)
     self._enum_descriptors[enum_name] = desc
     return desc
 
   def _MakeFieldDescriptor(self, field_proto, message_name, index,
-                           is_extension=False):
+                           file_desc, is_extension=False):
     """Creates a field descriptor from a FieldDescriptorProto.
 
     For message and enum type fields, this method will do a look up
@@ -529,6 +775,7 @@
       field_proto: The proto describing the field.
       message_name: The name of the containing message.
       index: Index of the field
+      file_desc: The file containing the field descriptor.
       is_extension: Indication that this field is for an extension.
 
     Returns:
@@ -555,7 +802,8 @@
         default_value=None,
         is_extension=is_extension,
         extension_scope=None,
-        options=field_proto.options)
+        options=_OptionsOrNone(field_proto),
+        file=file_desc)
 
   def _SetAllFieldTypes(self, package, desc_proto, scope):
     """Sets all the descriptor's fields's types.
@@ -674,9 +922,69 @@
         name=value_proto.name,
         index=index,
         number=value_proto.number,
-        options=value_proto.options,
+        options=_OptionsOrNone(value_proto),
         type=None)
 
+  def _MakeServiceDescriptor(self, service_proto, service_index, scope,
+                             package, file_desc):
+    """Make a protobuf ServiceDescriptor given a ServiceDescriptorProto.
+
+    Args:
+      service_proto: The descriptor_pb2.ServiceDescriptorProto protobuf message.
+      service_index: The index of the service in the File.
+      scope: Dict mapping short and full symbols to message and enum types.
+      package: Optional package name for the new service descriptor.
+      file_desc: The file containing the service descriptor.
+
+    Returns:
+      The added descriptor.
+    """
+
+    if package:
+      service_name = '.'.join((package, service_proto.name))
+    else:
+      service_name = service_proto.name
+
+    methods = [self._MakeMethodDescriptor(method_proto, service_name, package,
+                                          scope, index)
+               for index, method_proto in enumerate(service_proto.method)]
+    desc = descriptor.ServiceDescriptor(name=service_proto.name,
+                                        full_name=service_name,
+                                        index=service_index,
+                                        methods=methods,
+                                        options=_OptionsOrNone(service_proto),
+                                        file=file_desc)
+    self._CheckConflictRegister(desc)
+    self._service_descriptors[service_name] = desc
+    return desc
+
+  def _MakeMethodDescriptor(self, method_proto, service_name, package, scope,
+                            index):
+    """Creates a method descriptor from a MethodDescriptorProto.
+
+    Args:
+      method_proto: The proto describing the method.
+      service_name: The name of the containing service.
+      package: Optional package name to look up for types.
+      scope: Scope containing available types.
+      index: Index of the method in the service.
+
+    Returns:
+      An initialized MethodDescriptor object.
+    """
+    full_name = '.'.join((service_name, method_proto.name))
+    input_type = self._GetTypeFromScope(
+        package, method_proto.input_type, scope)
+    output_type = self._GetTypeFromScope(
+        package, method_proto.output_type, scope)
+    return descriptor.MethodDescriptor(name=method_proto.name,
+                                       full_name=full_name,
+                                       index=index,
+                                       containing_service=None,
+                                       input_type=input_type,
+                                       output_type=output_type,
+                                       options=_OptionsOrNone(method_proto))
+
   def _ExtractSymbols(self, descriptors):
     """Pulls out all the symbols from descriptor protos.
 
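The two-level extension maps that `AddExtensionDescriptor` now maintains (first keyed by the extended message, then by number or full name, with a hard failure on duplicate field numbers) can be sketched with plain dictionaries. The class below is illustrative, not the real `DescriptorPool` API:

```python
import collections


class ExtensionIndex:
    """Sketch of the two-level extension indexing added to DescriptorPool."""

    def __init__(self):
        # Outer key: extended message; inner key: field number or full name.
        self._by_number = collections.defaultdict(dict)
        self._by_name = collections.defaultdict(dict)

    def add(self, containing_type, number, full_name, ext):
        existing = self._by_number[containing_type].get(number)
        if existing is not None and existing is not ext:
            # Two distinct extensions claiming the same number is an error;
            # re-registering the same descriptor object is harmless.
            raise AssertionError(
                'Two extensions of "%s" use field number %d.'
                % (containing_type, number))
        self._by_number[containing_type][number] = ext
        self._by_name[containing_type][full_name] = ext

    def find_by_number(self, containing_type, number):
        # Raises KeyError when no such extension is registered.
        return self._by_number[containing_type][number]

    def find_all(self, containing_type):
        return list(self._by_number[containing_type].values())
```

This mirrors the behavior of the new `FindExtensionByNumber` and `FindAllExtensions` methods: lookups are scoped to the extended message, and only explicitly registered extensions are visible.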
diff --git a/python/google/protobuf/internal/_parameterized.py b/python/google/protobuf/internal/_parameterized.py
index dea3f19..f2c0b30 100755
--- a/python/google/protobuf/internal/_parameterized.py
+++ b/python/google/protobuf/internal/_parameterized.py
@@ -37,8 +37,8 @@
 
 A simple example:
 
-  class AdditionExample(parameterized.ParameterizedTestCase):
-    @parameterized.Parameters(
+  class AdditionExample(parameterized.TestCase):
+    @parameterized.parameters(
        (1, 2, 3),
        (4, 5, 9),
        (1, 1, 3))
@@ -54,8 +54,8 @@
Parameters for individual test cases can be tuples (with positional parameters)
 or dictionaries (with named parameters):
 
-  class AdditionExample(parameterized.ParameterizedTestCase):
-    @parameterized.Parameters(
+  class AdditionExample(parameterized.TestCase):
+    @parameterized.parameters(
        {'op1': 1, 'op2': 2, 'result': 3},
        {'op1': 4, 'op2': 5, 'result': 9},
     )
@@ -77,13 +77,13 @@
   '<__main__.Foo object at 0x23d8610>'
 
 are turned into '<__main__.Foo>'. For even more descriptive names,
-especially in test logs, you can use the NamedParameters decorator. In
+especially in test logs, you can use the named_parameters decorator. In
 this case, only tuples are supported, and the first parameter has to
 be a string (or an object that returns an apt name when converted via
 str()):
 
-  class NamedExample(parameterized.ParameterizedTestCase):
-    @parameterized.NamedParameters(
+  class NamedExample(parameterized.TestCase):
+    @parameterized.named_parameters(
        ('Normal', 'aa', 'aaa', True),
        ('EmptyPrefix', '', 'abc', True),
        ('BothEmpty', '', '', True))
@@ -103,13 +103,13 @@
 Parameterized Classes
 =====================
 If invocation arguments are shared across test methods in a single
-ParameterizedTestCase class, instead of decorating all test methods
+TestCase class, instead of decorating all test methods
 individually, the class itself can be decorated:
 
-  @parameterized.Parameters(
+  @parameterized.parameters(
     (1, 2, 3)
     (4, 5, 9))
-  class ArithmeticTest(parameterized.ParameterizedTestCase):
+  class ArithmeticTest(parameterized.TestCase):
     def testAdd(self, arg1, arg2, result):
       self.assertEqual(arg1 + arg2, result)
 
@@ -122,8 +122,8 @@
 created from other sources, a single non-tuple iterable can be passed into
 the decorator. This iterable will be used to obtain the test cases:
 
-  class AdditionExample(parameterized.ParameterizedTestCase):
-    @parameterized.Parameters(
+  class AdditionExample(parameterized.TestCase):
+    @parameterized.parameters(
       c.op1, c.op2, c.result for c in testcases
     )
     def testAddition(self, op1, op2, result):
@@ -135,8 +135,8 @@
 If a test method takes only one argument, the single argument does not need to
 be wrapped into a tuple:
 
-  class NegativeNumberExample(parameterized.ParameterizedTestCase):
-    @parameterized.Parameters(
+  class NegativeNumberExample(parameterized.TestCase):
+    @parameterized.parameters(
        -1, -3, -4, -5
     )
     def testIsNegative(self, arg):
@@ -212,7 +212,7 @@
   def __call__(self, *args, **kwargs):
     raise RuntimeError('You appear to be running a parameterized test case '
                        'without having inherited from parameterized.'
-                       'ParameterizedTestCase. This is bad because none of '
+                       'TestCase. This is bad because none of '
                        'your test cases are actually being run.')
 
   def __iter__(self):
@@ -306,7 +306,7 @@
   return _Apply
 
 
-def Parameters(*testcases):
+def parameters(*testcases):  # pylint: disable=invalid-name
   """A decorator for creating parameterized tests.
 
   See the module docstring for a usage example.
@@ -321,7 +321,7 @@
   return _ParameterDecorator(_ARGUMENT_REPR, testcases)
 
 
-def NamedParameters(*testcases):
+def named_parameters(*testcases):  # pylint: disable=invalid-name
   """A decorator for creating parameterized tests.
 
   See the module docstring for a usage example. The first element of
@@ -347,8 +347,8 @@
   iterable conforms to the test pattern, the injected methods will be picked
   up as tests by the unittest framework.
 
-  In general, it is supposed to be used in conjuction with the
-  Parameters decorator.
+  In general, it is supposed to be used in conjunction with the
+  parameters decorator.
   """
 
   def __new__(mcs, class_name, bases, dct):
@@ -385,8 +385,8 @@
     id_suffix[new_name] = getattr(func, '__x_extra_id__', '')
 
 
-class ParameterizedTestCase(unittest.TestCase):
-  """Base class for test cases using the Parameters decorator."""
+class TestCase(unittest.TestCase):
+  """Base class for test cases using the parameters decorator."""
   __metaclass__ = TestGeneratorMetaclass
 
   def _OriginalName(self):
@@ -409,10 +409,10 @@
                         self._id_suffix.get(self._testMethodName, ''))
 
 
-def CoopParameterizedTestCase(other_base_class):
+def CoopTestCase(other_base_class):
   """Returns a new base class with a cooperative metaclass base.
 
-  This enables the ParameterizedTestCase to be used in combination
+  This enables the TestCase to be used in combination
   with other base classes that have custom metaclasses, such as
   mox.MoxTestBase.
 
@@ -425,7 +425,7 @@
 
     from google3.testing.pybase import parameterized
 
-    class ExampleTest(parameterized.CoopParameterizedTestCase(mox.MoxTestBase)):
+    class ExampleTest(parameterized.CoopTestCase(mox.MoxTestBase)):
       ...
 
   Args:
@@ -439,5 +439,5 @@
       (other_base_class.__metaclass__,
        TestGeneratorMetaclass), {})
   return metaclass(
-      'CoopParameterizedTestCase',
-      (other_base_class, ParameterizedTestCase), {})
+      'CoopTestCase',
+      (other_base_class, TestCase), {})
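The hunks above rename the camel-case decorators to snake_case (`parameters`, `named_parameters`) and shorten `ParameterizedTestCase` to `TestCase`. The mechanism such a decorator relies on can be illustrated with a simplified standalone sketch (this is not the real `google.protobuf.internal._parameterized` implementation; `expand`, `make_test`, and the `_testcases` attribute are invented here for illustration):

```python
import unittest

def parameters(*testcases):
    # Record the parameter tuples on the function; the class decorator
    # below expands them into individual test methods.
    def wrapper(func):
        func._testcases = testcases
        return func
    return wrapper

def expand(cls):
    # Replace each decorated method with one concrete test per tuple.
    for name, func in list(vars(cls).items()):
        cases = getattr(func, '_testcases', None)
        if cases is None:
            continue
        delattr(cls, name)
        for i, case in enumerate(cases):
            if not isinstance(case, tuple):
                case = (case,)  # single-argument shorthand
            # Bind func/case via default arguments to avoid late binding.
            def make_test(f=func, args=case):
                return lambda self: f(self, *args)
            setattr(cls, '%s%d' % (name, i), make_test())
    return cls

@expand
class AdditionExample(unittest.TestCase):
    @parameters((1, 2, 3), (4, 5, 9))
    def testAddition(self, op1, op2, result):
        self.assertEqual(op1 + op2, result)
```

The real library does the expansion in a metaclass (`TestGeneratorMetaclass`) rather than a class decorator, which is why the docstrings above require inheriting from `parameterized.TestCase`.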
diff --git a/python/google/protobuf/internal/any_test.proto b/python/google/protobuf/internal/any_test.proto
index cd641ca..1a563fd 100644
--- a/python/google/protobuf/internal/any_test.proto
+++ b/python/google/protobuf/internal/any_test.proto
@@ -30,13 +30,22 @@
 
 // Author: jieluo@google.com (Jie Luo)
 
-syntax = "proto3";
+syntax = "proto2";
 
 package google.protobuf.internal;
 
 import "google/protobuf/any.proto";
 
 message TestAny {
-  google.protobuf.Any value = 1;
-  int32 int_value = 2;
+  optional google.protobuf.Any value = 1;
+  optional int32 int_value = 2;
+  map<string,int32> map_value = 3;
+  extensions 10 to max;
+}
+
+message TestAnyExtension1 {
+  extend TestAny {
+    optional TestAnyExtension1 extension1 = 98418603;
+  }
+  optional int32 i = 15;
 }
diff --git a/python/google/protobuf/internal/api_implementation.py b/python/google/protobuf/internal/api_implementation.py
index ffcf751..ab9e781 100755
--- a/python/google/protobuf/internal/api_implementation.py
+++ b/python/google/protobuf/internal/api_implementation.py
@@ -32,6 +32,7 @@
 """
 
 import os
+import warnings
 import sys
 
 try:
@@ -60,10 +61,18 @@
     del _use_fast_cpp_protos
     _api_version = 2
   except ImportError:
-    if _proto_extension_modules_exist_in_build:
-      if sys.version_info[0] >= 3:  # Python 3 defaults to C++ impl v2.
-        _api_version = 2
-      # TODO(b/17427486): Make Python 2 default to C++ impl v2.
+    try:
+      # pylint: disable=g-import-not-at-top
+      from google.protobuf.internal import use_pure_python
+      del use_pure_python  # Avoids a pylint error and namespace pollution.
+    except ImportError:
+      # TODO(b/74017912): It's unsafe to enable :use_fast_cpp_protos by default;
+      # it can cause data loss if you have any Python-only extensions to any
+      # message passed back and forth with C++ code.
+      #
+      # TODO(b/17427486): Once that bug is fixed, we want to make both Python 2
+      # and Python 3 default to `_api_version = 2` (C++ implementation V2).
+      pass
 
 _default_implementation_type = (
     'python' if _api_version <= 0 else 'cpp')
@@ -78,6 +87,11 @@
 if _implementation_type != 'python':
   _implementation_type = 'cpp'
 
+if 'PyPy' in sys.version and _implementation_type == 'cpp':
+  warnings.warn('PyPy does not work yet with cpp protocol buffers. '
+                'Falling back to the python implementation.')
+  _implementation_type = 'python'
+
 # This environment variable can be used to switch between the two
 # 'cpp' implementations, overriding the compile-time constants in the
 # _api_implementation module. Right now only '2' is supported. Any other
@@ -94,6 +108,27 @@
 _implementation_version = int(_implementation_version_str)
 
 
+# Detect if serialization should be deterministic by default
+try:
+  # The presence of this module in a build allows the proto implementation to
+  # be upgraded merely via build deps.
+  #
+  # NOTE: Merely importing this automatically enables deterministic proto
+  # serialization for C++ code, but we still need to export it as a boolean so
+  # that we can do the same for `_implementation_type == 'python'`.
+  #
+  # NOTE2: It is possible for C++ code to enable deterministic serialization by
+  # default _without_ affecting Python code, if the C++ implementation is not in
+  # use by this module.  That is intended behavior, so we don't actually expose
+  # this boolean outside of this module.
+  #
+  # pylint: disable=g-import-not-at-top,unused-import
+  from google.protobuf import enable_deterministic_proto_serialization
+  _python_deterministic_proto_serialization = True
+except ImportError:
+  _python_deterministic_proto_serialization = False
+
+
 # Usage of this function is discouraged. Clients shouldn't care which
 # implementation of the API is in use. Note that there is no guarantee
 # that differences between APIs will be maintained.
@@ -105,3 +140,34 @@
 # See comment on 'Type' above.
 def Version():
   return _implementation_version
+
+
+# For internal use only
+def IsPythonDefaultSerializationDeterministic():
+  return _python_deterministic_proto_serialization
+
+# DO NOT USE: For migration and testing only. Will be removed when Proto3
+# defaults to preserve unknowns.
+if _implementation_type == 'cpp':
+  try:
+    # pylint: disable=g-import-not-at-top
+    from google.protobuf.pyext import _message
+
+    def GetPythonProto3PreserveUnknownsDefault():
+      return _message.GetPythonProto3PreserveUnknownsDefault()
+
+    def SetPythonProto3PreserveUnknownsDefault(preserve):
+      _message.SetPythonProto3PreserveUnknownsDefault(preserve)
+  except ImportError:
+    # Unrecognized cpp implementation. Skipping the unknown fields APIs.
+    pass
+else:
+  _python_proto3_preserve_unknowns_default = True
+
+  def GetPythonProto3PreserveUnknownsDefault():
+    return _python_proto3_preserve_unknowns_default
+
+  def SetPythonProto3PreserveUnknownsDefault(preserve):
+    global _python_proto3_preserve_unknowns_default
+    _python_proto3_preserve_unknowns_default = preserve
+
diff --git a/python/google/protobuf/internal/containers.py b/python/google/protobuf/internal/containers.py
index 97cdd84..c6a3692 100755
--- a/python/google/protobuf/internal/containers.py
+++ b/python/google/protobuf/internal/containers.py
@@ -275,7 +275,7 @@
     new_values = [self._type_checker.CheckValue(elem) for elem in elem_seq_iter]
     if new_values:
       self._values.extend(new_values)
-      self._message_listener.Modified()
+    self._message_listener.Modified()
 
   def MergeFrom(self, other):
     """Appends the contents of another repeated field of the same type to this
@@ -436,9 +436,11 @@
   """Simple, type-checked, dict-like container for holding repeated scalars."""
 
   # Disallows assignment to other attributes.
-  __slots__ = ['_key_checker', '_value_checker', '_values', '_message_listener']
+  __slots__ = ['_key_checker', '_value_checker', '_values', '_message_listener',
+               '_entry_descriptor']
 
-  def __init__(self, message_listener, key_checker, value_checker):
+  def __init__(self, message_listener, key_checker, value_checker,
+               entry_descriptor):
     """
     Args:
       message_listener: A MessageListener implementation.
@@ -448,10 +450,12 @@
         inserted into this container.
       value_checker: A type_checkers.ValueChecker instance to run on values
         inserted into this container.
+      entry_descriptor: The MessageDescriptor of a map entry: key and value.
     """
     self._message_listener = message_listener
     self._key_checker = key_checker
     self._value_checker = value_checker
+    self._entry_descriptor = entry_descriptor
     self._values = {}
 
   def __getitem__(self, key):
@@ -513,6 +517,9 @@
     self._values.clear()
     self._message_listener.Modified()
 
+  def GetEntryClass(self):
+    return self._entry_descriptor._concrete_class
+
 
 class MessageMap(MutableMapping):
 
@@ -520,9 +527,10 @@
 
   # Disallows assignment to other attributes.
   __slots__ = ['_key_checker', '_values', '_message_listener',
-               '_message_descriptor']
+               '_message_descriptor', '_entry_descriptor']
 
-  def __init__(self, message_listener, message_descriptor, key_checker):
+  def __init__(self, message_listener, message_descriptor, key_checker,
+               entry_descriptor):
     """
     Args:
       message_listener: A MessageListener implementation.
@@ -532,17 +540,19 @@
         inserted into this container.
       value_checker: A type_checkers.ValueChecker instance to run on values
         inserted into this container.
+      entry_descriptor: The MessageDescriptor of a map entry: key and value.
     """
     self._message_listener = message_listener
     self._message_descriptor = message_descriptor
     self._key_checker = key_checker
+    self._entry_descriptor = entry_descriptor
     self._values = {}
 
   def __getitem__(self, key):
+    key = self._key_checker.CheckValue(key)
     try:
       return self._values[key]
     except KeyError:
-      key = self._key_checker.CheckValue(key)
       new_element = self._message_descriptor._concrete_class()
       new_element._SetListener(self._message_listener)
       self._values[key] = new_element
@@ -574,12 +584,14 @@
       return default
 
   def __contains__(self, item):
+    item = self._key_checker.CheckValue(item)
     return item in self._values
 
   def __setitem__(self, key, value):
     raise ValueError('May not set values directly, call my_map[key].foo = 5')
 
   def __delitem__(self, key):
+    key = self._key_checker.CheckValue(key)
     del self._values[key]
     self._message_listener.Modified()
 
@@ -594,7 +606,11 @@
 
   def MergeFrom(self, other):
     for key in other:
-      self[key].MergeFrom(other[key])
+      # According to documentation: "When parsing from the wire or when merging,
+      # if there are duplicate map keys the last key seen is used".
+      if key in self:
+        del self[key]
+      self[key].CopyFrom(other[key])
     # self._message_listener.Modified() not required here, because
     # mutations to submessages already propagate.
 
@@ -609,3 +625,6 @@
   def clear(self):
     self._values.clear()
     self._message_listener.Modified()
+
+  def GetEntryClass(self):
+    return self._entry_descriptor._concrete_class
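The `MergeFrom` change above implements last-key-wins semantics for map fields: an existing entry is deleted and the incoming value copied wholesale (`CopyFrom`), instead of being merged field by field. A minimal sketch of that semantic, using plain dicts as stand-ins for map entries and submessages (`merge_map` is a hypothetical helper, not part of the protobuf API):

```python
def merge_map(dest, src):
    # For each incoming key, replace the whole entry rather than merging
    # submessage fields: "if there are duplicate map keys the last key
    # seen is used", per the documented map semantics.
    for key, value in src.items():
        dest[key] = dict(value)  # copy, do not alias the source entry
    return dest
```

Merging `{'a': {'x': 1, 'y': 2}}` with `{'a': {'x': 9}}` yields `{'a': {'x': 9}}`: the old `'y'` field is dropped, which a field-by-field submessage merge would have kept.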
diff --git a/python/google/protobuf/internal/decoder.py b/python/google/protobuf/internal/decoder.py
index 31869e4..52b6491 100755
--- a/python/google/protobuf/internal/decoder.py
+++ b/python/google/protobuf/internal/decoder.py
@@ -131,9 +131,12 @@
   return DecodeVarint
 
 
-def _SignedVarintDecoder(mask, result_type):
+def _SignedVarintDecoder(bits, result_type):
   """Like _VarintDecoder() but decodes signed values."""
 
+  signbit = 1 << (bits - 1)
+  mask = (1 << bits) - 1
+
   def DecodeVarint(buffer, pos):
     result = 0
     shift = 0
@@ -142,11 +145,8 @@
       result |= ((b & 0x7f) << shift)
       pos += 1
       if not (b & 0x80):
-        if result > 0x7fffffffffffffff:
-          result -= (1 << 64)
-          result |= ~mask
-        else:
-          result &= mask
+        result &= mask
+        result = (result ^ signbit) - signbit
         result = result_type(result)
         return (result, pos)
       shift += 7
@@ -159,11 +159,11 @@
 # (e.g. the C++ implementation) simpler.
 
 _DecodeVarint = _VarintDecoder((1 << 64) - 1, long)
-_DecodeSignedVarint = _SignedVarintDecoder((1 << 64) - 1, long)
+_DecodeSignedVarint = _SignedVarintDecoder(64, long)
 
 # Use these versions for values which must be limited to 32 bits.
 _DecodeVarint32 = _VarintDecoder((1 << 32) - 1, int)
-_DecodeSignedVarint32 = _SignedVarintDecoder((1 << 32) - 1, int)
+_DecodeSignedVarint32 = _SignedVarintDecoder(32, int)
 
 
 def ReadTag(buffer, pos):
@@ -181,7 +181,7 @@
   while six.indexbytes(buffer, pos) & 0x80:
     pos += 1
   pos += 1
-  return (buffer[start:pos], pos)
+  return (six.binary_type(buffer[start:pos]), pos)
 
 
 # --------------------------------------------------------------------
@@ -642,10 +642,10 @@
 
 MESSAGE_SET_ITEM_TAG = encoder.TagBytes(1, wire_format.WIRETYPE_START_GROUP)
 
-def MessageSetItemDecoder(extensions_by_number):
+def MessageSetItemDecoder(descriptor):
   """Returns a decoder for a MessageSet item.
 
-  The parameter is the _extensions_by_number map for the message class.
+  The parameter is the message Descriptor.
 
   The message set message looks like this:
     message MessageSet {
@@ -694,7 +694,7 @@
     if message_start == -1:
       raise _DecodeError('MessageSet item missing message.')
 
-    extension = extensions_by_number.get(type_id)
+    extension = message.Extensions._FindExtensionByNumber(type_id)
     if extension is not None:
       value = field_dict.get(extension)
       if value is None:
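The rewritten `_SignedVarintDecoder` replaces the explicit range check with branch-free sign extension, taking the bit width instead of a precomputed mask. The arithmetic can be shown in isolation (`decode_signed` is a hypothetical standalone helper mirroring the patched expression):

```python
def decode_signed(raw, bits):
    # Keep the low `bits` bits, then sign-extend: XOR with the sign bit,
    # then subtract it. Values without the sign bit set are unchanged;
    # values with it set land in the negative two's-complement range.
    signbit = 1 << (bits - 1)
    mask = (1 << bits) - 1
    raw &= mask
    return (raw ^ signbit) - signbit
```

For example, `decode_signed((1 << 64) - 1, 64)` returns `-1`, the two's-complement reading of an all-ones 64-bit varint.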
diff --git a/python/google/protobuf/internal/descriptor_database_test.py b/python/google/protobuf/internal/descriptor_database_test.py
index 1baff7d..f97477b 100644
--- a/python/google/protobuf/internal/descriptor_database_test.py
+++ b/python/google/protobuf/internal/descriptor_database_test.py
@@ -35,9 +35,12 @@
 __author__ = 'matthewtoia@google.com (Matt Toia)'
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+import warnings
+
+from google.protobuf import unittest_pb2
 from google.protobuf import descriptor_pb2
 from google.protobuf.internal import factory_test2_pb2
 from google.protobuf import descriptor_database
@@ -53,16 +56,69 @@
 
     self.assertEqual(file_desc_proto, db.FindFileByName(
         'google/protobuf/internal/factory_test2.proto'))
+    # Can find message type.
     self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
         'google.protobuf.python.internal.Factory2Message'))
+    # Can find nested message type.
     self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
         'google.protobuf.python.internal.Factory2Message.NestedFactory2Message'))
+    # Can find enum type.
     self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
         'google.protobuf.python.internal.Factory2Enum'))
+    # Can find nested enum type.
     self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
         'google.protobuf.python.internal.Factory2Message.NestedFactory2Enum'))
     self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
         'google.protobuf.python.internal.MessageWithNestedEnumOnly.NestedEnum'))
+    # Can find field.
+    self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
+        'google.protobuf.python.internal.Factory2Message.list_field'))
+    # Can find enum value.
+    self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
+        'google.protobuf.python.internal.Factory2Enum.FACTORY_2_VALUE_0'))
+    # Can find top level extension.
+    self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
+        'google.protobuf.python.internal.another_field'))
+    # Can find nested extension inside a message.
+    self.assertEqual(file_desc_proto, db.FindFileContainingSymbol(
+        'google.protobuf.python.internal.Factory2Message.one_more_field'))
+
+    # Can find service.
+    file_desc_proto2 = descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_pb2.DESCRIPTOR.serialized_pb)
+    db.Add(file_desc_proto2)
+    self.assertEqual(file_desc_proto2, db.FindFileContainingSymbol(
+        'protobuf_unittest.TestService'))
+
+    # A non-existent field under a valid top-level symbol can also be
+    # found; this matches the protobuf C++ behavior.
+    self.assertEqual(file_desc_proto2, db.FindFileContainingSymbol(
+        'protobuf_unittest.TestAllTypes.none_field'))
+
+    self.assertRaises(KeyError,
+                      db.FindFileContainingSymbol,
+                      'protobuf_unittest.NoneMessage')
+
+  def testConflictRegister(self):
+    db = descriptor_database.DescriptorDatabase()
+    unittest_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_pb2.DESCRIPTOR.serialized_pb)
+    db.Add(unittest_fd)
+    conflict_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_pb2.DESCRIPTOR.serialized_pb)
+    conflict_fd.name = 'other_file'
+    with warnings.catch_warnings(record=True) as w:
+      # Cause all warnings to always be triggered.
+      warnings.simplefilter('always')
+      db.Add(conflict_fd)
+      self.assertTrue(len(w))
+      self.assertIs(w[0].category, RuntimeWarning)
+      self.assertIn('Conflict register for file "other_file": ',
+                    str(w[0].message))
+      self.assertIn('already defined in file '
+                    '"google/protobuf/unittest.proto"',
+                    str(w[0].message))
+
 
 if __name__ == '__main__':
   unittest.main()
diff --git a/python/google/protobuf/internal/descriptor_pool_test.py b/python/google/protobuf/internal/descriptor_pool_test.py
index f1d6bf9..2cbf781 100644
--- a/python/google/protobuf/internal/descriptor_pool_test.py
+++ b/python/google/protobuf/internal/descriptor_pool_test.py
@@ -34,12 +34,16 @@
 
 __author__ = 'matthewtoia@google.com (Matt Toia)'
 
+import copy
 import os
+import sys
+import warnings
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import unittest_import_pb2
 from google.protobuf import unittest_import_public_pb2
 from google.protobuf import unittest_pb2
@@ -49,7 +53,8 @@
 from google.protobuf.internal import descriptor_pool_test2_pb2
 from google.protobuf.internal import factory_test1_pb2
 from google.protobuf.internal import factory_test2_pb2
-from google.protobuf.internal import test_util
+from google.protobuf.internal import file_options_test_pb2
+from google.protobuf.internal import more_messages_pb2
 from google.protobuf import descriptor
 from google.protobuf import descriptor_database
 from google.protobuf import descriptor_pool
@@ -57,19 +62,8 @@
 from google.protobuf import symbol_database
 
 
-class DescriptorPoolTest(unittest.TestCase):
 
-  def CreatePool(self):
-    return descriptor_pool.DescriptorPool()
-
-  def setUp(self):
-    self.pool = self.CreatePool()
-    self.factory_test1_fd = descriptor_pb2.FileDescriptorProto.FromString(
-        factory_test1_pb2.DESCRIPTOR.serialized_pb)
-    self.factory_test2_fd = descriptor_pb2.FileDescriptorProto.FromString(
-        factory_test2_pb2.DESCRIPTOR.serialized_pb)
-    self.pool.Add(self.factory_test1_fd)
-    self.pool.Add(self.factory_test2_fd)
+class DescriptorPoolTestBase(object):
 
   def testFindFileByName(self):
     name1 = 'google/protobuf/internal/factory_test1.proto'
@@ -107,6 +101,34 @@
     self.assertEqual('google.protobuf.python.internal', file_desc2.package)
     self.assertIn('Factory2Message', file_desc2.message_types_by_name)
 
+    # Tests top level extension.
+    file_desc3 = self.pool.FindFileContainingSymbol(
+        'google.protobuf.python.internal.another_field')
+    self.assertIsInstance(file_desc3, descriptor.FileDescriptor)
+    self.assertEqual('google/protobuf/internal/factory_test2.proto',
+                     file_desc3.name)
+
+    # Tests nested extension inside a message.
+    file_desc4 = self.pool.FindFileContainingSymbol(
+        'google.protobuf.python.internal.Factory2Message.one_more_field')
+    self.assertIsInstance(file_desc4, descriptor.FileDescriptor)
+    self.assertEqual('google/protobuf/internal/factory_test2.proto',
+                     file_desc4.name)
+
+    file_desc5 = self.pool.FindFileContainingSymbol(
+        'protobuf_unittest.TestService')
+    self.assertIsInstance(file_desc5, descriptor.FileDescriptor)
+    self.assertEqual('google/protobuf/unittest.proto',
+                     file_desc5.name)
+
+    # Tests the generated pool.
+    assert descriptor_pool.Default().FindFileContainingSymbol(
+        'google.protobuf.python.internal.Factory2Message.one_more_field')
+    assert descriptor_pool.Default().FindFileContainingSymbol(
+        'google.protobuf.python.internal.another_field')
+    assert descriptor_pool.Default().FindFileContainingSymbol(
+        'protobuf_unittest.TestService')
+
   def testFindFileContainingSymbolFailure(self):
     with self.assertRaises(KeyError):
       self.pool.FindFileContainingSymbol('Does not exist')
@@ -119,6 +141,7 @@
     self.assertEqual('google.protobuf.python.internal.Factory1Message',
                      msg1.full_name)
     self.assertEqual(None, msg1.containing_type)
+    self.assertFalse(msg1.has_options)
 
     nested_msg1 = msg1.nested_types[0]
     self.assertEqual('NestedFactory1Message', nested_msg1.name)
@@ -192,6 +215,27 @@
                        msg2.fields_by_name[name].containing_oneof)
       self.assertIn(msg2.fields_by_name[name], msg2.oneofs[0].fields)
 
+  def testFindTypeErrors(self):
+    self.assertRaises(TypeError, self.pool.FindExtensionByNumber, '')
+
+    # TODO(jieluo): Fix python to raise correct errors.
+    if api_implementation.Type() == 'cpp':
+      self.assertRaises(TypeError, self.pool.FindMethodByName, 0)
+      self.assertRaises(KeyError, self.pool.FindMethodByName, '')
+      error_type = TypeError
+    else:
+      error_type = AttributeError
+    self.assertRaises(error_type, self.pool.FindMessageTypeByName, 0)
+    self.assertRaises(error_type, self.pool.FindFieldByName, 0)
+    self.assertRaises(error_type, self.pool.FindExtensionByName, 0)
+    self.assertRaises(error_type, self.pool.FindEnumTypeByName, 0)
+    self.assertRaises(error_type, self.pool.FindOneofByName, 0)
+    self.assertRaises(error_type, self.pool.FindServiceByName, 0)
+    self.assertRaises(error_type, self.pool.FindFileContainingSymbol, 0)
+    if api_implementation.Type() == 'python':
+      error_type = KeyError
+    self.assertRaises(error_type, self.pool.FindFileByName, 0)
+
   def testFindMessageTypeByNameFailure(self):
     with self.assertRaises(KeyError):
       self.pool.FindMessageTypeByName('Does not exist')
@@ -202,6 +246,7 @@
     self.assertIsInstance(enum1, descriptor.EnumDescriptor)
     self.assertEqual(0, enum1.values_by_name['FACTORY_1_VALUE_0'].number)
     self.assertEqual(1, enum1.values_by_name['FACTORY_1_VALUE_1'].number)
+    self.assertFalse(enum1.has_options)
 
     nested_enum1 = self.pool.FindEnumTypeByName(
         'google.protobuf.python.internal.Factory1Message.NestedFactory1Enum')
@@ -230,14 +275,38 @@
       self.pool.FindEnumTypeByName('Does not exist')
 
   def testFindFieldByName(self):
+    if isinstance(self, SecondaryDescriptorFromDescriptorDB):
+      if api_implementation.Type() == 'cpp':
+        # TODO(jieluo): Fix cpp extension to find field correctly
+        # when descriptor pool is using an underlying database.
+        return
     field = self.pool.FindFieldByName(
         'google.protobuf.python.internal.Factory1Message.list_value')
     self.assertEqual(field.name, 'list_value')
     self.assertEqual(field.label, field.LABEL_REPEATED)
+    self.assertFalse(field.has_options)
+
     with self.assertRaises(KeyError):
       self.pool.FindFieldByName('Does not exist')
 
+  def testFindOneofByName(self):
+    if isinstance(self, SecondaryDescriptorFromDescriptorDB):
+      if api_implementation.Type() == 'cpp':
+        # TODO(jieluo): Fix cpp extension to find oneof correctly
+        # when descriptor pool is using an underlying database.
+        return
+    oneof = self.pool.FindOneofByName(
+        'google.protobuf.python.internal.Factory2Message.oneof_field')
+    self.assertEqual(oneof.name, 'oneof_field')
+    with self.assertRaises(KeyError):
+      self.pool.FindOneofByName('Does not exist')
+
   def testFindExtensionByName(self):
+    if isinstance(self, SecondaryDescriptorFromDescriptorDB):
+      if api_implementation.Type() == 'cpp':
+        # TODO(jieluo): Fix cpp extension to find extension correctly
+        # when descriptor pool is using an underlying database.
+        return
     # An extension defined in a message.
     extension = self.pool.FindExtensionByName(
         'google.protobuf.python.internal.Factory2Message.one_more_field')
@@ -250,6 +319,53 @@
     with self.assertRaises(KeyError):
       self.pool.FindFieldByName('Does not exist')
 
+  def testFindAllExtensions(self):
+    factory1_message = self.pool.FindMessageTypeByName(
+        'google.protobuf.python.internal.Factory1Message')
+    factory2_message = self.pool.FindMessageTypeByName(
+        'google.protobuf.python.internal.Factory2Message')
+    # An extension defined in a message.
+    one_more_field = factory2_message.extensions_by_name['one_more_field']
+    self.pool.AddExtensionDescriptor(one_more_field)
+    # An extension defined at file scope.
+    factory_test2 = self.pool.FindFileByName(
+        'google/protobuf/internal/factory_test2.proto')
+    another_field = factory_test2.extensions_by_name['another_field']
+    self.pool.AddExtensionDescriptor(another_field)
+
+    extensions = self.pool.FindAllExtensions(factory1_message)
+    expected_extension_numbers = set([one_more_field, another_field])
+    self.assertEqual(expected_extension_numbers, set(extensions))
+    # Verify that mutating the returned list does not affect the pool.
+    extensions.append('unexpected_element')
+    # Get the extensions again, the returned value does not contain the
+    # 'unexpected_element'.
+    extensions = self.pool.FindAllExtensions(factory1_message)
+    self.assertEqual(expected_extension_numbers, set(extensions))
+
+  def testFindExtensionByNumber(self):
+    factory1_message = self.pool.FindMessageTypeByName(
+        'google.protobuf.python.internal.Factory1Message')
+    factory2_message = self.pool.FindMessageTypeByName(
+        'google.protobuf.python.internal.Factory2Message')
+    # An extension defined in a message.
+    one_more_field = factory2_message.extensions_by_name['one_more_field']
+    self.pool.AddExtensionDescriptor(one_more_field)
+    # An extension defined at file scope.
+    factory_test2 = self.pool.FindFileByName(
+        'google/protobuf/internal/factory_test2.proto')
+    another_field = factory_test2.extensions_by_name['another_field']
+    self.pool.AddExtensionDescriptor(another_field)
+
+    # An extension defined in a message.
+    extension = self.pool.FindExtensionByNumber(factory1_message, 1001)
+    self.assertEqual(extension.name, 'one_more_field')
+    # An extension defined at file scope.
+    extension = self.pool.FindExtensionByNumber(factory1_message, 1002)
+    self.assertEqual(extension.name, 'another_field')
+    with self.assertRaises(KeyError):
+      extension = self.pool.FindExtensionByNumber(factory1_message, 1234567)
+
   def testExtensionsAreNotFields(self):
     with self.assertRaises(KeyError):
       self.pool.FindFieldByName('google.protobuf.python.internal.another_field')
@@ -260,6 +376,12 @@
       self.pool.FindExtensionByName(
           'google.protobuf.python.internal.Factory1Message.list_value')
 
+  def testFindService(self):
+    service = self.pool.FindServiceByName('protobuf_unittest.TestService')
+    self.assertEqual(service.full_name, 'protobuf_unittest.TestService')
+    with self.assertRaises(KeyError):
+      self.pool.FindServiceByName('Does not exist')
+
   def testUserDefinedDB(self):
     db = descriptor_database.DescriptorDatabase()
     self.pool = descriptor_pool.DescriptorPool(db)
@@ -268,21 +390,17 @@
     self.testFindMessageTypeByName()
 
   def testAddSerializedFile(self):
+    if isinstance(self, SecondaryDescriptorFromDescriptorDB):
+      if api_implementation.Type() == 'cpp':
+        # Cpp extension cannot call Add on a DescriptorPool
+        # that uses a DescriptorDatabase.
+        # TODO(jieluo): Fix python and cpp extension diff.
+        return
     self.pool = descriptor_pool.DescriptorPool()
     self.pool.AddSerializedFile(self.factory_test1_fd.SerializeToString())
     self.pool.AddSerializedFile(self.factory_test2_fd.SerializeToString())
     self.testFindMessageTypeByName()
 
-  def testComplexNesting(self):
-    test1_desc = descriptor_pb2.FileDescriptorProto.FromString(
-        descriptor_pool_test1_pb2.DESCRIPTOR.serialized_pb)
-    test2_desc = descriptor_pb2.FileDescriptorProto.FromString(
-        descriptor_pool_test2_pb2.DESCRIPTOR.serialized_pb)
-    self.pool.Add(test1_desc)
-    self.pool.Add(test2_desc)
-    TEST1_FILE.CheckFile(self, self.pool)
-    TEST2_FILE.CheckFile(self, self.pool)
-
 
   def testEnumDefaultValue(self):
     """Test the default value of enums which don't start at zero."""
@@ -301,6 +419,12 @@
     self.assertIs(file_descriptor, descriptor_pool_test1_pb2.DESCRIPTOR)
     _CheckDefaultValue(file_descriptor)
 
+    if isinstance(self, SecondaryDescriptorFromDescriptorDB):
+      if api_implementation.Type() == 'cpp':
+        # Cpp extension cannot call Add on a DescriptorPool
+        # that uses a DescriptorDatabase.
+        # TODO(jieluo): Fix python and cpp extension diff.
+        return
     # Then check the dynamic pool and its internal DescriptorDatabase.
     descriptor_proto = descriptor_pb2.FileDescriptorProto.FromString(
         descriptor_pool_test1_pb2.DESCRIPTOR.serialized_pb)
@@ -348,26 +472,166 @@
             unittest_pb2.TestAllTypes.DESCRIPTOR.full_name))
     _CheckDefaultValues(message_class())
 
+  def testAddFileDescriptor(self):
+    if isinstance(self, SecondaryDescriptorFromDescriptorDB):
+      if api_implementation.Type() == 'cpp':
+        # Cpp extension cannot call Add on a DescriptorPool
+        # that uses a DescriptorDatabase.
+        # TODO(jieluo): Fix python and cpp extension diff.
+        return
+    file_desc = descriptor_pb2.FileDescriptorProto(name='some/file.proto')
+    self.pool.Add(file_desc)
+    self.pool.AddSerializedFile(file_desc.SerializeToString())
 
-@unittest.skipIf(api_implementation.Type() != 'cpp',
-                            'explicit tests of the C++ implementation')
-class CppDescriptorPoolTest(DescriptorPoolTest):
-  # TODO(amauryfa): remove when descriptor_pool.DescriptorPool() creates true
-  # C++ descriptor pool object for C++ implementation.
+  def testComplexNesting(self):
+    if isinstance(self, SecondaryDescriptorFromDescriptorDB):
+      if api_implementation.Type() == 'cpp':
+        # Cpp extension cannot call Add on a DescriptorPool
+        # that uses a DescriptorDatabase.
+        # TODO(jieluo): Fix python and cpp extension diff.
+        return
+    more_messages_desc = descriptor_pb2.FileDescriptorProto.FromString(
+        more_messages_pb2.DESCRIPTOR.serialized_pb)
+    test1_desc = descriptor_pb2.FileDescriptorProto.FromString(
+        descriptor_pool_test1_pb2.DESCRIPTOR.serialized_pb)
+    test2_desc = descriptor_pb2.FileDescriptorProto.FromString(
+        descriptor_pool_test2_pb2.DESCRIPTOR.serialized_pb)
+    self.pool.Add(more_messages_desc)
+    self.pool.Add(test1_desc)
+    self.pool.Add(test2_desc)
+    TEST1_FILE.CheckFile(self, self.pool)
+    TEST2_FILE.CheckFile(self, self.pool)
 
-  def CreatePool(self):
-    # pylint: disable=g-import-not-at-top
-    from google.protobuf.pyext import _message
-    return _message.DescriptorPool()
+  def testConflictRegister(self):
+    if isinstance(self, SecondaryDescriptorFromDescriptorDB):
+      if api_implementation.Type() == 'cpp':
+        # Cpp extension cannot call Add on a DescriptorPool
+        # that uses a DescriptorDatabase.
+        # TODO(jieluo): Fix python and cpp extension diff.
+        return
+    unittest_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_pb2.DESCRIPTOR.serialized_pb)
+    conflict_fd = copy.deepcopy(unittest_fd)
+    conflict_fd.name = 'other_file'
+    if api_implementation.Type() == 'cpp':
+      try:
+        self.pool.Add(unittest_fd)
+        self.pool.Add(conflict_fd)
+      except TypeError:
+        pass
+    else:
+      with warnings.catch_warnings(record=True) as w:
+        # Cause all warnings to always be triggered.
+        warnings.simplefilter('always')
+        pool = copy.deepcopy(self.pool)
+        # Adding the same descriptors again should not emit warnings.
+        file_descriptor = unittest_pb2.DESCRIPTOR
+        pool.AddDescriptor(
+            file_descriptor.message_types_by_name['TestAllTypes'])
+        pool.AddEnumDescriptor(
+            file_descriptor.enum_types_by_name['ForeignEnum'])
+        pool.AddServiceDescriptor(
+            file_descriptor.services_by_name['TestService'])
+        pool.AddExtensionDescriptor(
+            file_descriptor.extensions_by_name['optional_int32_extension'])
+        self.assertEqual(len(w), 0)
+        # Check the warnings emitted for conflicting descriptors with the
+        # same name.
+        pool.Add(unittest_fd)
+        pool.Add(conflict_fd)
+        pool.FindFileByName(unittest_fd.name)
+        pool.FindFileByName(conflict_fd.name)
+        self.assertTrue(len(w))
+        self.assertIs(w[0].category, RuntimeWarning)
+        self.assertIn('Conflict register for file "other_file": ',
+                      str(w[0].message))
+        self.assertIn('already defined in file '
+                      '"google/protobuf/unittest.proto"',
+                      str(w[0].message))
+
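The `warnings.catch_warnings(record=True)` / `simplefilter('always')` idiom used in `testConflictRegister` above is plain standard-library behavior. A minimal standalone sketch of the same capture pattern, where `register_file` and its message are illustrative stand-ins rather than protobuf APIs:

```python
import warnings

def register_file(name):
    # Illustrative stand-in: warn about a duplicate registration.
    warnings.warn('Conflict register for file "%s"' % name, RuntimeWarning)

with warnings.catch_warnings(record=True) as caught:
    # 'always' ensures repeats of the same warning are all recorded,
    # instead of being deduplicated by the default warning filter.
    warnings.simplefilter('always')
    register_file('other_file')
    register_file('other_file')

assert len(caught) == 2
assert caught[0].category is RuntimeWarning
assert 'other_file' in str(caught[0].message)
```

Without `simplefilter('always')`, the default `once`-style filtering would swallow the second, identical warning, which is why the test sets it before triggering the conflicts.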
+
+class DefaultDescriptorPoolTest(DescriptorPoolTestBase, unittest.TestCase):
+
+  def setUp(self):
+    self.pool = descriptor_pool.Default()
+    self.factory_test1_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        factory_test1_pb2.DESCRIPTOR.serialized_pb)
+    self.factory_test2_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        factory_test2_pb2.DESCRIPTOR.serialized_pb)
+
+  def testFindMethods(self):
+    self.assertIs(
+        self.pool.FindFileByName('google/protobuf/unittest.proto'),
+        unittest_pb2.DESCRIPTOR)
+    self.assertIs(
+        self.pool.FindMessageTypeByName('protobuf_unittest.TestAllTypes'),
+        unittest_pb2.TestAllTypes.DESCRIPTOR)
+    self.assertIs(
+        self.pool.FindFieldByName(
+            'protobuf_unittest.TestAllTypes.optional_int32'),
+        unittest_pb2.TestAllTypes.DESCRIPTOR.fields_by_name['optional_int32'])
+    self.assertIs(
+        self.pool.FindEnumTypeByName('protobuf_unittest.ForeignEnum'),
+        unittest_pb2.ForeignEnum.DESCRIPTOR)
+    self.assertIs(
+        self.pool.FindExtensionByName(
+            'protobuf_unittest.optional_int32_extension'),
+        unittest_pb2.DESCRIPTOR.extensions_by_name['optional_int32_extension'])
+    self.assertIs(
+        self.pool.FindOneofByName('protobuf_unittest.TestAllTypes.oneof_field'),
+        unittest_pb2.TestAllTypes.DESCRIPTOR.oneofs_by_name['oneof_field'])
+    self.assertIs(
+        self.pool.FindServiceByName('protobuf_unittest.TestService'),
+        unittest_pb2.DESCRIPTOR.services_by_name['TestService'])
+
+
+class CreateDescriptorPoolTest(DescriptorPoolTestBase, unittest.TestCase):
+
+  def setUp(self):
+    self.pool = descriptor_pool.DescriptorPool()
+    self.factory_test1_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        factory_test1_pb2.DESCRIPTOR.serialized_pb)
+    self.factory_test2_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        factory_test2_pb2.DESCRIPTOR.serialized_pb)
+    self.pool.Add(self.factory_test1_fd)
+    self.pool.Add(self.factory_test2_fd)
+
+    self.pool.Add(descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_import_public_pb2.DESCRIPTOR.serialized_pb))
+    self.pool.Add(descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_import_pb2.DESCRIPTOR.serialized_pb))
+    self.pool.Add(descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_pb2.DESCRIPTOR.serialized_pb))
+
+
+class SecondaryDescriptorFromDescriptorDB(DescriptorPoolTestBase,
+                                          unittest.TestCase):
+
+  def setUp(self):
+    self.factory_test1_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        factory_test1_pb2.DESCRIPTOR.serialized_pb)
+    self.factory_test2_fd = descriptor_pb2.FileDescriptorProto.FromString(
+        factory_test2_pb2.DESCRIPTOR.serialized_pb)
+    db = descriptor_database.DescriptorDatabase()
+    db.Add(self.factory_test1_fd)
+    db.Add(self.factory_test2_fd)
+    db.Add(descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_import_public_pb2.DESCRIPTOR.serialized_pb))
+    db.Add(descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_import_pb2.DESCRIPTOR.serialized_pb))
+    db.Add(descriptor_pb2.FileDescriptorProto.FromString(
+        unittest_pb2.DESCRIPTOR.serialized_pb))
+    self.pool = descriptor_pool.DescriptorPool(descriptor_db=db)
 
 
 class ProtoFile(object):
 
-  def __init__(self, name, package, messages, dependencies=None):
+  def __init__(self, name, package, messages, dependencies=None,
+               public_dependencies=None):
     self.name = name
     self.package = package
     self.messages = messages
     self.dependencies = dependencies or []
+    self.public_dependencies = public_dependencies or []
 
   def CheckFile(self, test, pool):
     file_desc = pool.FindFileByName(self.name)
@@ -375,6 +639,8 @@
     test.assertEqual(self.package, file_desc.package)
     dependencies_names = [f.name for f in file_desc.dependencies]
     test.assertEqual(self.dependencies, dependencies_names)
+    public_dependencies_names = [f.name for f in file_desc.public_dependencies]
+    test.assertEqual(self.public_dependencies, public_dependencies_names)
     for name, msg_type in self.messages.items():
       msg_type.CheckType(test, None, name, file_desc)
 
@@ -426,10 +692,10 @@
       subtype.CheckType(test, desc, name, file_desc)
 
     for index, (name, field) in enumerate(self.field_list):
-      field.CheckField(test, desc, name, index)
+      field.CheckField(test, desc, name, index, file_desc)
 
     for index, (name, field) in enumerate(self.extensions):
-      field.CheckField(test, desc, name, index)
+      field.CheckField(test, desc, name, index, file_desc)
 
 
 class EnumField(object):
@@ -439,7 +705,7 @@
     self.type_name = type_name
     self.default_value = default_value
 
-  def CheckField(self, test, msg_desc, name, index):
+  def CheckField(self, test, msg_desc, name, index, file_desc):
     field_desc = msg_desc.fields_by_name[name]
     enum_desc = msg_desc.enum_types_by_name[self.type_name]
     test.assertEqual(name, field_desc.name)
@@ -453,8 +719,10 @@
     test.assertTrue(field_desc.has_default_value)
     test.assertEqual(enum_desc.values_by_name[self.default_value].number,
                      field_desc.default_value)
+    test.assertFalse(enum_desc.values_by_name[self.default_value].has_options)
     test.assertEqual(msg_desc, field_desc.containing_type)
     test.assertEqual(enum_desc, field_desc.enum_type)
+    test.assertEqual(file_desc, enum_desc.file)
 
 
 class MessageField(object):
@@ -463,7 +731,7 @@
     self.number = number
     self.type_name = type_name
 
-  def CheckField(self, test, msg_desc, name, index):
+  def CheckField(self, test, msg_desc, name, index, file_desc):
     field_desc = msg_desc.fields_by_name[name]
     field_type_desc = msg_desc.nested_types_by_name[self.type_name]
     test.assertEqual(name, field_desc.name)
@@ -477,6 +745,12 @@
     test.assertFalse(field_desc.has_default_value)
     test.assertEqual(msg_desc, field_desc.containing_type)
     test.assertEqual(field_type_desc, field_desc.message_type)
+    test.assertEqual(file_desc, field_desc.file)
+    # TODO(jieluo): Fix python and cpp extension diff for message field
+    # default value.
+    if api_implementation.Type() == 'cpp':
+      test.assertRaises(
+          NotImplementedError, getattr, field_desc, 'default_value')
 
 
 class StringField(object):
@@ -485,7 +759,7 @@
     self.number = number
     self.default_value = default_value
 
-  def CheckField(self, test, msg_desc, name, index):
+  def CheckField(self, test, msg_desc, name, index, file_desc):
     field_desc = msg_desc.fields_by_name[name]
     test.assertEqual(name, field_desc.name)
     expected_field_full_name = '.'.join([msg_desc.full_name, name])
@@ -497,6 +771,7 @@
                      field_desc.cpp_type)
     test.assertTrue(field_desc.has_default_value)
     test.assertEqual(self.default_value, field_desc.default_value)
+    test.assertEqual(file_desc, field_desc.file)
 
 
 class ExtensionField(object):
@@ -505,7 +780,7 @@
     self.number = number
     self.extended_type = extended_type
 
-  def CheckField(self, test, msg_desc, name, index):
+  def CheckField(self, test, msg_desc, name, index, file_desc):
     field_desc = msg_desc.extensions_by_name[name]
     test.assertEqual(name, field_desc.name)
     expected_field_full_name = '.'.join([msg_desc.full_name, name])
@@ -520,6 +795,7 @@
     test.assertEqual(msg_desc, field_desc.extension_scope)
     test.assertEqual(msg_desc, field_desc.message_type)
     test.assertEqual(self.extended_type, field_desc.containing_type.name)
+    test.assertEqual(file_desc, field_desc.file)
 
 
 class AddDescriptorTest(unittest.TestCase):
@@ -555,7 +831,7 @@
             prefix + 'protobuf_unittest.TestAllTypes.NestedMessage').name)
 
   @unittest.skipIf(api_implementation.Type() == 'cpp',
-                    'With the cpp implementation, Add() must be called first')
+                   'With the cpp implementation, Add() must be called first')
   def testMessage(self):
     self._TestMessage('')
     self._TestMessage('.')
@@ -591,13 +867,24 @@
             prefix + 'protobuf_unittest.TestAllTypes.NestedEnum').name)
 
   @unittest.skipIf(api_implementation.Type() == 'cpp',
-                    'With the cpp implementation, Add() must be called first')
+                   'With the cpp implementation, Add() must be called first')
   def testEnum(self):
     self._TestEnum('')
     self._TestEnum('.')
 
   @unittest.skipIf(api_implementation.Type() == 'cpp',
-                    'With the cpp implementation, Add() must be called first')
+                   'With the cpp implementation, Add() must be called first')
+  def testService(self):
+    pool = descriptor_pool.DescriptorPool()
+    with self.assertRaises(KeyError):
+      pool.FindServiceByName('protobuf_unittest.TestService')
+    pool.AddServiceDescriptor(unittest_pb2._TESTSERVICE)
+    self.assertEqual(
+        'protobuf_unittest.TestService',
+        pool.FindServiceByName('protobuf_unittest.TestService').full_name)
+
+  @unittest.skipIf(api_implementation.Type() == 'cpp',
+                   'With the cpp implementation, Add() must be called first')
   def testFile(self):
     pool = descriptor_pool.DescriptorPool()
     pool.AddFileDescriptor(unittest_pb2.DESCRIPTOR)
@@ -612,18 +899,9 @@
       pool.FindFileContainingSymbol(
           'protobuf_unittest.TestAllTypes')
 
-  def _GetDescriptorPoolClass(self):
-    # Test with both implementations of descriptor pools.
-    if api_implementation.Type() == 'cpp':
-      # pylint: disable=g-import-not-at-top
-      from google.protobuf.pyext import _message
-      return _message.DescriptorPool
-    else:
-      return descriptor_pool.DescriptorPool
-
   def testEmptyDescriptorPool(self):
-    # Check that an empty DescriptorPool() contains no message.
-    pool = self._GetDescriptorPoolClass()()
+    # Check that an empty DescriptorPool() contains no messages.
+    pool = descriptor_pool.DescriptorPool()
     proto_file_name = descriptor_pb2.DESCRIPTOR.name
     self.assertRaises(KeyError, pool.FindFileByName, proto_file_name)
     # Add the above file to the pool
@@ -635,7 +913,7 @@
 
   def testCustomDescriptorPool(self):
     # Create a new pool, and add a file descriptor.
-    pool = self._GetDescriptorPoolClass()()
+    pool = descriptor_pool.DescriptorPool()
     file_desc = descriptor_pb2.FileDescriptorProto(
         name='some/file.proto', package='package')
     file_desc.message_type.add(name='Message')
@@ -644,43 +922,55 @@
                      'some/file.proto')
     self.assertEqual(pool.FindMessageTypeByName('package.Message').name,
                      'Message')
+    # Test a file proto with no package.
+    file_proto = descriptor_pb2.FileDescriptorProto(
+        name='some/filename/container.proto')
+    message_proto = file_proto.message_type.add(
+        name='TopMessage')
+    message_proto.field.add(
+        name='bb',
+        number=1,
+        type=descriptor_pb2.FieldDescriptorProto.TYPE_INT32,
+        label=descriptor_pb2.FieldDescriptorProto.LABEL_OPTIONAL)
+    enum_proto = file_proto.enum_type.add(name='TopEnum')
+    enum_proto.value.add(name='FOREIGN_FOO', number=4)
+    file_proto.service.add(name='TopService')
+    pool = descriptor_pool.DescriptorPool()
+    pool.Add(file_proto)
+    self.assertEqual('TopMessage',
+                     pool.FindMessageTypeByName('TopMessage').name)
+    self.assertEqual('TopEnum', pool.FindEnumTypeByName('TopEnum').name)
+    self.assertEqual('TopService', pool.FindServiceByName('TopService').name)
 
+  def testFileDescriptorOptionsWithCustomDescriptorPool(self):
+    # Create a descriptor pool, and add a new FileDescriptorProto to it.
+    pool = descriptor_pool.DescriptorPool()
+    file_name = 'file_descriptor_options_with_custom_descriptor_pool.proto'
+    file_descriptor_proto = descriptor_pb2.FileDescriptorProto(name=file_name)
+    extension_id = file_options_test_pb2.foo_options
+    file_descriptor_proto.options.Extensions[extension_id].foo_name = 'foo'
+    pool.Add(file_descriptor_proto)
+    # The options set on the FileDescriptorProto should be available in the
+    # descriptor even if they contain extensions that cannot be deserialized
+    # using the pool.
+    file_descriptor = pool.FindFileByName(file_name)
+    options = file_descriptor.GetOptions()
+    self.assertEqual('foo', options.Extensions[extension_id].foo_name)
+    # The object returned by GetOptions() is cached.
+    self.assertIs(options, file_descriptor.GetOptions())
 
-@unittest.skipIf(
-    api_implementation.Type() != 'cpp',
-    'default_pool is only supported by the C++ implementation')
-class DefaultPoolTest(unittest.TestCase):
-
-  def testFindMethods(self):
-    # pylint: disable=g-import-not-at-top
-    from google.protobuf.pyext import _message
-    pool = _message.default_pool
-    self.assertIs(
-        pool.FindFileByName('google/protobuf/unittest.proto'),
-        unittest_pb2.DESCRIPTOR)
-    self.assertIs(
-        pool.FindMessageTypeByName('protobuf_unittest.TestAllTypes'),
-        unittest_pb2.TestAllTypes.DESCRIPTOR)
-    self.assertIs(
-        pool.FindFieldByName('protobuf_unittest.TestAllTypes.optional_int32'),
-        unittest_pb2.TestAllTypes.DESCRIPTOR.fields_by_name['optional_int32'])
-    self.assertIs(
-        pool.FindExtensionByName('protobuf_unittest.optional_int32_extension'),
-        unittest_pb2.DESCRIPTOR.extensions_by_name['optional_int32_extension'])
-    self.assertIs(
-        pool.FindEnumTypeByName('protobuf_unittest.ForeignEnum'),
-        unittest_pb2.ForeignEnum.DESCRIPTOR)
-    self.assertIs(
-        pool.FindOneofByName('protobuf_unittest.TestAllTypes.oneof_field'),
-        unittest_pb2.TestAllTypes.DESCRIPTOR.oneofs_by_name['oneof_field'])
-
-  def testAddFileDescriptor(self):
-    # pylint: disable=g-import-not-at-top
-    from google.protobuf.pyext import _message
-    pool = _message.default_pool
-    file_desc = descriptor_pb2.FileDescriptorProto(name='some/file.proto')
-    pool.Add(file_desc)
-    pool.AddSerializedFile(file_desc.SerializeToString())
+  def testAddTypeError(self):
+    pool = descriptor_pool.DescriptorPool()
+    with self.assertRaises(TypeError):
+      pool.AddDescriptor(0)
+    with self.assertRaises(TypeError):
+      pool.AddEnumDescriptor(0)
+    with self.assertRaises(TypeError):
+      pool.AddServiceDescriptor(0)
+    with self.assertRaises(TypeError):
+      pool.AddExtensionDescriptor(0)
+    with self.assertRaises(TypeError):
+      pool.AddFileDescriptor(0)
 
 
 TEST1_FILE = ProtoFile(
@@ -756,7 +1046,9 @@
              ExtensionField(1001, 'DescriptorPoolTest1')),
         ]),
     },
-    dependencies=['google/protobuf/internal/descriptor_pool_test1.proto'])
+    dependencies=['google/protobuf/internal/descriptor_pool_test1.proto',
+                  'google/protobuf/internal/more_messages.proto'],
+    public_dependencies=['google/protobuf/internal/more_messages.proto'])
 
 
 if __name__ == '__main__':
diff --git a/python/google/protobuf/internal/descriptor_pool_test2.proto b/python/google/protobuf/internal/descriptor_pool_test2.proto
index e3fa660..a218ecc 100644
--- a/python/google/protobuf/internal/descriptor_pool_test2.proto
+++ b/python/google/protobuf/internal/descriptor_pool_test2.proto
@@ -33,6 +33,7 @@
 package google.protobuf.python.internal;
 
 import "google/protobuf/internal/descriptor_pool_test1.proto";
+import public "google/protobuf/internal/more_messages.proto";
 
 
 message DescriptorPoolTest3 {
diff --git a/python/google/protobuf/internal/descriptor_test.py b/python/google/protobuf/internal/descriptor_test.py
index fee09a5..02a43d1 100755
--- a/python/google/protobuf/internal/descriptor_test.py
+++ b/python/google/protobuf/internal/descriptor_test.py
@@ -37,9 +37,10 @@
 import sys
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import unittest_custom_options_pb2
 from google.protobuf import unittest_import_pb2
 from google.protobuf import unittest_pb2
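The `try`/`except ImportError` at the top of this hunk (tagged `#PY26`) is the usual pattern for preferring a backport module and falling back to the stdlib. A generic sketch of the same idiom, using `simplejson`/`json` only as illustrative module names:

```python
# Prefer an optional third-party/backport module; fall back to the stdlib.
try:
    import simplejson as json  # optional variant, if installed
except ImportError:
    import json  # stdlib fallback, always available

# Either module exposes the same loads() API, so callers are unaffected.
assert json.loads('{"ok": true}') == {'ok': True}
```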
@@ -76,27 +77,24 @@
     enum_proto.value.add(name='FOREIGN_BAR', number=5)
     enum_proto.value.add(name='FOREIGN_BAZ', number=6)
 
+    file_proto.message_type.add(name='ResponseMessage')
+    service_proto = file_proto.service.add(
+        name='Service')
+    method_proto = service_proto.method.add(
+        name='CallMethod',
+        input_type='.protobuf_unittest.NestedMessage',
+        output_type='.protobuf_unittest.ResponseMessage')
+
+    # Note: Calling DescriptorPool.Add() multiple times with the same file only
+    # works if the input is canonical; in particular, all type names must be
+    # fully qualified.
     self.pool = self.GetDescriptorPool()
     self.pool.Add(file_proto)
     self.my_file = self.pool.FindFileByName(file_proto.name)
     self.my_message = self.my_file.message_types_by_name[message_proto.name]
     self.my_enum = self.my_message.enum_types_by_name[enum_proto.name]
-
-    self.my_method = descriptor.MethodDescriptor(
-        name='Bar',
-        full_name='protobuf_unittest.TestService.Bar',
-        index=0,
-        containing_service=None,
-        input_type=None,
-        output_type=None)
-    self.my_service = descriptor.ServiceDescriptor(
-        name='TestServiceWithOptions',
-        full_name='protobuf_unittest.TestServiceWithOptions',
-        file=self.my_file,
-        index=0,
-        methods=[
-            self.my_method
-        ])
+    self.my_service = self.my_file.services_by_name[service_proto.name]
+    self.my_method = self.my_service.methods_by_name[method_proto.name]
 
   def GetDescriptorPool(self):
     return symbol_database.Default().pool
@@ -109,6 +107,12 @@
         self.my_message.enum_types_by_name[
             'ForeignEnum'].values_by_number[4].name,
         self.my_message.EnumValueName('ForeignEnum', 4))
+    with self.assertRaises(KeyError):
+      self.my_message.EnumValueName('ForeignEnum', 999)
+    with self.assertRaises(KeyError):
+      self.my_message.EnumValueName('NoneEnum', 999)
+    with self.assertRaises(TypeError):
+      self.my_message.EnumValueName()
 
   def testEnumFixups(self):
     self.assertEqual(self.my_enum, self.my_enum.values[0].type)
@@ -136,15 +140,18 @@
 
   def testSimpleCustomOptions(self):
     file_descriptor = unittest_custom_options_pb2.DESCRIPTOR
-    message_descriptor =\
-        unittest_custom_options_pb2.TestMessageWithCustomOptions.DESCRIPTOR
-    field_descriptor = message_descriptor.fields_by_name["field1"]
-    enum_descriptor = message_descriptor.enum_types_by_name["AnEnum"]
-    enum_value_descriptor =\
-        message_descriptor.enum_values_by_name["ANENUM_VAL2"]
-    service_descriptor =\
-        unittest_custom_options_pb2.TestServiceWithCustomOptions.DESCRIPTOR
-    method_descriptor = service_descriptor.FindMethodByName("Foo")
+    message_descriptor = (unittest_custom_options_pb2.
+                          TestMessageWithCustomOptions.DESCRIPTOR)
+    field_descriptor = message_descriptor.fields_by_name['field1']
+    oneof_descriptor = message_descriptor.oneofs_by_name['AnOneof']
+    enum_descriptor = message_descriptor.enum_types_by_name['AnEnum']
+    enum_value_descriptor = (message_descriptor.
+                             enum_values_by_name['ANENUM_VAL2'])
+    other_enum_value_descriptor = (message_descriptor.
+                                   enum_values_by_name['ANENUM_VAL1'])
+    service_descriptor = (unittest_custom_options_pb2.
+                          TestServiceWithCustomOptions.DESCRIPTOR)
+    method_descriptor = service_descriptor.FindMethodByName('Foo')
 
     file_options = file_descriptor.GetOptions()
     file_opt1 = unittest_custom_options_pb2.file_opt1
@@ -157,6 +164,9 @@
     self.assertEqual(8765432109, field_options.Extensions[field_opt1])
     field_opt2 = unittest_custom_options_pb2.field_opt2
     self.assertEqual(42, field_options.Extensions[field_opt2])
+    oneof_options = oneof_descriptor.GetOptions()
+    oneof_opt1 = unittest_custom_options_pb2.oneof_opt1
+    self.assertEqual(-99, oneof_options.Extensions[oneof_opt1])
     enum_options = enum_descriptor.GetOptions()
     enum_opt1 = unittest_custom_options_pb2.enum_opt1
     self.assertEqual(-789, enum_options.Extensions[enum_opt1])
@@ -176,6 +186,11 @@
         unittest_custom_options_pb2.DummyMessageContainingEnum.DESCRIPTOR)
     self.assertTrue(file_descriptor.has_options)
     self.assertFalse(message_descriptor.has_options)
+    self.assertTrue(field_descriptor.has_options)
+    self.assertTrue(oneof_descriptor.has_options)
+    self.assertTrue(enum_descriptor.has_options)
+    self.assertTrue(enum_value_descriptor.has_options)
+    self.assertFalse(other_enum_value_descriptor.has_options)
 
   def testDifferentCustomOptionTypes(self):
     kint32min = -2**31
@@ -398,6 +413,12 @@
     self.assertEqual(self.my_file.name, 'some/filename/some.proto')
     self.assertEqual(self.my_file.package, 'protobuf_unittest')
     self.assertEqual(self.my_file.pool, self.pool)
+    self.assertFalse(self.my_file.has_options)
+    self.assertEqual('proto2', self.my_file.syntax)
+    file_proto = descriptor_pb2.FileDescriptorProto()
+    self.my_file.CopyToProto(file_proto)
+    self.assertEqual(self.my_file.serialized_pb,
+                     file_proto.SerializeToString())
     # Generated modules also belong to the default pool.
     self.assertEqual(unittest_pb2.DESCRIPTOR.pool, descriptor_pool.Default())
 
@@ -405,13 +426,31 @@
       api_implementation.Type() != 'cpp' or api_implementation.Version() != 2,
       'Immutability of descriptors is only enforced in v2 implementation')
   def testImmutableCppDescriptor(self):
+    file_descriptor = unittest_pb2.DESCRIPTOR
     message_descriptor = unittest_pb2.TestAllTypes.DESCRIPTOR
+    field_descriptor = message_descriptor.fields_by_name['optional_int32']
+    enum_descriptor = message_descriptor.enum_types_by_name['NestedEnum']
+    oneof_descriptor = message_descriptor.oneofs_by_name['oneof_field']
     with self.assertRaises(AttributeError):
       message_descriptor.fields_by_name = None
     with self.assertRaises(TypeError):
       message_descriptor.fields_by_name['Another'] = None
     with self.assertRaises(TypeError):
       message_descriptor.fields.append(None)
+    with self.assertRaises(AttributeError):
+      field_descriptor.containing_type = message_descriptor
+    with self.assertRaises(AttributeError):
+      file_descriptor.has_options = False
+    with self.assertRaises(AttributeError):
+      field_descriptor.has_options = False
+    with self.assertRaises(AttributeError):
+      oneof_descriptor.has_options = False
+    with self.assertRaises(AttributeError):
+      enum_descriptor.has_options = False
+    with self.assertRaises(AttributeError) as e:
+      message_descriptor.has_options = True
+    self.assertEqual('attribute is not writable: has_options',
+                     str(e.exception))
 
 
 class NewDescriptorTest(DescriptorTest):
@@ -440,6 +479,12 @@
     self.CheckDescriptorMapping(message_descriptor.fields_by_name)
     self.CheckDescriptorMapping(message_descriptor.fields_by_number)
     self.CheckDescriptorMapping(message_descriptor.fields_by_camelcase_name)
+    self.CheckDescriptorMapping(message_descriptor.enum_types_by_name)
+    self.CheckDescriptorMapping(message_descriptor.enum_values_by_name)
+    self.CheckDescriptorMapping(message_descriptor.oneofs_by_name)
+    self.CheckDescriptorMapping(message_descriptor.enum_types[0].values_by_name)
+    # Test extension range
+    self.assertEqual(message_descriptor.extension_ranges, [])
 
   def CheckFieldDescriptor(self, field_descriptor):
     # Basic properties
@@ -448,6 +493,7 @@
     self.assertEqual(field_descriptor.full_name,
                      'protobuf_unittest.TestAllTypes.optional_int32')
     self.assertEqual(field_descriptor.containing_type.name, 'TestAllTypes')
+    self.assertEqual(field_descriptor.file, unittest_pb2.DESCRIPTOR)
     # Test equality and hashability
     self.assertEqual(field_descriptor, field_descriptor)
     self.assertEqual(
@@ -459,32 +505,73 @@
         field_descriptor)
     self.assertIn(field_descriptor, [field_descriptor])
     self.assertIn(field_descriptor, {field_descriptor: None})
+    self.assertEqual(None, field_descriptor.extension_scope)
+    self.assertEqual(None, field_descriptor.enum_type)
+    if api_implementation.Type() == 'cpp':
+      # For test coverage only
+      self.assertEqual(field_descriptor.id, field_descriptor.id)
 
   def CheckDescriptorSequence(self, sequence):
     # Verifies that a property like 'messageDescriptor.fields' has all the
     # properties of an immutable abc.Sequence.
+    self.assertNotEqual(sequence,
+                        unittest_pb2.TestAllExtensions.DESCRIPTOR.fields)
+    self.assertNotEqual(sequence, [])
+    self.assertNotEqual(sequence, 1)
+    self.assertFalse(sequence == 1)  # Only for cpp test coverage
+    self.assertEqual(sequence, sequence)
+    expected_list = list(sequence)
+    self.assertEqual(expected_list, sequence)
     self.assertGreater(len(sequence), 0)  # Sized
-    self.assertEqual(len(sequence), len(list(sequence)))  # Iterable
+    self.assertEqual(len(sequence), len(expected_list))  # Iterable
+    self.assertEqual(sequence[len(sequence) - 1], sequence[-1])
     item = sequence[0]
     self.assertEqual(item, sequence[0])
     self.assertIn(item, sequence)  # Container
     self.assertEqual(sequence.index(item), 0)
     self.assertEqual(sequence.count(item), 1)
+    other_item = unittest_pb2.NestedTestAllTypes.DESCRIPTOR.fields[0]
+    self.assertNotIn(other_item, sequence)
+    self.assertEqual(sequence.count(other_item), 0)
+    self.assertRaises(ValueError, sequence.index, other_item)
+    self.assertRaises(ValueError, sequence.index, [])
     reversed_iterator = reversed(sequence)
     self.assertEqual(list(reversed_iterator), list(sequence)[::-1])
     self.assertRaises(StopIteration, next, reversed_iterator)
+    expected_list[0] = 'change value'
+    self.assertNotEqual(expected_list, sequence)
+    # TODO(jieluo): Change __repr__ support for DescriptorSequence.
+    if api_implementation.Type() == 'python':
+      self.assertEqual(str(list(sequence)), str(sequence))
+    else:
+      self.assertEqual(str(sequence)[0], '<')
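`CheckDescriptorSequence` is exercising the contract of an immutable `collections.abc.Sequence`: once `__getitem__` and `__len__` are defined, the ABC supplies `index`, `count`, containment, and `reversed` as mixin methods. A minimal sketch with an illustrative `FrozenSeq` class:

```python
from collections.abc import Sequence

class FrozenSeq(Sequence):
    """Read-only sequence: only __getitem__ and __len__ are defined."""
    def __init__(self, items):
        self._items = tuple(items)
    def __getitem__(self, index):
        return self._items[index]
    def __len__(self):
        return len(self._items)

seq = FrozenSeq(['a', 'b', 'c'])
assert seq.index('b') == 1                     # mixin from Sequence
assert seq.count('c') == 1                     # mixin from Sequence
assert 'a' in seq                              # __contains__ mixin
assert list(reversed(seq)) == ['c', 'b', 'a']  # __reversed__ mixin
```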
 
   def CheckDescriptorMapping(self, mapping):
     # Verifies that a property like 'messageDescriptor.fields' has all the
     # properties of an immutable abc.Mapping.
+    self.assertNotEqual(
+        mapping, unittest_pb2.TestAllExtensions.DESCRIPTOR.fields_by_name)
+    self.assertNotEqual(mapping, {})
+    self.assertNotEqual(mapping, 1)
+    self.assertFalse(mapping == 1)  # Only for cpp test coverage
+    expected_dict = dict(mapping.items())
+    self.assertEqual(mapping, expected_dict)
+    self.assertEqual(mapping, mapping)
     self.assertGreater(len(mapping), 0)  # Sized
-    self.assertEqual(len(mapping), len(list(mapping)))  # Iterable
+    self.assertEqual(len(mapping), len(expected_dict))  # Iterable
     if sys.version_info >= (3,):
       key, item = next(iter(mapping.items()))
     else:
       key, item = mapping.items()[0]
     self.assertIn(key, mapping)  # Container
     self.assertEqual(mapping.get(key), item)
+    with self.assertRaises(TypeError):
+      mapping.get()
+    # TODO(jieluo): Fix python and cpp extension diff.
+    if api_implementation.Type() == 'python':
+      self.assertRaises(TypeError, mapping.get, [])
+    else:
+      self.assertEqual(None, mapping.get([]))
     # keys(), iterkeys() &co
     item = (next(iter(mapping.keys())), next(iter(mapping.values())))
     self.assertEqual(item, next(iter(mapping.items())))
@@ -495,6 +582,18 @@
       CheckItems(mapping.keys(), mapping.iterkeys())
       CheckItems(mapping.values(), mapping.itervalues())
       CheckItems(mapping.items(), mapping.iteritems())
+    expected_dict[key] = 'change value'
+    self.assertNotEqual(mapping, expected_dict)
+    del expected_dict[key]
+    expected_dict['new_key'] = 'new'
+    self.assertNotEqual(mapping, expected_dict)
+    self.assertRaises(KeyError, mapping.__getitem__, 'key_error')
+    self.assertRaises(KeyError, mapping.__getitem__, len(mapping) + 1)
+    # TODO(jieluo): Add __repr__ support for DescriptorMapping.
+    if api_implementation.Type() == 'python':
+      self.assertEqual(len(str(dict(mapping.items()))), len(str(mapping)))
+    else:
+      self.assertEqual(str(mapping)[0], '<')
 
   def testDescriptor(self):
     message_descriptor = unittest_pb2.TestAllTypes.DESCRIPTOR
@@ -504,13 +603,26 @@
     field_descriptor = message_descriptor.fields_by_camelcase_name[
         'optionalInt32']
     self.CheckFieldDescriptor(field_descriptor)
+    enum_descriptor = unittest_pb2.DESCRIPTOR.enum_types_by_name[
+        'ForeignEnum']
+    self.assertEqual(None, enum_descriptor.containing_type)
+    # Test extension range
+    self.assertEqual(
+        unittest_pb2.TestAllExtensions.DESCRIPTOR.extension_ranges,
+        [(1, 536870912)])
+    self.assertEqual(
+        unittest_pb2.TestMultipleExtensionRanges.DESCRIPTOR.extension_ranges,
+        [(42, 43), (4143, 4244), (65536, 536870912)])
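The upper bound 536870912 in the expected ranges above is 2**29, one past the maximum allowed field number, and each range behaves as a half-open [start, end) interval. A sketch of that interpretation (the helper below is illustrative, not a protobuf API):

```python
MAX_EXTENSION_END = 1 << 29  # 536870912, the exclusive upper bound


def in_extension_range(number, ranges):
  # Each range is a half-open (start, end) pair, matching the tuples
  # asserted in the test above.
  return any(start <= number < end for start, end in ranges)


assert in_extension_range(42, [(42, 43), (4143, 4244)])
assert not in_extension_range(43, [(42, 43), (4143, 4244)])
assert in_extension_range(65536, [(65536, MAX_EXTENSION_END)])
```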
 
   def testCppDescriptorContainer(self):
-    # Check that the collection is still valid even if the parent disappeared.
-    enum = unittest_pb2.TestAllTypes.DESCRIPTOR.enum_types_by_name['NestedEnum']
-    values = enum.values
-    del enum
-    self.assertEqual('FOO', values[0].name)
+    containing_file = unittest_pb2.DESCRIPTOR
+    self.CheckDescriptorSequence(containing_file.dependencies)
+    self.CheckDescriptorMapping(containing_file.message_types_by_name)
+    self.CheckDescriptorMapping(containing_file.enum_types_by_name)
+    self.CheckDescriptorMapping(containing_file.services_by_name)
+    self.CheckDescriptorMapping(containing_file.extensions_by_name)
+    self.CheckDescriptorMapping(
+        unittest_pb2.TestNestedExtension.DESCRIPTOR.extensions_by_name)
 
   def testCppDescriptorContainer_Iterator(self):
     # Same test with the iterator
@@ -519,6 +631,24 @@
     del enum
     self.assertEqual('FOO', next(values_iter).name)
 
+  def testServiceDescriptor(self):
+    service_descriptor = unittest_pb2.DESCRIPTOR.services_by_name['TestService']
+    self.assertEqual(service_descriptor.name, 'TestService')
+    self.assertEqual(service_descriptor.methods[0].name, 'Foo')
+    self.assertIs(service_descriptor.file, unittest_pb2.DESCRIPTOR)
+    self.assertEqual(service_descriptor.index, 0)
+    self.CheckDescriptorMapping(service_descriptor.methods_by_name)
+
+  def testOneofDescriptor(self):
+    message_descriptor = unittest_pb2.TestAllTypes.DESCRIPTOR
+    oneof_descriptor = message_descriptor.oneofs_by_name['oneof_field']
+    self.assertFalse(oneof_descriptor.has_options)
+    self.assertEqual(message_descriptor, oneof_descriptor.containing_type)
+    self.assertEqual('oneof_field', oneof_descriptor.name)
+    self.assertEqual('protobuf_unittest.TestAllTypes.oneof_field',
+                     oneof_descriptor.full_name)
+    self.assertEqual(0, oneof_descriptor.index)
+
 
 class DescriptorCopyToProtoTest(unittest.TestCase):
   """Tests for CopyTo functions of Descriptor."""
@@ -596,7 +726,7 @@
       """
 
     self._InternalTestCopyToProto(
-        unittest_pb2._FOREIGNENUM,
+        unittest_pb2.ForeignEnum.DESCRIPTOR,
         descriptor_pb2.EnumDescriptorProto,
         TEST_FOREIGN_ENUM_ASCII)
 
@@ -612,6 +742,19 @@
           deprecated: true
         >
       >
+      field {
+        name: "deprecated_int32_in_oneof"
+        number: 2
+        label: LABEL_OPTIONAL
+        type: TYPE_INT32
+        options {
+          deprecated: true
+        }
+        oneof_index: 0
+      }
+      oneof_decl {
+        name: "oneof_fields"
+      }
       """
 
     self._InternalTestCopyToProto(
@@ -655,49 +798,64 @@
         descriptor_pb2.DescriptorProto,
         TEST_MESSAGE_WITH_SEVERAL_EXTENSIONS_ASCII)
 
-  # Disable this test so we can make changes to the proto file.
-  # TODO(xiaofeng): Enable this test after cl/55530659 is submitted.
-  #
-  # def testCopyToProto_FileDescriptor(self):
-  #   UNITTEST_IMPORT_FILE_DESCRIPTOR_ASCII = ("""
-  #     name: 'google/protobuf/unittest_import.proto'
-  #     package: 'protobuf_unittest_import'
-  #     dependency: 'google/protobuf/unittest_import_public.proto'
-  #     message_type: <
-  #       name: 'ImportMessage'
-  #       field: <
-  #         name: 'd'
-  #         number: 1
-  #         label: 1  # Optional
-  #         type: 5  # TYPE_INT32
-  #       >
-  #     >
-  #     """ +
-  #     """enum_type: <
-  #       name: 'ImportEnum'
-  #       value: <
-  #         name: 'IMPORT_FOO'
-  #         number: 7
-  #       >
-  #       value: <
-  #         name: 'IMPORT_BAR'
-  #         number: 8
-  #       >
-  #       value: <
-  #         name: 'IMPORT_BAZ'
-  #         number: 9
-  #       >
-  #     >
-  #     options: <
-  #       java_package: 'com.google.protobuf.test'
-  #       optimize_for: 1  # SPEED
-  #     >
-  #     public_dependency: 0
-  #  """)
-  #  self._InternalTestCopyToProto(
-  #      unittest_import_pb2.DESCRIPTOR,
-  #      descriptor_pb2.FileDescriptorProto,
-  #      UNITTEST_IMPORT_FILE_DESCRIPTOR_ASCII)
+  def testCopyToProto_FileDescriptor(self):
+    UNITTEST_IMPORT_FILE_DESCRIPTOR_ASCII = ("""
+      name: 'google/protobuf/unittest_import.proto'
+      package: 'protobuf_unittest_import'
+      dependency: 'google/protobuf/unittest_import_public.proto'
+      message_type: <
+        name: 'ImportMessage'
+        field: <
+          name: 'd'
+          number: 1
+          label: 1  # Optional
+          type: 5  # TYPE_INT32
+        >
+      >
+      """ +
+      """enum_type: <
+        name: 'ImportEnum'
+        value: <
+          name: 'IMPORT_FOO'
+          number: 7
+        >
+        value: <
+          name: 'IMPORT_BAR'
+          number: 8
+        >
+        value: <
+          name: 'IMPORT_BAZ'
+          number: 9
+        >
+      >
+      enum_type: <
+        name: 'ImportEnumForMap'
+        value: <
+          name: 'UNKNOWN'
+          number: 0
+        >
+        value: <
+          name: 'FOO'
+          number: 1
+        >
+        value: <
+          name: 'BAR'
+          number: 2
+        >
+      >
+      options: <
+        java_package: 'com.google.protobuf.test'
+        optimize_for: 1  # SPEED
+      """ +
+      """
+        cc_enable_arenas: true
+      >
+      public_dependency: 0
+    """)
+    self._InternalTestCopyToProto(
+        unittest_import_pb2.DESCRIPTOR,
+        descriptor_pb2.FileDescriptorProto,
+        UNITTEST_IMPORT_FILE_DESCRIPTOR_ASCII)
 
   def testCopyToProto_ServiceDescriptor(self):
     TEST_SERVICE_ASCII = """
@@ -713,12 +871,47 @@
         output_type: '.protobuf_unittest.BarResponse'
       >
       """
-    # TODO(rocking): enable this test after the proto descriptor change is
-    # checked in.
-    #self._InternalTestCopyToProto(
-    #    unittest_pb2.TestService.DESCRIPTOR,
-    #    descriptor_pb2.ServiceDescriptorProto,
-    #    TEST_SERVICE_ASCII)
+    self._InternalTestCopyToProto(
+        unittest_pb2.TestService.DESCRIPTOR,
+        descriptor_pb2.ServiceDescriptorProto,
+        TEST_SERVICE_ASCII)
+
+  @unittest.skipIf(
+      api_implementation.Type() == 'python',
+      'It is not implemented in python.')
+  # TODO(jieluo): Add support for pure python or remove in c extension.
+  def testCopyToProto_MethodDescriptor(self):
+    expected_ascii = """
+      name: 'Foo'
+      input_type: '.protobuf_unittest.FooRequest'
+      output_type: '.protobuf_unittest.FooResponse'
+    """
+    method_descriptor = unittest_pb2.TestService.DESCRIPTOR.FindMethodByName(
+        'Foo')
+    self._InternalTestCopyToProto(
+        method_descriptor,
+        descriptor_pb2.MethodDescriptorProto,
+        expected_ascii)
+
+  @unittest.skipIf(
+      api_implementation.Type() == 'python',
+      'Pure python does not raise error.')
+  # TODO(jieluo): Fix pure python to check with the proto type.
+  def testCopyToProto_TypeError(self):
+    file_proto = descriptor_pb2.FileDescriptorProto()
+    self.assertRaises(TypeError,
+                      unittest_pb2.TestEmptyMessage.DESCRIPTOR.CopyToProto,
+                      file_proto)
+    self.assertRaises(TypeError,
+                      unittest_pb2.ForeignEnum.DESCRIPTOR.CopyToProto,
+                      file_proto)
+    self.assertRaises(TypeError,
+                      unittest_pb2.TestService.DESCRIPTOR.CopyToProto,
+                      file_proto)
+    proto = descriptor_pb2.DescriptorProto()
+    self.assertRaises(TypeError,
+                      unittest_import_pb2.DESCRIPTOR.CopyToProto,
+                      proto)
 
 
 class MakeDescriptorTest(unittest.TestCase):
@@ -764,6 +957,11 @@
                      'Foo2.Sub.bar_field')
     self.assertEqual(result.nested_types[0].fields[0].enum_type,
                      result.nested_types[0].enum_types[0])
+    self.assertFalse(result.has_options)
+    self.assertFalse(result.fields[0].has_options)
+    if api_implementation.Type() == 'cpp':
+      with self.assertRaises(AttributeError):
+        result.fields[0].has_options = False
 
   def testMakeDescriptorWithUnsignedIntField(self):
     file_descriptor_proto = descriptor_pb2.FileDescriptorProto()
@@ -816,6 +1014,23 @@
       self.assertEqual(result.fields[index].camelcase_name,
                        camelcase_names[index])
 
+  def testJsonName(self):
+    descriptor_proto = descriptor_pb2.DescriptorProto()
+    descriptor_proto.name = 'TestJsonName'
+    names = ['field_name', 'fieldName', 'FieldName',
+             '_field_name', 'FIELD_NAME', 'json_name']
+    json_names = ['fieldName', 'fieldName', 'FieldName',
+                  'FieldName', 'FIELDNAME', '@type']
+    for index in range(len(names)):
+      field = descriptor_proto.field.add()
+      field.number = index + 1
+      field.name = names[index]
+    field.json_name = '@type'
+    result = descriptor.MakeDescriptor(descriptor_proto)
+    for index in range(len(json_names)):
+      self.assertEqual(result.fields[index].json_name,
+                       json_names[index])
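The expected json_names above follow the default proto3 JSON naming rule: drop underscores and capitalize the letter that follows each one (an explicit `json_name`, like `'@type'`, overrides the default). A standalone sketch of that conversion:

```python
def to_json_name(field_name):
  # Default proto3 JSON name: remove '_' and upper-case the next letter.
  out = []
  capitalize_next = False
  for ch in field_name:
    if ch == '_':
      capitalize_next = True
    elif capitalize_next:
      out.append(ch.upper())
      capitalize_next = False
    else:
      out.append(ch)
  return ''.join(out)


assert to_json_name('field_name') == 'fieldName'
assert to_json_name('_field_name') == 'FieldName'
assert to_json_name('FIELD_NAME') == 'FIELDNAME'
```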
+
 
 if __name__ == '__main__':
   unittest.main()
diff --git a/python/google/protobuf/internal/encoder.py b/python/google/protobuf/internal/encoder.py
index 48ef2df..0d1f49d 100755
--- a/python/google/protobuf/internal/encoder.py
+++ b/python/google/protobuf/internal/encoder.py
@@ -340,7 +340,7 @@
 # Map is special: it needs custom logic to compute its size properly.
 
 
-def MapSizer(field_descriptor):
+def MapSizer(field_descriptor, is_message_map):
   """Returns a sizer for a map field."""
 
   # Can't look at field_descriptor.message_type._concrete_class because it may
@@ -355,9 +355,12 @@
       # It's wasteful to create the messages and throw them away one second
       # later since we'll do the same for the actual encode.  But there's not an
       # obvious way to avoid this within the current design without tons of code
-      # duplication.
+      # duplication. For a message map, value.ByteSize() must be called to
+      # refresh the value's cached byte size before serialization.
       entry_msg = message_type._concrete_class(key=key, value=value)
       total += message_sizer(entry_msg)
+      if is_message_map:
+        value.ByteSize()
     return total
 
   return FieldSize
@@ -369,7 +372,7 @@
 def _VarintEncoder():
   """Return an encoder for a basic varint value (does not include tag)."""
 
-  def EncodeVarint(write, value):
+  def EncodeVarint(write, value, unused_deterministic=None):
     bits = value & 0x7f
     value >>= 7
     while value:
@@ -385,7 +388,7 @@
   """Return an encoder for a basic signed varint value (does not include
   tag)."""
 
-  def EncodeSignedVarint(write, value):
+  def EncodeSignedVarint(write, value, unused_deterministic=None):
     if value < 0:
       value += (1 << 64)
     bits = value & 0x7f
@@ -408,14 +411,15 @@
   called at startup time so it doesn't need to be fast."""
 
   pieces = []
-  _EncodeVarint(pieces.append, value)
+  _EncodeVarint(pieces.append, value, True)
   return b"".join(pieces)
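The encoders above emit base-128 varints: seven payload bits per byte, with the high bit set on every byte except the last. Negative values are first mapped to their 64-bit two's-complement representation (the `value += (1 << 64)` in `EncodeSignedVarint`). A self-contained sketch:

```python
def encode_varint(value):
  # Base-128 varint: low 7 bits per byte; the MSB marks "more bytes follow".
  if value < 0:
    value += 1 << 64  # 64-bit two's complement, as in EncodeSignedVarint
  out = bytearray()
  bits = value & 0x7F
  value >>= 7
  while value:
    out.append(0x80 | bits)
    bits = value & 0x7F
    value >>= 7
  out.append(bits)
  return bytes(out)


assert encode_varint(1) == b'\x01'
assert encode_varint(300) == b'\xac\x02'
assert encode_varint(-1) == b'\xff' * 9 + b'\x01'  # always 10 bytes
```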
 
 
 def TagBytes(field_number, wire_type):
   """Encode the given tag and return the bytes.  Only called at startup."""
 
-  return _VarintBytes(wire_format.PackTag(field_number, wire_type))
+  return six.binary_type(
+      _VarintBytes(wire_format.PackTag(field_number, wire_type)))
 
 # --------------------------------------------------------------------
 # As with sizers (see above), we have a number of common encoder
@@ -437,27 +441,27 @@
     if is_packed:
       tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
       local_EncodeVarint = _EncodeVarint
-      def EncodePackedField(write, value):
+      def EncodePackedField(write, value, deterministic):
         write(tag_bytes)
         size = 0
         for element in value:
           size += compute_value_size(element)
-        local_EncodeVarint(write, size)
+        local_EncodeVarint(write, size, deterministic)
         for element in value:
-          encode_value(write, element)
+          encode_value(write, element, deterministic)
       return EncodePackedField
     elif is_repeated:
       tag_bytes = TagBytes(field_number, wire_type)
-      def EncodeRepeatedField(write, value):
+      def EncodeRepeatedField(write, value, deterministic):
         for element in value:
           write(tag_bytes)
-          encode_value(write, element)
+          encode_value(write, element, deterministic)
       return EncodeRepeatedField
     else:
       tag_bytes = TagBytes(field_number, wire_type)
-      def EncodeField(write, value):
+      def EncodeField(write, value, deterministic):
         write(tag_bytes)
-        return encode_value(write, value)
+        return encode_value(write, value, deterministic)
       return EncodeField
 
   return SpecificEncoder
@@ -471,27 +475,27 @@
     if is_packed:
       tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
       local_EncodeVarint = _EncodeVarint
-      def EncodePackedField(write, value):
+      def EncodePackedField(write, value, deterministic):
         write(tag_bytes)
         size = 0
         for element in value:
           size += compute_value_size(modify_value(element))
-        local_EncodeVarint(write, size)
+        local_EncodeVarint(write, size, deterministic)
         for element in value:
-          encode_value(write, modify_value(element))
+          encode_value(write, modify_value(element), deterministic)
       return EncodePackedField
     elif is_repeated:
       tag_bytes = TagBytes(field_number, wire_type)
-      def EncodeRepeatedField(write, value):
+      def EncodeRepeatedField(write, value, deterministic):
         for element in value:
           write(tag_bytes)
-          encode_value(write, modify_value(element))
+          encode_value(write, modify_value(element), deterministic)
       return EncodeRepeatedField
     else:
       tag_bytes = TagBytes(field_number, wire_type)
-      def EncodeField(write, value):
+      def EncodeField(write, value, deterministic):
         write(tag_bytes)
-        return encode_value(write, modify_value(value))
+        return encode_value(write, modify_value(value), deterministic)
       return EncodeField
 
   return SpecificEncoder
@@ -512,22 +516,22 @@
     if is_packed:
       tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
       local_EncodeVarint = _EncodeVarint
-      def EncodePackedField(write, value):
+      def EncodePackedField(write, value, deterministic):
         write(tag_bytes)
-        local_EncodeVarint(write, len(value) * value_size)
+        local_EncodeVarint(write, len(value) * value_size, deterministic)
         for element in value:
           write(local_struct_pack(format, element))
       return EncodePackedField
     elif is_repeated:
       tag_bytes = TagBytes(field_number, wire_type)
-      def EncodeRepeatedField(write, value):
+      def EncodeRepeatedField(write, value, unused_deterministic=None):
         for element in value:
           write(tag_bytes)
           write(local_struct_pack(format, element))
       return EncodeRepeatedField
     else:
       tag_bytes = TagBytes(field_number, wire_type)
-      def EncodeField(write, value):
+      def EncodeField(write, value, unused_deterministic=None):
         write(tag_bytes)
         return write(local_struct_pack(format, value))
       return EncodeField
@@ -578,9 +582,9 @@
     if is_packed:
       tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
       local_EncodeVarint = _EncodeVarint
-      def EncodePackedField(write, value):
+      def EncodePackedField(write, value, deterministic):
         write(tag_bytes)
-        local_EncodeVarint(write, len(value) * value_size)
+        local_EncodeVarint(write, len(value) * value_size, deterministic)
         for element in value:
           # This try/except block is going to be faster than any code that
           # we could write to check whether element is finite.
@@ -591,7 +595,7 @@
       return EncodePackedField
     elif is_repeated:
       tag_bytes = TagBytes(field_number, wire_type)
-      def EncodeRepeatedField(write, value):
+      def EncodeRepeatedField(write, value, unused_deterministic=None):
         for element in value:
           write(tag_bytes)
           try:
@@ -601,7 +605,7 @@
       return EncodeRepeatedField
     else:
       tag_bytes = TagBytes(field_number, wire_type)
-      def EncodeField(write, value):
+      def EncodeField(write, value, unused_deterministic=None):
         write(tag_bytes)
         try:
           write(local_struct_pack(format, value))
@@ -647,9 +651,9 @@
   if is_packed:
     tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
     local_EncodeVarint = _EncodeVarint
-    def EncodePackedField(write, value):
+    def EncodePackedField(write, value, deterministic):
       write(tag_bytes)
-      local_EncodeVarint(write, len(value))
+      local_EncodeVarint(write, len(value), deterministic)
       for element in value:
         if element:
           write(true_byte)
@@ -658,7 +662,7 @@
     return EncodePackedField
   elif is_repeated:
     tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_VARINT)
-    def EncodeRepeatedField(write, value):
+    def EncodeRepeatedField(write, value, unused_deterministic=None):
       for element in value:
         write(tag_bytes)
         if element:
@@ -668,7 +672,7 @@
     return EncodeRepeatedField
   else:
     tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_VARINT)
-    def EncodeField(write, value):
+    def EncodeField(write, value, unused_deterministic=None):
       write(tag_bytes)
       if value:
         return write(true_byte)
@@ -684,18 +688,18 @@
   local_len = len
   assert not is_packed
   if is_repeated:
-    def EncodeRepeatedField(write, value):
+    def EncodeRepeatedField(write, value, deterministic):
       for element in value:
         encoded = element.encode('utf-8')
         write(tag)
-        local_EncodeVarint(write, local_len(encoded))
+        local_EncodeVarint(write, local_len(encoded), deterministic)
         write(encoded)
     return EncodeRepeatedField
   else:
-    def EncodeField(write, value):
+    def EncodeField(write, value, deterministic):
       encoded = value.encode('utf-8')
       write(tag)
-      local_EncodeVarint(write, local_len(encoded))
+      local_EncodeVarint(write, local_len(encoded), deterministic)
       return write(encoded)
     return EncodeField
 
@@ -708,16 +712,16 @@
   local_len = len
   assert not is_packed
   if is_repeated:
-    def EncodeRepeatedField(write, value):
+    def EncodeRepeatedField(write, value, deterministic):
       for element in value:
         write(tag)
-        local_EncodeVarint(write, local_len(element))
+        local_EncodeVarint(write, local_len(element), deterministic)
         write(element)
     return EncodeRepeatedField
   else:
-    def EncodeField(write, value):
+    def EncodeField(write, value, deterministic):
       write(tag)
-      local_EncodeVarint(write, local_len(value))
+      local_EncodeVarint(write, local_len(value), deterministic)
       return write(value)
     return EncodeField
 
@@ -729,16 +733,16 @@
   end_tag = TagBytes(field_number, wire_format.WIRETYPE_END_GROUP)
   assert not is_packed
   if is_repeated:
-    def EncodeRepeatedField(write, value):
+    def EncodeRepeatedField(write, value, deterministic):
       for element in value:
         write(start_tag)
-        element._InternalSerialize(write)
+        element._InternalSerialize(write, deterministic)
         write(end_tag)
     return EncodeRepeatedField
   else:
-    def EncodeField(write, value):
+    def EncodeField(write, value, deterministic):
       write(start_tag)
-      value._InternalSerialize(write)
+      value._InternalSerialize(write, deterministic)
       return write(end_tag)
     return EncodeField
 
@@ -750,17 +754,17 @@
   local_EncodeVarint = _EncodeVarint
   assert not is_packed
   if is_repeated:
-    def EncodeRepeatedField(write, value):
+    def EncodeRepeatedField(write, value, deterministic):
       for element in value:
         write(tag)
-        local_EncodeVarint(write, element.ByteSize())
-        element._InternalSerialize(write)
+        local_EncodeVarint(write, element.ByteSize(), deterministic)
+        element._InternalSerialize(write, deterministic)
     return EncodeRepeatedField
   else:
-    def EncodeField(write, value):
+    def EncodeField(write, value, deterministic):
       write(tag)
-      local_EncodeVarint(write, value.ByteSize())
-      return value._InternalSerialize(write)
+      local_EncodeVarint(write, value.ByteSize(), deterministic)
+      return value._InternalSerialize(write, deterministic)
     return EncodeField
 
 
@@ -787,10 +791,10 @@
   end_bytes = TagBytes(1, wire_format.WIRETYPE_END_GROUP)
   local_EncodeVarint = _EncodeVarint
 
-  def EncodeField(write, value):
+  def EncodeField(write, value, deterministic):
     write(start_bytes)
-    local_EncodeVarint(write, value.ByteSize())
-    value._InternalSerialize(write)
+    local_EncodeVarint(write, value.ByteSize(), deterministic)
+    value._InternalSerialize(write, deterministic)
     return write(end_bytes)
 
   return EncodeField
@@ -815,9 +819,10 @@
   message_type = field_descriptor.message_type
   encode_message = MessageEncoder(field_descriptor.number, False, False)
 
-  def EncodeField(write, value):
-    for key in value:
+  def EncodeField(write, value, deterministic):
+    value_keys = sorted(value.keys()) if deterministic else value
+    for key in value_keys:
       entry_msg = message_type._concrete_class(key=key, value=value[key])
-      encode_message(write, entry_msg)
+      encode_message(write, entry_msg, deterministic)
 
   return EncodeField
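The `sorted(value.keys()) if deterministic else value` line above is the core of deterministic serialization for maps: protobuf map iteration order is unspecified, so sorting the keys makes the byte output reproducible. The effect, sketched on a plain dict:

```python
def map_entry_order(mapping, deterministic):
  # Mirrors EncodeField above: iterate keys in sorted order when
  # deterministic output is requested, otherwise in whatever order
  # iteration happens to yield.
  keys = sorted(mapping) if deterministic else list(mapping)
  return [(key, mapping[key]) for key in keys]


assert map_entry_order({'b': 2, 'a': 1}, deterministic=True) == [
    ('a', 1), ('b', 2)]
```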
diff --git a/python/google/protobuf/internal/factory_test2.proto b/python/google/protobuf/internal/factory_test2.proto
index bb1b54a..5fcbc5a 100644
--- a/python/google/protobuf/internal/factory_test2.proto
+++ b/python/google/protobuf/internal/factory_test2.proto
@@ -97,3 +97,8 @@
 extend Factory1Message {
   optional string another_field = 1002;
 }
+
+message MessageWithOption {
+  option no_standard_descriptor_accessor = true;
+  optional int32 field1 = 1;
+}
diff --git a/python/google/protobuf/internal/file_options_test.proto b/python/google/protobuf/internal/file_options_test.proto
new file mode 100644
index 0000000..4eceeb0
--- /dev/null
+++ b/python/google/protobuf/internal/file_options_test.proto
@@ -0,0 +1,43 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// https://developers.google.com/protocol-buffers/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+syntax = "proto2";
+
+import "google/protobuf/descriptor.proto";
+
+package google.protobuf.python.internal;
+
+message FooOptions {
+  optional string foo_name = 1;
+}
+
+extend .google.protobuf.FileOptions {
+  optional FooOptions foo_options = 120436268;
+}
diff --git a/python/google/protobuf/internal/generator_test.py b/python/google/protobuf/internal/generator_test.py
index 9956da5..7f13f9d 100755
--- a/python/google/protobuf/internal/generator_test.py
+++ b/python/google/protobuf/internal/generator_test.py
@@ -42,9 +42,10 @@
 __author__ = 'robinson@google.com (Will Robinson)'
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf.internal import test_bad_identifiers_pb2
 from google.protobuf import unittest_custom_options_pb2
 from google.protobuf import unittest_import_pb2
@@ -226,7 +227,9 @@
                      [unittest_import_pb2.DESCRIPTOR])
     self.assertEqual(unittest_import_pb2.DESCRIPTOR.dependencies,
                      [unittest_import_public_pb2.DESCRIPTOR])
+    self.assertEqual(unittest_import_pb2.DESCRIPTOR.public_dependencies,
+                     [unittest_import_public_pb2.DESCRIPTOR])
 
   def testNoGenericServices(self):
     self.assertTrue(hasattr(unittest_no_generic_services_pb2, "TestMessage"))
     self.assertTrue(hasattr(unittest_no_generic_services_pb2, "FOO"))
diff --git a/python/google/protobuf/internal/json_format_test.py b/python/google/protobuf/internal/json_format_test.py
index 49e96a4..d891dce 100644
--- a/python/google/protobuf/internal/json_format_test.py
+++ b/python/google/protobuf/internal/json_format_test.py
@@ -39,15 +39,18 @@
 import sys
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import any_pb2
 from google.protobuf import duration_pb2
 from google.protobuf import field_mask_pb2
 from google.protobuf import struct_pb2
 from google.protobuf import timestamp_pb2
 from google.protobuf import wrappers_pb2
+from google.protobuf import unittest_mset_pb2
+from google.protobuf import unittest_pb2
 from google.protobuf.internal import well_known_types
 from google.protobuf import json_format
 from google.protobuf.util import json_format_proto3_pb2
@@ -157,6 +160,98 @@
     json_format.Parse(text, parsed_message)
     self.assertEqual(message, parsed_message)
 
+  def testUnknownEnumToJsonAndBack(self):
+    text = '{\n  "enumValue": 999\n}'
+    message = json_format_proto3_pb2.TestMessage()
+    message.enum_value = 999
+    self.assertEqual(json_format.MessageToJson(message),
+                     text)
+    parsed_message = json_format_proto3_pb2.TestMessage()
+    json_format.Parse(text, parsed_message)
+    self.assertEqual(message, parsed_message)
+
+  def testExtensionToJsonAndBack(self):
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    message.message_set.Extensions[ext1].i = 23
+    message.message_set.Extensions[ext2].str = 'foo'
+    message_text = json_format.MessageToJson(
+        message
+    )
+    parsed_message = unittest_mset_pb2.TestMessageSetContainer()
+    json_format.Parse(message_text, parsed_message)
+    self.assertEqual(message, parsed_message)
+
+  def testExtensionErrors(self):
+    self.CheckError('{"[extensionField]": {}}',
+                    'Message type proto3.TestMessage does not have extensions')
+
+  def testExtensionToDictAndBack(self):
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    message.message_set.Extensions[ext1].i = 23
+    message.message_set.Extensions[ext2].str = 'foo'
+    message_dict = json_format.MessageToDict(
+        message
+    )
+    parsed_message = unittest_mset_pb2.TestMessageSetContainer()
+    json_format.ParseDict(message_dict, parsed_message)
+    self.assertEqual(message, parsed_message)
+
+  def testExtensionSerializationDictMatchesProto3Spec(self):
+    """See go/proto3-json-spec for spec.
+    """
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    message.message_set.Extensions[ext1].i = 23
+    message.message_set.Extensions[ext2].str = 'foo'
+    message_dict = json_format.MessageToDict(
+        message
+    )
+    golden_dict = {
+        'messageSet': {
+            '[protobuf_unittest.'
+            'TestMessageSetExtension1.messageSetExtension]': {
+                'i': 23,
+            },
+            '[protobuf_unittest.'
+            'TestMessageSetExtension2.messageSetExtension]': {
+                'str': u'foo',
+            },
+        },
+    }
+    self.assertEqual(golden_dict, message_dict)
+
+  def testExtensionSerializationJsonMatchesProto3Spec(self):
+    """See go/proto3-json-spec for spec.
+    """
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    message.message_set.Extensions[ext1].i = 23
+    message.message_set.Extensions[ext2].str = 'foo'
+    message_text = json_format.MessageToJson(
+        message
+    )
+    ext1_text = ('protobuf_unittest.TestMessageSetExtension1.'
+                 'messageSetExtension')
+    ext2_text = ('protobuf_unittest.TestMessageSetExtension2.'
+                 'messageSetExtension')
+    golden_text = ('{"messageSet": {'
+                   '    "[%s]": {'
+                   '        "i": 23'
+                   '    },'
+                   '    "[%s]": {'
+                   '        "str": "foo"'
+                   '    }'
+                   '}}') % (ext1_text, ext2_text)
+    self.assertEqual(json.loads(golden_text), json.loads(message_text))
+
+
   def testJsonEscapeString(self):
     message = json_format_proto3_pb2.TestMessage()
     if sys.version_info[0] < 3:
@@ -204,8 +299,28 @@
     parsed_message = json_format_proto3_pb2.TestMessage()
     self.CheckParseBack(message, parsed_message)
 
+  def testIntegersRepresentedAsFloat(self):
+    message = json_format_proto3_pb2.TestMessage()
+    json_format.Parse('{"int32Value": -2.147483648e9}', message)
+    self.assertEqual(message.int32_value, -2147483648)
+    json_format.Parse('{"int32Value": 1e5}', message)
+    self.assertEqual(message.int32_value, 100000)
+    json_format.Parse('{"int32Value": 1.0}', message)
+    self.assertEqual(message.int32_value, 1)
+
   def testMapFields(self):
-    message = json_format_proto3_pb2.TestMap()
+    message = json_format_proto3_pb2.TestNestedMap()
+    self.assertEqual(
+        json.loads(json_format.MessageToJson(message, True)),
+        json.loads('{'
+                   '"boolMap": {},'
+                   '"int32Map": {},'
+                   '"int64Map": {},'
+                   '"uint32Map": {},'
+                   '"uint64Map": {},'
+                   '"stringMap": {},'
+                   '"mapMap": {}'
+                   '}'))
     message.bool_map[True] = 1
     message.bool_map[False] = 2
     message.int32_map[1] = 2
@@ -218,17 +333,19 @@
     message.uint64_map[2] = 3
     message.string_map['1'] = 2
     message.string_map['null'] = 3
+    message.map_map['1'].bool_map[True] = 3
     self.assertEqual(
-        json.loads(json_format.MessageToJson(message, True)),
+        json.loads(json_format.MessageToJson(message, False)),
         json.loads('{'
                    '"boolMap": {"false": 2, "true": 1},'
                    '"int32Map": {"1": 2, "2": 3},'
                    '"int64Map": {"1": 2, "2": 3},'
                    '"uint32Map": {"1": 2, "2": 3},'
                    '"uint64Map": {"1": 2, "2": 3},'
-                   '"stringMap": {"1": 2, "null": 3}'
+                   '"stringMap": {"1": 2, "null": 3},'
+                   '"mapMap": {"1": {"boolMap": {"true": 3}}}'
                    '}'))
-    parsed_message = json_format_proto3_pb2.TestMap()
+    parsed_message = json_format_proto3_pb2.TestNestedMap()
     self.CheckParseBack(message, parsed_message)
 
   def testOneofFields(self):
@@ -246,6 +363,23 @@
     parsed_message = json_format_proto3_pb2.TestOneof()
     self.CheckParseBack(message, parsed_message)
 
+  def testSurrogates(self):
+    # Test correct surrogate handling.
+    message = json_format_proto3_pb2.TestMessage()
+    json_format.Parse('{"stringValue": "\\uD83D\\uDE01"}', message)
+    self.assertEqual(message.string_value,
+                     b'\xF0\x9F\x98\x81'.decode('utf-8', 'strict'))
+
+    # Error case: unpaired high surrogate.
+    self.CheckError(
+        '{"stringValue": "\\uD83D"}',
+        r'Invalid \\uXXXX escape|Unpaired.*surrogate')
+
+    # Unpaired low surrogate.
+    self.CheckError(
+        '{"stringValue": "\\uDE01"}',
+        r'Invalid \\uXXXX escape|Unpaired.*surrogate')
+
   def testTimestampMessage(self):
     message = json_format_proto3_pb2.TestTimestamp()
     message.value.seconds = 0
@@ -410,6 +544,9 @@
             '  "value": "hello",'
             '  "repeatedValue": [11.1, false, null, null]'
             '}'))
+    message.Clear()
+    json_format.Parse('{"value": null}', message)
+    self.assertEqual(message.value.WhichOneof('kind'), 'null_value')
 
   def testListValueMessage(self):
     message = json_format_proto3_pb2.TestListValue()
@@ -457,6 +594,22 @@
             '}\n'))
     parsed_message = json_format_proto3_pb2.TestAny()
     self.CheckParseBack(message, parsed_message)
+    # Must print @type first
+    test_message = json_format_proto3_pb2.TestMessage(
+        bool_value=True,
+        int32_value=20,
+        int64_value=-20,
+        uint32_value=20,
+        uint64_value=20,
+        double_value=3.14,
+        string_value='foo')
+    message.Clear()
+    message.value.Pack(test_message)
+    self.assertEqual(
+        json_format.MessageToJson(message, False)[0:68],
+        '{\n'
+        '  "value": {\n'
+        '    "@type": "type.googleapis.com/proto3.TestMessage"')
 
   def testWellKnownInAnyMessage(self):
     message = any_pb2.Any()
@@ -566,6 +719,11 @@
                       '}',
                       parsed_message)
     self.assertEqual(message, parsed_message)
+    # Null and {} should behave differently for a submessage field.
+    self.assertFalse(parsed_message.HasField('message_value'))
+    json_format.Parse('{"messageValue": {}}', parsed_message)
+    self.assertTrue(parsed_message.HasField('message_value'))
+    # Null is not allowed as an element in a repeated field.
     self.assertRaisesRegexp(
         json_format.ParseError,
         'Failed to parse repeatedInt32Value field: '
@@ -573,6 +731,9 @@
         json_format.Parse,
         '{"repeatedInt32Value":[1, null]}',
         parsed_message)
+    self.CheckError('{"repeatedMessageValue":[null]}',
+                    'Failed to parse repeatedMessageValue field: null is not'
+                    ' allowed to be used as an element in a repeated field.')
 
   def testNanFloat(self):
     message = json_format_proto3_pb2.TestMessage()
@@ -587,15 +748,26 @@
     self.CheckError('',
                     r'Failed to load JSON: (Expecting value)|(No JSON).')
 
-  def testParseBadEnumValue(self):
-    self.CheckError(
-        '{"enumValue": 1}',
-        'Enum value must be a string literal with double quotes. '
-        'Type "proto3.EnumType" has no value named 1.')
+  def testParseEnumValue(self):
+    message = json_format_proto3_pb2.TestMessage()
+    text = '{"enumValue": 0}'
+    json_format.Parse(text, message)
+    text = '{"enumValue": 1}'
+    json_format.Parse(text, message)
     self.CheckError(
         '{"enumValue": "baz"}',
-        'Enum value must be a string literal with double quotes. '
-        'Type "proto3.EnumType" has no value named baz.')
+        'Failed to parse enumValue field: Invalid enum value baz '
+        'for enum type proto3.EnumType.')
+    # Proto3 accepts numeric unknown enums.
+    text = '{"enumValue": 12345}'
+    json_format.Parse(text, message)
+    # Proto2 does not accept unknown enums.
+    message = unittest_pb2.TestAllTypes()
+    self.assertRaisesRegexp(
+        json_format.ParseError,
+        'Failed to parse optionalNestedEnum field: Invalid enum value 12345 '
+        'for enum type protobuf_unittest.TestAllTypes.NestedEnum.',
+        json_format.Parse, '{"optionalNestedEnum": 12345}', message)
 
   def testParseBadIdentifer(self):
     self.CheckError('{int32Value: 1}',
@@ -605,6 +777,19 @@
                     'Message type "proto3.TestMessage" has no field named '
                     '"unknownName".')
 
+  def testIgnoreUnknownField(self):
+    text = '{"unknownName": 1}'
+    parsed_message = json_format_proto3_pb2.TestMessage()
+    json_format.Parse(text, parsed_message, ignore_unknown_fields=True)
+    text = ('{\n'
+            '  "repeatedValue": [ {\n'
+            '    "@type": "type.googleapis.com/proto3.MessageType",\n'
+            '    "unknownName": 1\n'
+            '  }]\n'
+            '}\n')
+    parsed_message = json_format_proto3_pb2.TestAny()
+    json_format.Parse(text, parsed_message, ignore_unknown_fields=True)
+
   def testDuplicateField(self):
     # Duplicate key check is not supported for python2.6
     if sys.version_info < (2, 7):
@@ -625,12 +810,12 @@
     text = '{"int32Value": 0x12345}'
     self.assertRaises(json_format.ParseError,
                       json_format.Parse, text, message)
+    self.CheckError('{"int32Value": 1.5}',
+                    'Failed to parse int32Value field: '
+                    'Couldn\'t parse integer: 1.5.')
     self.CheckError('{"int32Value": 012345}',
                     (r'Failed to load JSON: Expecting \'?,\'? delimiter: '
                      r'line 1.'))
-    self.CheckError('{"int32Value": 1.0}',
-                    'Failed to parse int32Value field: '
-                    'Couldn\'t parse integer: 1.0.')
     self.CheckError('{"int32Value": " 1 "}',
                     'Failed to parse int32Value field: '
                     'Couldn\'t parse integer: " 1 ".')
@@ -640,9 +825,6 @@
     self.CheckError('{"int32Value": 12345678901234567890}',
                     'Failed to parse int32Value field: Value out of range: '
                     '12345678901234567890.')
-    self.CheckError('{"int32Value": 1e5}',
-                    'Failed to parse int32Value field: '
-                    'Couldn\'t parse integer: 100000.0.')
     self.CheckError('{"uint32Value": -1}',
                     'Failed to parse uint32Value field: '
                     'Value out of range: -1.')
@@ -658,6 +840,11 @@
     self.CheckError('{"bytesValue": "AQI*"}',
                     'Failed to parse bytesValue field: Incorrect padding.')
 
+  def testInvalidRepeated(self):
+    self.CheckError('{"repeatedInt32Value": 12345}',
+                    (r'Failed to parse repeatedInt32Value field: repeated field'
+                     r' repeatedInt32Value must be in \[\] which is 12345.'))
+
   def testInvalidMap(self):
     message = json_format_proto3_pb2.TestMap()
     text = '{"int32Map": {"null": 2, "2": 3}}'
@@ -683,6 +870,12 @@
         json_format.ParseError,
         'Failed to load JSON: duplicate key a',
         json_format.Parse, text, message)
+    text = r'{"stringMap": 0}'
+    self.assertRaisesRegexp(
+        json_format.ParseError,
+        'Failed to parse stringMap field: Map field string_map must be '
+        'in a dict which is 0.',
+        json_format.Parse, text, message)
 
   def testInvalidTimestamp(self):
     message = json_format_proto3_pb2.TestTimestamp()
@@ -706,7 +899,7 @@
     text = '{"value": "0000-01-01T00:00:00Z"}'
     self.assertRaisesRegexp(
         json_format.ParseError,
-        'Failed to parse value field: year is out of range.',
+        'Failed to parse value field: year (0 )?is out of range.',
         json_format.Parse, text, message)
     # Time bigger than maxinum time.
     message.value.seconds = 253402300800
@@ -758,11 +951,88 @@
         'Can not find message descriptor by type_url: '
         'type.googleapis.com/MessageNotExist.',
         json_format.Parse, text, message)
-    # Only last part is to be used.
+    # Only last part is to be used: b/25630112
     text = (r'{"@type": "incorrect.googleapis.com/google.protobuf.Int32Value",'
             r'"value": 1234}')
     json_format.Parse(text, message)
 
+  def testPreservingProtoFieldNames(self):
+    message = json_format_proto3_pb2.TestMessage()
+    message.int32_value = 12345
+    self.assertEqual('{\n  "int32Value": 12345\n}',
+                     json_format.MessageToJson(message))
+    self.assertEqual('{\n  "int32_value": 12345\n}',
+                     json_format.MessageToJson(message, False, True))
+    # When including_default_value_fields is True.
+    message = json_format_proto3_pb2.TestTimestamp()
+    self.assertEqual('{\n  "repeatedValue": []\n}',
+                     json_format.MessageToJson(message, True, False))
+    self.assertEqual('{\n  "repeated_value": []\n}',
+                     json_format.MessageToJson(message, True, True))
+
+    # Parsers accept both original proto field names and lowerCamelCase names.
+    message = json_format_proto3_pb2.TestMessage()
+    json_format.Parse('{"int32Value": 54321}', message)
+    self.assertEqual(54321, message.int32_value)
+    json_format.Parse('{"int32_value": 12345}', message)
+    self.assertEqual(12345, message.int32_value)
+
+  def testIndent(self):
+    message = json_format_proto3_pb2.TestMessage()
+    message.int32_value = 12345
+    self.assertEqual('{\n"int32Value": 12345\n}',
+                     json_format.MessageToJson(message, indent=0))
+
+  def testFormatEnumsAsInts(self):
+    message = json_format_proto3_pb2.TestMessage()
+    message.enum_value = json_format_proto3_pb2.BAR
+    message.repeated_enum_value.append(json_format_proto3_pb2.FOO)
+    message.repeated_enum_value.append(json_format_proto3_pb2.BAR)
+    self.assertEqual(json.loads('{\n'
+                                '  "enumValue": 1,\n'
+                                '  "repeatedEnumValue": [0, 1]\n'
+                                '}\n'),
+                     json.loads(json_format.MessageToJson(
+                         message, use_integers_for_enums=True)))
+
+  def testParseDict(self):
+    expected = 12345
+    js_dict = {'int32Value': expected}
+    message = json_format_proto3_pb2.TestMessage()
+    json_format.ParseDict(js_dict, message)
+    self.assertEqual(expected, message.int32_value)
+
+  def testMessageToDict(self):
+    message = json_format_proto3_pb2.TestMessage()
+    message.int32_value = 12345
+    expected = {'int32Value': 12345}
+    self.assertEqual(expected,
+                     json_format.MessageToDict(message))
+
+  def testJsonName(self):
+    message = json_format_proto3_pb2.TestCustomJsonName()
+    message.value = 12345
+    self.assertEqual('{\n  "@value": 12345\n}',
+                     json_format.MessageToJson(message))
+    parsed_message = json_format_proto3_pb2.TestCustomJsonName()
+    self.CheckParseBack(message, parsed_message)
+
+  def testSortKeys(self):
+    # Testing sort_keys is not fully reliable: by random luck the output
+    # could already be sorted. We just use a selection of names.
+    message = json_format_proto3_pb2.TestMessage(bool_value=True,
+                                                 int32_value=1,
+                                                 int64_value=3,
+                                                 uint32_value=4,
+                                                 string_value='bla')
+    self.assertEqual(
+        json_format.MessageToJson(message, sort_keys=True),
+        # We use json.dumps() instead of a hardcoded string due to differences
+        # between Python 2 and Python 3.
+        json.dumps({'boolValue': True, 'int32Value': 1, 'int64Value': '3',
+                    'uint32Value': 4, 'stringValue': 'bla'},
+                   indent=2, sort_keys=True))
+
 
 if __name__ == '__main__':
   unittest.main()
diff --git a/python/google/protobuf/internal/message_factory_test.py b/python/google/protobuf/internal/message_factory_test.py
index 2fbe5ea..6df52ed 100644
--- a/python/google/protobuf/internal/message_factory_test.py
+++ b/python/google/protobuf/internal/message_factory_test.py
@@ -35,10 +35,12 @@
 __author__ = 'matthewtoia@google.com (Matt Toia)'
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import descriptor_pb2
+from google.protobuf.internal import api_implementation
 from google.protobuf.internal import factory_test1_pb2
 from google.protobuf.internal import factory_test2_pb2
 from google.protobuf import descriptor_database
@@ -105,30 +107,115 @@
   def testGetMessages(self):
     # performed twice because multiple calls with the same input must be allowed
     for _ in range(2):
-      messages = message_factory.GetMessages([self.factory_test1_fd,
-                                              self.factory_test2_fd])
+      # GetMessages should work regardless of the order in which the
+      # FileDescriptorProtos are provided. In particular, the function should
+      # succeed when the files are not in topological order of dependencies.
+
+      # Assuming factory_test2_fd depends on factory_test1_fd.
+      self.assertIn(self.factory_test1_fd.name,
+                    self.factory_test2_fd.dependency)
+      # GetMessages should work when a file comes before its dependencies:
+      # factory_test2_fd comes before factory_test1_fd.
+      messages = message_factory.GetMessages([self.factory_test2_fd,
+                                              self.factory_test1_fd])
       self.assertTrue(
           set(['google.protobuf.python.internal.Factory2Message',
                'google.protobuf.python.internal.Factory1Message'],
              ).issubset(set(messages.keys())))
       self._ExerciseDynamicClass(
           messages['google.protobuf.python.internal.Factory2Message'])
-      self.assertTrue(
-          set(['google.protobuf.python.internal.Factory2Message.one_more_field',
-               'google.protobuf.python.internal.another_field'],
-             ).issubset(
-                 set(messages['google.protobuf.python.internal.Factory1Message']
-                     ._extensions_by_name.keys())))
       factory_msg1 = messages['google.protobuf.python.internal.Factory1Message']
+      self.assertTrue(set(
+          ['google.protobuf.python.internal.Factory2Message.one_more_field',
+           'google.protobuf.python.internal.another_field'],).issubset(set(
+               ext.full_name
+               for ext in factory_msg1.DESCRIPTOR.file.pool.FindAllExtensions(
+                   factory_msg1.DESCRIPTOR))))
       msg1 = messages['google.protobuf.python.internal.Factory1Message']()
-      ext1 = factory_msg1._extensions_by_name[
-          'google.protobuf.python.internal.Factory2Message.one_more_field']
-      ext2 = factory_msg1._extensions_by_name[
-          'google.protobuf.python.internal.another_field']
+      ext1 = msg1.Extensions._FindExtensionByName(
+          'google.protobuf.python.internal.Factory2Message.one_more_field')
+      ext2 = msg1.Extensions._FindExtensionByName(
+          'google.protobuf.python.internal.another_field')
       msg1.Extensions[ext1] = 'test1'
       msg1.Extensions[ext2] = 'test2'
       self.assertEqual('test1', msg1.Extensions[ext1])
       self.assertEqual('test2', msg1.Extensions[ext2])
+      self.assertEqual(None,
+                       msg1.Extensions._FindExtensionByNumber(12321))
+      if api_implementation.Type() == 'cpp':
+        # TODO(jieluo): Fix len to return the correct value.
+        # self.assertEqual(2, len(msg1.Extensions))
+        self.assertEqual(len(msg1.Extensions), len(msg1.Extensions))
+        self.assertRaises(TypeError,
+                          msg1.Extensions._FindExtensionByName, 0)
+        self.assertRaises(TypeError,
+                          msg1.Extensions._FindExtensionByNumber, '')
+      else:
+        self.assertEqual(None,
+                         msg1.Extensions._FindExtensionByName(0))
+        self.assertEqual(None,
+                         msg1.Extensions._FindExtensionByNumber(''))
+
+  def testDuplicateExtensionNumber(self):
+    pool = descriptor_pool.DescriptorPool()
+    factory = message_factory.MessageFactory(pool=pool)
+
+    # Add Container message.
+    f = descriptor_pb2.FileDescriptorProto()
+    f.name = 'google/protobuf/internal/container.proto'
+    f.package = 'google.protobuf.python.internal'
+    msg = f.message_type.add()
+    msg.name = 'Container'
+    rng = msg.extension_range.add()
+    rng.start = 1
+    rng.end = 10
+    pool.Add(f)
+    msgs = factory.GetMessages([f.name])
+    self.assertIn('google.protobuf.python.internal.Container', msgs)
+
+    # Extend container.
+    f = descriptor_pb2.FileDescriptorProto()
+    f.name = 'google/protobuf/internal/extension.proto'
+    f.package = 'google.protobuf.python.internal'
+    f.dependency.append('google/protobuf/internal/container.proto')
+    msg = f.message_type.add()
+    msg.name = 'Extension'
+    ext = msg.extension.add()
+    ext.name = 'extension_field'
+    ext.number = 2
+    ext.label = descriptor_pb2.FieldDescriptorProto.LABEL_OPTIONAL
+    ext.type_name = 'Extension'
+    ext.extendee = 'Container'
+    pool.Add(f)
+    msgs = factory.GetMessages([f.name])
+    self.assertIn('google.protobuf.python.internal.Extension', msgs)
+
+    # Add a Duplicate message extending the same field number.
+    f = descriptor_pb2.FileDescriptorProto()
+    f.name = 'google/protobuf/internal/duplicate.proto'
+    f.package = 'google.protobuf.python.internal'
+    f.dependency.append('google/protobuf/internal/container.proto')
+    msg = f.message_type.add()
+    msg.name = 'Duplicate'
+    ext = msg.extension.add()
+    ext.name = 'extension_field'
+    ext.number = 2
+    ext.label = descriptor_pb2.FieldDescriptorProto.LABEL_OPTIONAL
+    ext.type_name = 'Duplicate'
+    ext.extendee = 'Container'
+    pool.Add(f)
+
+    with self.assertRaises(Exception) as cm:
+      factory.GetMessages([f.name])
+
+    self.assertIn(str(cm.exception),
+                  ['Extensions '
+                   '"google.protobuf.python.internal.Duplicate.extension_field" and'
+                   ' "google.protobuf.python.internal.Extension.extension_field"'
+                   ' both try to extend message type'
+                   ' "google.protobuf.python.internal.Container"'
+                   ' with field number 2.',
+                   'Double registration of Extensions'])
 
 
 if __name__ == '__main__':
diff --git a/python/google/protobuf/internal/message_test.py b/python/google/protobuf/internal/message_test.py
index d03f2d2..61a56a6 100755
--- a/python/google/protobuf/internal/message_test.py
+++ b/python/google/protobuf/internal/message_test.py
@@ -51,24 +51,37 @@
 import pickle
 import six
 import sys
+import warnings
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  # PY26
 except ImportError:
   import unittest
-from google.protobuf.internal import _parameterized
+try:
+  cmp                                   # Python 2
+except NameError:
+  cmp = lambda x, y: (x > y) - (x < y)  # Python 3
+
+from google.protobuf import map_proto2_unittest_pb2
 from google.protobuf import map_unittest_pb2
 from google.protobuf import unittest_pb2
 from google.protobuf import unittest_proto3_arena_pb2
-from google.protobuf.internal import any_test_pb2
+from google.protobuf import descriptor_pb2
+from google.protobuf import descriptor_pool
+from google.protobuf import message_factory
+from google.protobuf import text_format
 from google.protobuf.internal import api_implementation
+from google.protobuf.internal import encoder
 from google.protobuf.internal import packed_field_test_pb2
 from google.protobuf.internal import test_util
+from google.protobuf.internal import testing_refleaks
 from google.protobuf import message
+from google.protobuf.internal import _parameterized
 
 if six.PY3:
   long = int
 
+
 # Python pre-2.6 does not have isinf() or isnan() functions, so we have
 # to provide our own.
 def isnan(val):
@@ -83,10 +96,13 @@
   return isinf(val) and (val < 0)
 
 
-@_parameterized.Parameters(
-    (unittest_pb2),
-    (unittest_proto3_arena_pb2))
-class MessageTest(unittest.TestCase):
+BaseTestCase = testing_refleaks.BaseTestCase
+
+
+@_parameterized.named_parameters(
+    ('_proto2', unittest_pb2),
+    ('_proto3', unittest_proto3_arena_pb2))
+class MessageTest(BaseTestCase):
 
   def testBadUtf8String(self, message_module):
     if api_implementation.Type() != 'python':
@@ -127,6 +143,63 @@
     golden_copy = copy.deepcopy(golden_message)
     self.assertEqual(golden_data, golden_copy.SerializeToString())
 
+  def testParseErrors(self, message_module):
+    msg = message_module.TestAllTypes()
+    self.assertRaises(TypeError, msg.FromString, 0)
+    self.assertRaises(Exception, msg.FromString, '0')
+    # TODO(jieluo): Fix cpp extension to raise error instead of warning.
+    # b/27494216
+    end_tag = encoder.TagBytes(1, 4)
+    if api_implementation.Type() == 'python':
+      with self.assertRaises(message.DecodeError) as context:
+        msg.FromString(end_tag)
+      self.assertEqual('Unexpected end-group tag.', str(context.exception))
+    else:
+      with warnings.catch_warnings(record=True) as w:
+        # Cause all warnings to always be triggered.
+        warnings.simplefilter('always')
+        msg.FromString(end_tag)
+        assert len(w) == 1
+        assert issubclass(w[-1].category, RuntimeWarning)
+        self.assertEqual('Unexpected end-group tag: Not all data was converted',
+                         str(w[-1].message))
+
+  def testDeterminismParameters(self, message_module):
+    # This message is always deterministically serialized, even if determinism
+    # is disabled, so we can use it to verify that all the determinism
+    # parameters work correctly.
+    golden_data = (b'\xe2\x02\nOne string'
+                   b'\xe2\x02\nTwo string'
+                   b'\xe2\x02\nRed string'
+                   b'\xe2\x02\x0bBlue string')
+    golden_message = message_module.TestAllTypes()
+    golden_message.repeated_string.extend([
+        'One string',
+        'Two string',
+        'Red string',
+        'Blue string',
+    ])
+    self.assertEqual(golden_data,
+                     golden_message.SerializeToString(deterministic=None))
+    self.assertEqual(golden_data,
+                     golden_message.SerializeToString(deterministic=False))
+    self.assertEqual(golden_data,
+                     golden_message.SerializeToString(deterministic=True))
+
+    class BadArgError(Exception):
+      pass
+
+    class BadArg(object):
+
+      def __nonzero__(self):
+        raise BadArgError()
+
+      def __bool__(self):
+        raise BadArgError()
+
+    with self.assertRaises(BadArgError):
+      golden_message.SerializeToString(deterministic=BadArg())
+
   def testPickleSupport(self, message_module):
     golden_data = test_util.GoldenFileData('golden_message')
     golden_message = message_module.TestAllTypes()
@@ -368,6 +441,7 @@
     self.assertEqual(message.repeated_int32[0], 1)
     self.assertEqual(message.repeated_int32[1], 2)
     self.assertEqual(message.repeated_int32[2], 3)
+    self.assertEqual(str(message.repeated_int32), str([1, 2, 3]))
 
     message.repeated_float.append(1.1)
     message.repeated_float.append(1.3)
@@ -384,6 +458,7 @@
     self.assertEqual(message.repeated_string[0], 'a')
     self.assertEqual(message.repeated_string[1], 'b')
     self.assertEqual(message.repeated_string[2], 'c')
+    self.assertEqual(str(message.repeated_string), str([u'a', u'b', u'c']))
 
     message.repeated_bytes.append(b'a')
     message.repeated_bytes.append(b'c')
@@ -392,6 +467,7 @@
     self.assertEqual(message.repeated_bytes[0], b'a')
     self.assertEqual(message.repeated_bytes[1], b'b')
     self.assertEqual(message.repeated_bytes[2], b'c')
+    self.assertEqual(str(message.repeated_bytes), str([b'a', b'b', b'c']))
 
   def testSortingRepeatedScalarFieldsCustomComparator(self, message_module):
     """Check some different types with custom comparator."""
@@ -430,6 +506,8 @@
     self.assertEqual(message.repeated_nested_message[3].bb, 4)
     self.assertEqual(message.repeated_nested_message[4].bb, 5)
     self.assertEqual(message.repeated_nested_message[5].bb, 6)
+    self.assertEqual(str(message.repeated_nested_message),
+                     '[bb: 1\n, bb: 2\n, bb: 3\n, bb: 4\n, bb: 5\n, bb: 6\n]')
 
   def testSortingRepeatedCompositeFieldsStable(self, message_module):
     """Check passing a custom comparator to sort a repeated composite field."""
@@ -555,6 +633,18 @@
     self.assertIsInstance(m.repeated_nested_message,
                           collections.MutableSequence)
 
+  def testRepeatedFieldsNotHashable(self, message_module):
+    m = message_module.TestAllTypes()
+    with self.assertRaises(TypeError):
+      hash(m.repeated_int32)
+    with self.assertRaises(TypeError):
+      hash(m.repeated_nested_message)
+
+  def testRepeatedFieldInsideNestedMessage(self, message_module):
+    m = message_module.NestedTestAllTypes()
+    m.payload.repeated_int32.extend([])
+    self.assertTrue(m.HasField('payload'))
+
   def ensureNestedMessageExists(self, msg, attribute):
     """Make sure that a nested message object exists.
 
@@ -567,6 +657,7 @@
   def testOneofGetCaseNonexistingField(self, message_module):
     m = message_module.TestAllTypes()
     self.assertRaises(ValueError, m.WhichOneof, 'no_such_oneof_field')
+    self.assertRaises(Exception, m.WhichOneof, 0)
 
   def testOneofDefaultValues(self, message_module):
     m = message_module.TestAllTypes()
@@ -942,6 +1033,8 @@
     m = message_module.TestAllTypes()
     with self.assertRaises(IndexError) as _:
       m.repeated_nested_message.pop()
+    with self.assertRaises(TypeError) as _:
+      m.repeated_nested_message.pop('0')
     for i in range(5):
       n = m.repeated_nested_message.add()
       n.bb = i
@@ -950,9 +1043,42 @@
     self.assertEqual(2, m.repeated_nested_message.pop(1).bb)
     self.assertEqual([1, 3], [n.bb for n in m.repeated_nested_message])
 
+  def testRepeatedCompareWithSelf(self, message_module):
+    m = message_module.TestAllTypes()
+    for i in range(5):
+      m.repeated_int32.insert(i, i)
+      n = m.repeated_nested_message.add()
+      n.bb = i
+    self.assertSequenceEqual(m.repeated_int32, m.repeated_int32)
+    self.assertEqual(m.repeated_nested_message, m.repeated_nested_message)
+
+  def testReleasedNestedMessages(self, message_module):
+    """A case that led to a segfault when a message detached from its parent
+    container itself has a child container.
+    """
+    m = message_module.NestedTestAllTypes()
+    m = m.repeated_child.add()
+    m = m.child
+    m = m.repeated_child.add()
+    self.assertEqual(m.payload.optional_int32, 0)
+
+  def testSetRepeatedComposite(self, message_module):
+    m = message_module.TestAllTypes()
+    with self.assertRaises(AttributeError):
+      m.repeated_int32 = []
+    m.repeated_int32.append(1)
+    if api_implementation.Type() == 'cpp':
+      # For test coverage: cpp has a different path if the composite
+      # field is already in the cache.
+      with self.assertRaises(TypeError):
+        m.repeated_int32 = []
+    else:
+      with self.assertRaises(AttributeError):
+        m.repeated_int32 = []
+
 
 # Class to test proto2-only features (required, extensions, etc.)
-class Proto2Test(unittest.TestCase):
+class Proto2Test(BaseTestCase):
 
   def testFieldPresence(self):
     message = unittest_pb2.TestAllTypes()
@@ -1002,18 +1128,46 @@
     self.assertEqual(False, message.optional_bool)
     self.assertEqual(0, message.optional_nested_message.bb)
 
-  # TODO(tibell): The C++ implementations actually allows assignment
-  # of unknown enum values to *scalar* fields (but not repeated
-  # fields). Once checked enum fields becomes the default in the
-  # Python implementation, the C++ implementation should follow suit.
   def testAssignInvalidEnum(self):
-    """It should not be possible to assign an invalid enum number to an
-    enum field."""
+    """Assigning an invalid enum number is not allowed in proto2."""
     m = unittest_pb2.TestAllTypes()
 
+    # Proto2 cannot assign an unknown enum value.
     with self.assertRaises(ValueError) as _:
       m.optional_nested_enum = 1234567
     self.assertRaises(ValueError, m.repeated_nested_enum.append, 1234567)
+    # Assignment is a different code path than append for the C++ impl.
+    m.repeated_nested_enum.append(2)
+    m.repeated_nested_enum[0] = 2
+    with self.assertRaises(ValueError):
+      m.repeated_nested_enum[0] = 123456
+
+    # An unknown enum value can be parsed in proto2; it is kept in unknown fields.
+    m2 = unittest_proto3_arena_pb2.TestAllTypes()
+    m2.optional_nested_enum = 1234567
+    m2.repeated_nested_enum.append(7654321)
+    serialized = m2.SerializeToString()
+
+    m3 = unittest_pb2.TestAllTypes()
+    m3.ParseFromString(serialized)
+    self.assertFalse(m3.HasField('optional_nested_enum'))
+    # 1 is the default value for optional_nested_enum.
+    self.assertEqual(1, m3.optional_nested_enum)
+    self.assertEqual(0, len(m3.repeated_nested_enum))
+    m2.Clear()
+    m2.ParseFromString(m3.SerializeToString())
+    self.assertEqual(1234567, m2.optional_nested_enum)
+    self.assertEqual(7654321, m2.repeated_nested_enum[0])
+
+  def testUnknownEnumMap(self):
+    m = map_proto2_unittest_pb2.TestEnumMap()
+    m.known_map_field[123] = 0
+    with self.assertRaises(ValueError):
+      m.unknown_map_field[1] = 123
+
+  def testExtensionsErrors(self):
+    msg = unittest_pb2.TestAllTypes()
+    self.assertRaises(AttributeError, getattr, msg, 'Extensions')
 
   def testGoldenExtensions(self):
     golden_data = test_util.GoldenFileData('golden_message')
@@ -1108,6 +1262,7 @@
         optional_bytes=b'x',
         optionalgroup={'a': 400},
         optional_nested_message={'bb': 500},
+        optional_foreign_message={},
         optional_nested_enum='BAZ',
         repeatedgroup=[{'a': 600},
                        {'a': 700}],
@@ -1120,8 +1275,12 @@
     self.assertEqual(300.5, message.optional_float)
     self.assertEqual(b'x', message.optional_bytes)
     self.assertEqual(400, message.optionalgroup.a)
-    self.assertIsInstance(message.optional_nested_message, unittest_pb2.TestAllTypes.NestedMessage)
+    self.assertIsInstance(message.optional_nested_message,
+                          unittest_pb2.TestAllTypes.NestedMessage)
     self.assertEqual(500, message.optional_nested_message.bb)
+    self.assertTrue(message.HasField('optional_foreign_message'))
+    self.assertEqual(message.optional_foreign_message,
+                     unittest_pb2.ForeignMessage())
     self.assertEqual(unittest_pb2.TestAllTypes.BAZ,
                      message.optional_nested_enum)
     self.assertEqual(2, len(message.repeatedgroup))
@@ -1157,8 +1316,9 @@
       unittest_pb2.TestAllTypes(repeated_nested_enum='FOO')
 
 
+
 # Class to test proto3-only features/behavior (updated field presence & enums)
-class Proto3Test(unittest.TestCase):
+class Proto3Test(BaseTestCase):
 
   # Utility method for comparing equality with a map.
   def assertMapIterEquals(self, map_iter, dict_value):
@@ -1232,6 +1392,7 @@
     """Assigning an unknown enum value is allowed and preserves the value."""
     m = unittest_proto3_arena_pb2.TestAllTypes()
 
+    # Proto3 can assign unknown enums.
     m.optional_nested_enum = 1234567
     self.assertEqual(1234567, m.optional_nested_enum)
     m.repeated_nested_enum.append(22334455)
@@ -1249,7 +1410,7 @@
   # Map isn't really a proto3-only feature. But there is no proto2 equivalent
   # of google/protobuf/map_unittest.proto right now, so it's not easy to
   # test both with the same test like we do for the other proto2/proto3 tests.
-  # (google/protobuf/map_protobuf_unittest.proto is very different in the set
+  # (google/protobuf/map_proto2_unittest.proto is very different in the set
   # of messages and fields it contains).
   def testScalarMapDefaults(self):
     msg = map_unittest_pb2.TestMap()
@@ -1259,7 +1420,10 @@
     self.assertFalse(-2**33 in msg.map_int64_int64)
     self.assertFalse(123 in msg.map_uint32_uint32)
     self.assertFalse(2**33 in msg.map_uint64_uint64)
+    self.assertFalse(123 in msg.map_int32_double)
+    self.assertFalse(False in msg.map_bool_bool)
     self.assertFalse('abc' in msg.map_string_string)
+    self.assertFalse(111 in msg.map_int32_bytes)
     self.assertFalse(888 in msg.map_int32_enum)
 
     # Accessing an unset key returns the default.
@@ -1267,7 +1431,12 @@
     self.assertEqual(0, msg.map_int64_int64[-2**33])
     self.assertEqual(0, msg.map_uint32_uint32[123])
     self.assertEqual(0, msg.map_uint64_uint64[2**33])
+    self.assertEqual(0.0, msg.map_int32_double[123])
+    self.assertTrue(isinstance(msg.map_int32_double[123], float))
+    self.assertEqual(False, msg.map_bool_bool[False])
+    self.assertTrue(isinstance(msg.map_bool_bool[False], bool))
     self.assertEqual('', msg.map_string_string['abc'])
+    self.assertEqual(b'', msg.map_int32_bytes[111])
     self.assertEqual(0, msg.map_int32_enum[888])
 
     # It also sets the value in the map
@@ -1275,7 +1444,10 @@
     self.assertTrue(-2**33 in msg.map_int64_int64)
     self.assertTrue(123 in msg.map_uint32_uint32)
     self.assertTrue(2**33 in msg.map_uint64_uint64)
+    self.assertTrue(123 in msg.map_int32_double)
+    self.assertTrue(False in msg.map_bool_bool)
     self.assertTrue('abc' in msg.map_string_string)
+    self.assertTrue(111 in msg.map_int32_bytes)
     self.assertTrue(888 in msg.map_int32_enum)
 
     self.assertIsInstance(msg.map_string_string['abc'], six.text_type)
@@ -1299,12 +1471,17 @@
 
     msg.map_int32_int32[5] = 15
     self.assertEqual(15, msg.map_int32_int32.get(5))
+    self.assertEqual(15, msg.map_int32_int32.get(5))
+    with self.assertRaises(TypeError):
+      msg.map_int32_int32.get('')
 
     self.assertIsNone(msg.map_int32_foreign_message.get(5))
     self.assertEqual(10, msg.map_int32_foreign_message.get(5, 10))
 
     submsg = msg.map_int32_foreign_message[5]
     self.assertIs(submsg, msg.map_int32_foreign_message.get(5))
+    with self.assertRaises(TypeError):
+      msg.map_int32_foreign_message.get('')
 
   def testScalarMap(self):
     msg = map_unittest_pb2.TestMap()
@@ -1316,8 +1493,13 @@
     msg.map_int64_int64[-2**33] = -2**34
     msg.map_uint32_uint32[123] = 456
     msg.map_uint64_uint64[2**33] = 2**34
+    msg.map_int32_float[2] = 1.2
+    msg.map_int32_double[1] = 3.3
     msg.map_string_string['abc'] = '123'
+    msg.map_bool_bool[True] = True
     msg.map_int32_enum[888] = 2
+    # Unknown numeric enum values are supported in proto3.
+    msg.map_int32_enum[123] = 456
 
     self.assertEqual([], msg.FindInitializationErrors())
 
@@ -1351,8 +1533,24 @@
     self.assertEqual(-2**34, msg2.map_int64_int64[-2**33])
     self.assertEqual(456, msg2.map_uint32_uint32[123])
     self.assertEqual(2**34, msg2.map_uint64_uint64[2**33])
+    self.assertAlmostEqual(1.2, msg2.map_int32_float[2])
+    self.assertEqual(3.3, msg2.map_int32_double[1])
     self.assertEqual('123', msg2.map_string_string['abc'])
+    self.assertEqual(True, msg2.map_bool_bool[True])
     self.assertEqual(2, msg2.map_int32_enum[888])
+    self.assertEqual(456, msg2.map_int32_enum[123])
+    # TODO(jieluo): Add cpp extension support.
+    if api_implementation.Type() == 'python':
+      self.assertEqual('{-123: -456}',
+                       str(msg2.map_int32_int32))
+
+  def testMapEntryAlwaysSerialized(self):
+    msg = map_unittest_pb2.TestMap()
+    msg.map_int32_int32[0] = 0
+    msg.map_string_string[''] = ''
+    self.assertEqual(msg.ByteSize(), 12)
+    self.assertEqual(b'\n\x04\x08\x00\x10\x00r\x04\n\x00\x12\x00',
+                     msg.SerializeToString())
 
   def testStringUnicodeConversionInMap(self):
     msg = map_unittest_pb2.TestMap()
@@ -1405,6 +1603,40 @@
     self.assertIn(123, msg2.map_int32_foreign_message)
     self.assertIn(-456, msg2.map_int32_foreign_message)
     self.assertEqual(2, len(msg2.map_int32_foreign_message))
+    # TODO(jieluo): Fix text format for message map.
+    # TODO(jieluo): Add cpp extension support.
+    if api_implementation.Type() == 'python':
+      self.assertEqual(15,
+                       len(str(msg2.map_int32_foreign_message)))
+
+  def testNestedMessageMapItemDelete(self):
+    msg = map_unittest_pb2.TestMap()
+    msg.map_int32_all_types[1].optional_nested_message.bb = 1
+    del msg.map_int32_all_types[1]
+    msg.map_int32_all_types[2].optional_nested_message.bb = 2
+    self.assertEqual(1, len(msg.map_int32_all_types))
+    msg.map_int32_all_types[1].optional_nested_message.bb = 1
+    self.assertEqual(2, len(msg.map_int32_all_types))
+
+    serialized = msg.SerializeToString()
+    msg2 = map_unittest_pb2.TestMap()
+    msg2.ParseFromString(serialized)
+    keys = [1, 2]
+    # The loop triggers PyErr_Occurred() in the c extension.
+    for key in keys:
+      del msg2.map_int32_all_types[key]
+
+  def testMapByteSize(self):
+    msg = map_unittest_pb2.TestMap()
+    msg.map_int32_int32[1] = 1
+    size = msg.ByteSize()
+    msg.map_int32_int32[1] = 128
+    self.assertEqual(msg.ByteSize(), size + 1)
+
+    msg.map_int32_foreign_message[19].c = 1
+    size = msg.ByteSize()
+    msg.map_int32_foreign_message[19].c = 128
+    self.assertEqual(msg.ByteSize(), size + 1)
 
   def testMergeFrom(self):
     msg = map_unittest_pb2.TestMap()
@@ -1418,6 +1650,8 @@
     msg2.map_int32_int32[12] = 55
     msg2.map_int64_int64[88] = 99
     msg2.map_int32_foreign_message[222].c = 15
+    msg2.map_int32_foreign_message[222].d = 20
+    old_map_value = msg2.map_int32_foreign_message[222]
 
     msg2.MergeFrom(msg)
 
@@ -1427,6 +1661,16 @@
     self.assertEqual(99, msg2.map_int64_int64[88])
     self.assertEqual(5, msg2.map_int32_foreign_message[111].c)
     self.assertEqual(10, msg2.map_int32_foreign_message[222].c)
+    self.assertFalse(msg2.map_int32_foreign_message[222].HasField('d'))
+    if api_implementation.Type() != 'cpp':
+      # During the call to MergeFrom(), the C++ implementation will have
+      # deallocated the underlying message, but this is very difficult to detect
+      # properly. The line below is likely to cause a segmentation fault.
+      # With the Python implementation, old_map_value is just 'detached' from
+      # the main message. Using it will not crash, of course, but since it
+      # still has a reference to the parent message, we could surely find
+      # interesting ways to cause inconsistencies.
+      self.assertEqual(15, old_map_value.c)
 
     # Verify that there is only one entry per key, even though the MergeFrom
     # may have internally created multiple entries for a single key in the
@@ -1447,6 +1691,51 @@
 
     del msg2.map_int32_foreign_message[222]
     self.assertFalse(222 in msg2.map_int32_foreign_message)
+    with self.assertRaises(TypeError):
+      del msg2.map_int32_foreign_message['']
+
+  def testMapMergeFrom(self):
+    msg = map_unittest_pb2.TestMap()
+    msg.map_int32_int32[12] = 34
+    msg.map_int32_int32[56] = 78
+    msg.map_int64_int64[22] = 33
+    msg.map_int32_foreign_message[111].c = 5
+    msg.map_int32_foreign_message[222].c = 10
+
+    msg2 = map_unittest_pb2.TestMap()
+    msg2.map_int32_int32[12] = 55
+    msg2.map_int64_int64[88] = 99
+    msg2.map_int32_foreign_message[222].c = 15
+    msg2.map_int32_foreign_message[222].d = 20
+
+    msg2.map_int32_int32.MergeFrom(msg.map_int32_int32)
+    self.assertEqual(34, msg2.map_int32_int32[12])
+    self.assertEqual(78, msg2.map_int32_int32[56])
+
+    msg2.map_int64_int64.MergeFrom(msg.map_int64_int64)
+    self.assertEqual(33, msg2.map_int64_int64[22])
+    self.assertEqual(99, msg2.map_int64_int64[88])
+
+    msg2.map_int32_foreign_message.MergeFrom(msg.map_int32_foreign_message)
+    self.assertEqual(5, msg2.map_int32_foreign_message[111].c)
+    self.assertEqual(10, msg2.map_int32_foreign_message[222].c)
+    self.assertFalse(msg2.map_int32_foreign_message[222].HasField('d'))
+
+  def testMergeFromBadType(self):
+    msg = map_unittest_pb2.TestMap()
+    with self.assertRaisesRegexp(
+        TypeError,
+        r'Parameter to MergeFrom\(\) must be instance of same class: expected '
+        r'.*TestMap got int\.'):
+      msg.MergeFrom(1)
+
+  def testCopyFromBadType(self):
+    msg = map_unittest_pb2.TestMap()
+    with self.assertRaisesRegexp(
+        TypeError,
+        r'Parameter to [A-Za-z]*From\(\) must be instance of same class: '
+        r'expected .*TestMap got int\.'):
+      msg.CopyFrom(1)
 
   def testIntegerMapWithLongs(self):
     msg = map_unittest_pb2.TestMap()
@@ -1565,6 +1854,98 @@
     matching_dict = {2: 4, 3: 6, 4: 8}
     self.assertMapIterEquals(msg.map_int32_int32.items(), matching_dict)
 
+  def testPython2Map(self):
+    if sys.version_info < (3,):
+      msg = map_unittest_pb2.TestMap()
+      msg.map_int32_int32[2] = 4
+      msg.map_int32_int32[3] = 6
+      msg.map_int32_int32[4] = 8
+      msg.map_int32_int32[5] = 10
+      map_int32 = msg.map_int32_int32
+      self.assertEqual(4, len(map_int32))
+      msg2 = map_unittest_pb2.TestMap()
+      msg2.ParseFromString(msg.SerializeToString())
+
+      def CheckItems(seq, iterator):
+        self.assertEqual(next(iterator), seq[0])
+        self.assertEqual(list(iterator), seq[1:])
+
+      CheckItems(map_int32.items(), map_int32.iteritems())
+      CheckItems(map_int32.keys(), map_int32.iterkeys())
+      CheckItems(map_int32.values(), map_int32.itervalues())
+
+      self.assertEqual(6, map_int32.get(3))
+      self.assertEqual(None, map_int32.get(999))
+      self.assertEqual(6, map_int32.pop(3))
+      self.assertEqual(0, map_int32.pop(3))
+      self.assertEqual(3, len(map_int32))
+      key, value = map_int32.popitem()
+      self.assertEqual(2 * key, value)
+      self.assertEqual(2, len(map_int32))
+      map_int32.clear()
+      self.assertEqual(0, len(map_int32))
+
+      with self.assertRaises(KeyError):
+        map_int32.popitem()
+
+      self.assertEqual(0, map_int32.setdefault(2))
+      self.assertEqual(1, len(map_int32))
+
+      map_int32.update(msg2.map_int32_int32)
+      self.assertEqual(4, len(map_int32))
+
+      with self.assertRaises(TypeError):
+        map_int32.update(msg2.map_int32_int32,
+                         msg2.map_int32_int32)
+      with self.assertRaises(TypeError):
+        map_int32.update(0)
+      with self.assertRaises(TypeError):
+        map_int32.update(value=12)
+
+  def testMapItems(self):
+    # Map items used to have strange behaviors when using the c extension,
+    # because [] may reorder the map and invalidate any existing iterators.
+    # TODO(jieluo): Check if [] reordering the map is a bug or intended
+    # behavior.
+    msg = map_unittest_pb2.TestMap()
+    msg.map_string_string['local_init_op'] = ''
+    msg.map_string_string['trainable_variables'] = ''
+    msg.map_string_string['variables'] = ''
+    msg.map_string_string['init_op'] = ''
+    msg.map_string_string['summaries'] = ''
+    items1 = msg.map_string_string.items()
+    items2 = msg.map_string_string.items()
+    self.assertEqual(items1, items2)
+
+  def testMapDeterministicSerialization(self):
+    golden_data = (b'r\x0c\n\x07init_op\x12\x01d'
+                   b'r\n\n\x05item1\x12\x01e'
+                   b'r\n\n\x05item2\x12\x01f'
+                   b'r\n\n\x05item3\x12\x01g'
+                   b'r\x0b\n\x05item4\x12\x02QQ'
+                   b'r\x12\n\rlocal_init_op\x12\x01a'
+                   b'r\x0e\n\tsummaries\x12\x01e'
+                   b'r\x18\n\x13trainable_variables\x12\x01b'
+                   b'r\x0e\n\tvariables\x12\x01c')
+    msg = map_unittest_pb2.TestMap()
+    msg.map_string_string['local_init_op'] = 'a'
+    msg.map_string_string['trainable_variables'] = 'b'
+    msg.map_string_string['variables'] = 'c'
+    msg.map_string_string['init_op'] = 'd'
+    msg.map_string_string['summaries'] = 'e'
+    msg.map_string_string['item1'] = 'e'
+    msg.map_string_string['item2'] = 'f'
+    msg.map_string_string['item3'] = 'g'
+    msg.map_string_string['item4'] = 'QQ'
+
+    # If deterministic serialization is not working correctly, this will be
+    # "flaky" depending on the exact python dict hash seed.
+    #
+    # Fortunately, there are enough items in this map that it is extremely
+    # unlikely to ever hit the "right" in-order combination, so the test
+    # itself should fail reliably.
+    self.assertEqual(golden_data, msg.SerializeToString(deterministic=True))
+
   def testMapIterationClearMessage(self):
     # Iterator needs to work even if message and map are deleted.
     msg = map_unittest_pb2.TestMap()
@@ -1651,6 +2032,9 @@
     del msg.map_int32_int32[4]
     self.assertEqual(0, len(msg.map_int32_int32))
 
+    with self.assertRaises(KeyError):
+      del msg.map_int32_all_types[32]
+
   def testMapsAreMapping(self):
     msg = map_unittest_pb2.TestMap()
     self.assertIsInstance(msg.map_int32_int32, collections.Mapping)
@@ -1659,6 +2043,14 @@
     self.assertIsInstance(msg.map_int32_foreign_message,
                           collections.MutableMapping)
 
+  def testMapsCompare(self):
+    msg = map_unittest_pb2.TestMap()
+    msg.map_int32_int32[-123] = -456
+    self.assertEqual(msg.map_int32_int32, msg.map_int32_int32)
+    self.assertEqual(msg.map_int32_foreign_message,
+                     msg.map_int32_foreign_message)
+    self.assertNotEqual(msg.map_int32_int32, 0)
+
   def testMapFindInitializationErrorsSmokeTest(self):
     msg = map_unittest_pb2.TestMap()
     msg.map_string_string['abc'] = '123'
@@ -1666,40 +2058,9 @@
     msg.map_string_foreign_message['foo'].c = 5
     self.assertEqual(0, len(msg.FindInitializationErrors()))
 
-  def testAnyMessage(self):
-    # Creates and sets message.
-    msg = any_test_pb2.TestAny()
-    msg_descriptor = msg.DESCRIPTOR
-    all_types = unittest_pb2.TestAllTypes()
-    all_descriptor = all_types.DESCRIPTOR
-    all_types.repeated_string.append(u'\u00fc\ua71f')
-    # Packs to Any.
-    msg.value.Pack(all_types)
-    self.assertEqual(msg.value.type_url,
-                     'type.googleapis.com/%s' % all_descriptor.full_name)
-    self.assertEqual(msg.value.value,
-                     all_types.SerializeToString())
-    # Tests Is() method.
-    self.assertTrue(msg.value.Is(all_descriptor))
-    self.assertFalse(msg.value.Is(msg_descriptor))
-    # Unpacks Any.
-    unpacked_message = unittest_pb2.TestAllTypes()
-    self.assertTrue(msg.value.Unpack(unpacked_message))
-    self.assertEqual(all_types, unpacked_message)
-    # Unpacks to different type.
-    self.assertFalse(msg.value.Unpack(msg))
-    # Only Any messages have Pack method.
-    try:
-      msg.Pack(all_types)
-    except AttributeError:
-      pass
-    else:
-      raise AttributeError('%s should not have Pack method.' %
-                           msg_descriptor.full_name)
 
 
-
-class ValidTypeNamesTest(unittest.TestCase):
+class ValidTypeNamesTest(BaseTestCase):
 
   def assertImportFromName(self, msg, base_name):
     # Parse <type 'module.class_name'> to extra 'some.name' as a string.
@@ -1720,7 +2081,7 @@
     self.assertImportFromName(pb.repeated_int32, 'Scalar')
     self.assertImportFromName(pb.repeated_nested_message, 'Composite')
 
-class PackedFieldTest(unittest.TestCase):
+class PackedFieldTest(BaseTestCase):
 
   def setMessage(self, message):
     message.repeated_int32.append(1)
@@ -1776,5 +2137,67 @@
                    b'\x70\x01')
     self.assertEqual(golden_data, message.SerializeToString())
 
+
+@unittest.skipIf(api_implementation.Type() != 'cpp' or
+                 sys.version_info < (2, 7),
+                 'explicit tests of the C++ implementation for PY27 and above')
+class OversizeProtosTest(BaseTestCase):
+
+  @classmethod
+  def setUpClass(cls):
+    # At the moment, reference cycles between DescriptorPool and Message classes
+    # are not detected and these objects are never freed.
+    # To avoid errors with ReferenceLeakChecker, we create the class only once.
+    file_desc = """
+      name: "f/f.msg2"
+      package: "f"
+      message_type {
+        name: "msg1"
+        field {
+          name: "payload"
+          number: 1
+          label: LABEL_OPTIONAL
+          type: TYPE_STRING
+        }
+      }
+      message_type {
+        name: "msg2"
+        field {
+          name: "field"
+          number: 1
+          label: LABEL_OPTIONAL
+          type: TYPE_MESSAGE
+          type_name: "msg1"
+        }
+      }
+    """
+    pool = descriptor_pool.DescriptorPool()
+    desc = descriptor_pb2.FileDescriptorProto()
+    text_format.Parse(file_desc, desc)
+    pool.Add(desc)
+    cls.proto_cls = message_factory.MessageFactory(pool).GetPrototype(
+        pool.FindMessageTypeByName('f.msg2'))
+
+  def setUp(self):
+    self.p = self.proto_cls()
+    self.p.field.payload = 'c' * (1024 * 1024 * 64 + 1)
+    self.p_serialized = self.p.SerializeToString()
+
+  def testAssertOversizeProto(self):
+    from google.protobuf.pyext._message import SetAllowOversizeProtos
+    SetAllowOversizeProtos(False)
+    q = self.proto_cls()
+    # The parse must fail with DecodeError; a bare try/except would pass
+    # silently if no error were raised.
+    with self.assertRaises(message.DecodeError) as context:
+      q.ParseFromString(self.p_serialized)
+    self.assertEqual(str(context.exception), 'Error parsing message')
+
+  def testSucceedOversizeProto(self):
+    from google.protobuf.pyext._message import SetAllowOversizeProtos
+    SetAllowOversizeProtos(True)
+    q = self.proto_cls()
+    q.ParseFromString(self.p_serialized)
+    self.assertEqual(self.p.field.payload, q.field.payload)
+
 if __name__ == '__main__':
   unittest.main()
diff --git a/python/google/protobuf/internal/more_extensions_dynamic.proto b/python/google/protobuf/internal/more_extensions_dynamic.proto
index 11f85ef..98fcbcb 100644
--- a/python/google/protobuf/internal/more_extensions_dynamic.proto
+++ b/python/google/protobuf/internal/more_extensions_dynamic.proto
@@ -47,4 +47,5 @@
 extend ExtendedMessage {
   optional int32 dynamic_int32_extension = 100;
   optional DynamicMessageType dynamic_message_extension = 101;
+  repeated DynamicMessageType repeated_dynamic_message_extension = 102;
 }
diff --git a/python/google/protobuf/internal/no_package.proto b/python/google/protobuf/internal/no_package.proto
new file mode 100644
index 0000000..3546dcc
--- /dev/null
+++ b/python/google/protobuf/internal/no_package.proto
@@ -0,0 +1,10 @@
+syntax = "proto2";
+
+enum NoPackageEnum {
+  NO_PACKAGE_VALUE_0 = 0;
+  NO_PACKAGE_VALUE_1 = 1;
+}
+
+message NoPackageMessage {
+  optional NoPackageEnum no_package_enum = 1;
+}
diff --git a/python/google/protobuf/internal/proto_builder_test.py b/python/google/protobuf/internal/proto_builder_test.py
index 822ad89..36dfbfd 100644
--- a/python/google/protobuf/internal/proto_builder_test.py
+++ b/python/google/protobuf/internal/proto_builder_test.py
@@ -40,6 +40,7 @@
   import unittest2 as unittest
 except ImportError:
   import unittest
+
 from google.protobuf import descriptor_pb2
 from google.protobuf import descriptor_pool
 from google.protobuf import proto_builder
diff --git a/python/google/protobuf/internal/python_message.py b/python/google/protobuf/internal/python_message.py
index 87f6066..975e3b4 100755
--- a/python/google/protobuf/internal/python_message.py
+++ b/python/google/protobuf/internal/python_message.py
@@ -51,14 +51,14 @@
 __author__ = 'robinson@google.com (Will Robinson)'
 
 from io import BytesIO
-import sys
 import struct
+import sys
 import weakref
 
 import six
-import six.moves.copyreg as copyreg
 
 # We use "as" to avoid name collisions with variables.
+from google.protobuf.internal import api_implementation
 from google.protobuf.internal import containers
 from google.protobuf.internal import decoder
 from google.protobuf.internal import encoder
@@ -69,7 +69,6 @@
 from google.protobuf.internal import wire_format
 from google.protobuf import descriptor as descriptor_mod
 from google.protobuf import message as message_mod
-from google.protobuf import symbol_database
 from google.protobuf import text_format
 
 _FieldDescriptor = descriptor_mod.FieldDescriptor
@@ -91,16 +90,12 @@
   classes at runtime, as in this example:
 
   mydescriptor = Descriptor(.....)
-  class MyProtoClass(Message):
-    __metaclass__ = GeneratedProtocolMessageType
-    DESCRIPTOR = mydescriptor
+  factory = symbol_database.Default()
+  factory.pool.AddDescriptor(mydescriptor)
+  MyProtoClass = factory.GetPrototype(mydescriptor)
   myproto_instance = MyProtoClass()
   myproto.foo_field = 23
   ...
-
-  The above example will not work for nested types. If you wish to include them,
-  use reflection.MakeClass() instead of manually instantiating the class in
-  order to create the appropriate class structure.
   """
 
   # Must be consistent with the protocol-compiler code in
@@ -157,12 +152,10 @@
     """
     descriptor = dictionary[GeneratedProtocolMessageType._DESCRIPTOR_KEY]
     cls._decoders_by_tag = {}
-    cls._extensions_by_name = {}
-    cls._extensions_by_number = {}
     if (descriptor.has_options and
         descriptor.GetOptions().message_set_wire_format):
       cls._decoders_by_tag[decoder.MESSAGE_SET_ITEM_TAG] = (
-          decoder.MessageSetItemDecoder(cls._extensions_by_number), None)
+          decoder.MessageSetItemDecoder(descriptor), None)
 
     # Attach stuff to each FieldDescriptor for quick lookup later on.
     for field in descriptor.fields:
@@ -176,7 +169,6 @@
     _AddStaticMethods(cls)
     _AddMessageMethods(descriptor, cls)
     _AddPrivateHelperMethods(descriptor, cls)
-    copyreg.pickle(cls, lambda obj: (cls, (), obj.__getstate__()))
 
     superclass = super(GeneratedProtocolMessageType, cls)
     superclass.__init__(name, bases, dictionary)
@@ -297,7 +289,8 @@
 
   if is_map_entry:
     field_encoder = encoder.MapEncoder(field_descriptor)
-    sizer = encoder.MapSizer(field_descriptor)
+    sizer = encoder.MapSizer(field_descriptor,
+                             _IsMessageMapField(field_descriptor))
   elif _IsMessageSetExtension(field_descriptor):
     field_encoder = encoder.MessageSetItemEncoder(field_descriptor.number)
     sizer = encoder.MessageSetItemSizer(field_descriptor.number)
@@ -378,13 +371,15 @@
   if _IsMessageMapField(field):
     def MakeMessageMapDefault(message):
       return containers.MessageMap(
-          message._listener_for_children, value_field.message_type, key_checker)
+          message._listener_for_children, value_field.message_type, key_checker,
+          field.message_type)
     return MakeMessageMapDefault
   else:
     value_checker = type_checkers.GetTypeChecker(value_field)
     def MakePrimitiveMapDefault(message):
       return containers.ScalarMap(
-          message._listener_for_children, key_checker, value_checker)
+          message._listener_for_children, key_checker, value_checker,
+          field.message_type)
     return MakePrimitiveMapDefault
 
 def _DefaultValueConstructorForField(field):
@@ -490,6 +485,9 @@
       if field is None:
         raise TypeError("%s() got an unexpected keyword argument '%s'" %
                         (message_descriptor.name, field_name))
+      if field_value is None:
+        # field=None is the same as no field at all.
+        continue
       if field.label == _FieldDescriptor.LABEL_REPEATED:
         copy = field._default_constructor(self)
         if field.cpp_type == _FieldDescriptor.CPPTYPE_MESSAGE:  # Composite
@@ -737,32 +735,21 @@
     constant_name = extension_name.upper() + "_FIELD_NUMBER"
     setattr(cls, constant_name, extension_field.number)
 
+  # TODO(amauryfa): Migrate all users of these attributes to functions like
+  #   pool.FindExtensionByNumber(descriptor).
+  if descriptor.file is not None:
+    # TODO(amauryfa): Use cls.MESSAGE_FACTORY.pool when available.
+    pool = descriptor.file.pool
+    cls._extensions_by_number = pool._extensions_by_number[descriptor]
+    cls._extensions_by_name = pool._extensions_by_name[descriptor]
 
 def _AddStaticMethods(cls):
   # TODO(robinson): This probably needs to be thread-safe(?)
   def RegisterExtension(extension_handle):
     extension_handle.containing_type = cls.DESCRIPTOR
+    # TODO(amauryfa): Use cls.MESSAGE_FACTORY.pool when available.
+    cls.DESCRIPTOR.file.pool.AddExtensionDescriptor(extension_handle)
     _AttachFieldHelpers(cls, extension_handle)
-
-    # Try to insert our extension, failing if an extension with the same number
-    # already exists.
-    actual_handle = cls._extensions_by_number.setdefault(
-        extension_handle.number, extension_handle)
-    if actual_handle is not extension_handle:
-      raise AssertionError(
-          'Extensions "%s" and "%s" both try to extend message type "%s" with '
-          'field number %d.' %
-          (extension_handle.full_name, actual_handle.full_name,
-           cls.DESCRIPTOR.full_name, extension_handle.number))
-
-    cls._extensions_by_name[extension_handle.full_name] = extension_handle
-
-    handle = extension_handle  # avoid line wrapping
-    if _IsMessageSetExtension(handle):
-      # MessageSet extension.  Also register under type name.
-      cls._extensions_by_name[
-          extension_handle.message_type.full_name] = extension_handle
-
   cls.RegisterExtension = staticmethod(RegisterExtension)
 
   def FromString(s):
@@ -889,17 +876,6 @@
   cls.ClearExtension = ClearExtension
 
 
-def _AddClearMethod(message_descriptor, cls):
-  """Helper for _AddMessageMethods()."""
-  def Clear(self):
-    # Clear fields.
-    self._fields = {}
-    self._unknown_fields = ()
-    self._oneofs = {}
-    self._Modified()
-  cls.Clear = Clear
-
-
 def _AddHasExtensionMethod(cls):
   """Helper for _AddMessageMethods()."""
   def HasExtension(self, extension_handle):
@@ -917,7 +893,7 @@
 def _InternalUnpackAny(msg):
   """Unpacks Any message and returns the unpacked message.
 
-  This internal method is differnt from public Any Unpack method which takes
+  This internal method is different from public Any Unpack method which takes
   the target message as argument. _InternalUnpackAny method does not have
   target message type and need to find the message type in descriptor pool.
 
@@ -927,26 +903,33 @@
   Returns:
     The unpacked message.
   """
+  # TODO(amauryfa): Don't use the factory of generated messages.
+  # To make Any work with custom factories, use the message factory of the
+  # parent message.
+  # pylint: disable=g-import-not-at-top
+  from google.protobuf import symbol_database
+  factory = symbol_database.Default()
+
   type_url = msg.type_url
-  db = symbol_database.Default()
 
   if not type_url:
     return None
 
   # TODO(haberman): For now we just strip the hostname.  Better logic will be
   # required.
-  type_name = type_url.split("/")[-1]
-  descriptor = db.pool.FindMessageTypeByName(type_name)
+  type_name = type_url.split('/')[-1]
+  descriptor = factory.pool.FindMessageTypeByName(type_name)
 
   if descriptor is None:
     return None
 
-  message_class = db.GetPrototype(descriptor)
+  message_class = factory.GetPrototype(descriptor)
   message = message_class()
 
   message.ParseFromString(msg.value)
   return message
 
+
 def _AddEqualsMethod(message_descriptor, cls):
   """Helper for _AddMessageMethods()."""
   def __eq__(self, other):
@@ -999,16 +982,6 @@
   cls.__unicode__ = __unicode__
 
 
-def _AddSetListenerMethod(cls):
-  """Helper for _AddMessageMethods()."""
-  def SetListener(self, listener):
-    if listener is None:
-      self._listener = message_listener_mod.NullMessageListener()
-    else:
-      self._listener = listener
-  cls._SetListener = SetListener
-
-
 def _BytesForNonRepeatedElement(value, field_number, field_type):
   """Returns the number of bytes needed to serialize a non-repeated element.
   The returned byte count includes space for tag information and any
@@ -1037,11 +1010,16 @@
       return self._cached_byte_size
 
     size = 0
-    for field_descriptor, field_value in self.ListFields():
-      size += field_descriptor._sizer(field_value)
-
-    for tag_bytes, value_bytes in self._unknown_fields:
-      size += len(tag_bytes) + len(value_bytes)
+    descriptor = self.DESCRIPTOR
+    if descriptor.GetOptions().map_entry:
+      # Fields of map entry should always be serialized.
+      size = descriptor.fields_by_name['key']._sizer(self.key)
+      size += descriptor.fields_by_name['value']._sizer(self.value)
+    else:
+      for field_descriptor, field_value in self.ListFields():
+        size += field_descriptor._sizer(field_value)
+      for tag_bytes, value_bytes in self._unknown_fields:
+        size += len(tag_bytes) + len(value_bytes)
 
     self._cached_byte_size = size
     self._cached_byte_size_dirty = False
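The ByteSize() logic above caches its result behind a dirty flag so repeated calls are cheap until the message is modified. A hypothetical standalone sketch of that caching pattern (names are illustrative, not protobuf's internals):

```python
class CachedSizeDemo:
    # Hypothetical sketch of the dirty-flag caching used by ByteSize():
    # recompute only after a modification, otherwise serve the cache.
    def __init__(self, compute_fn):
        self._compute = compute_fn
        self._cached = 0
        self._dirty = True

    def modified(self):
        # Corresponds to _Modified() invalidating the cached size.
        self._dirty = True

    def byte_size(self):
        if not self._dirty:
            return self._cached
        self._cached = self._compute()
        self._dirty = False
        return self._cached

calls = []
demo = CachedSizeDemo(lambda: calls.append(1) or 42)
first = demo.byte_size()
second = demo.byte_size()   # served from the cache, no recompute
demo.modified()
third = demo.byte_size()    # dirty flag forces exactly one recompute
```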
@@ -1054,32 +1032,46 @@
 def _AddSerializeToStringMethod(message_descriptor, cls):
   """Helper for _AddMessageMethods()."""
 
-  def SerializeToString(self):
+  def SerializeToString(self, **kwargs):
     # Check if the message has all of its required fields set.
     errors = []
     if not self.IsInitialized():
       raise message_mod.EncodeError(
           'Message %s is missing required fields: %s' % (
           self.DESCRIPTOR.full_name, ','.join(self.FindInitializationErrors())))
-    return self.SerializePartialToString()
+    return self.SerializePartialToString(**kwargs)
   cls.SerializeToString = SerializeToString
 
 
 def _AddSerializePartialToStringMethod(message_descriptor, cls):
   """Helper for _AddMessageMethods()."""
 
-  def SerializePartialToString(self):
+  def SerializePartialToString(self, **kwargs):
     out = BytesIO()
-    self._InternalSerialize(out.write)
+    self._InternalSerialize(out.write, **kwargs)
     return out.getvalue()
   cls.SerializePartialToString = SerializePartialToString
 
-  def InternalSerialize(self, write_bytes):
-    for field_descriptor, field_value in self.ListFields():
-      field_descriptor._encoder(write_bytes, field_value)
-    for tag_bytes, value_bytes in self._unknown_fields:
-      write_bytes(tag_bytes)
-      write_bytes(value_bytes)
+  def InternalSerialize(self, write_bytes, deterministic=None):
+    if deterministic is None:
+      deterministic = (
+          api_implementation.IsPythonDefaultSerializationDeterministic())
+    else:
+      deterministic = bool(deterministic)
+
+    descriptor = self.DESCRIPTOR
+    if descriptor.GetOptions().map_entry:
+      # Fields of map entry should always be serialized.
+      descriptor.fields_by_name['key']._encoder(
+          write_bytes, self.key, deterministic)
+      descriptor.fields_by_name['value']._encoder(
+          write_bytes, self.value, deterministic)
+    else:
+      for field_descriptor, field_value in self.ListFields():
+        field_descriptor._encoder(write_bytes, field_value, deterministic)
+      for tag_bytes, value_bytes in self._unknown_fields:
+        write_bytes(tag_bytes)
+        write_bytes(value_bytes)
   cls._InternalSerialize = InternalSerialize
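The deterministic flag threaded through _InternalSerialize above chiefly affects the order in which map entries are emitted. A hypothetical sketch of the idea, using a plain dict and a toy wire format instead of a protobuf map:

```python
def serialize_map(write_bytes, entries, deterministic=False):
    # When deterministic is requested, emit map entries in sorted key
    # order so repeated serializations produce identical bytes.
    items = sorted(entries.items()) if deterministic else entries.items()
    for key, value in items:
        write_bytes(('%s=%s;' % (key, value)).encode('utf-8'))

out = []
serialize_map(out.append, {'b': 2, 'a': 1}, deterministic=True)
payload = b''.join(out)
```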
 
 
@@ -1117,7 +1109,8 @@
         new_pos = local_SkipField(buffer, new_pos, end, tag_bytes)
         if new_pos == -1:
           return pos
-        if not is_proto3:
+        if (not is_proto3 or
+            api_implementation.GetPythonProto3PreserveUnknownsDefault()):
           if not unknown_field_list:
             unknown_field_list = self._unknown_fields = []
           unknown_field_list.append(
@@ -1234,7 +1227,7 @@
     if not isinstance(msg, cls):
       raise TypeError(
           "Parameter to MergeFrom() must be instance of same class: "
-          "expected %s got %s." % (cls.__name__, type(msg).__name__))
+          'expected %s got %s.' % (cls.__name__, msg.__class__.__name__))
 
     assert msg is not self
     self._Modified()
@@ -1288,6 +1281,38 @@
   cls.WhichOneof = WhichOneof
 
 
+def _AddReduceMethod(cls):
+  def __reduce__(self):  # pylint: disable=invalid-name
+    return (type(self), (), self.__getstate__())
+  cls.__reduce__ = __reduce__
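The __reduce__ recipe above pickles a message as its type, empty constructor arguments, and a state blob restored through __setstate__. The same recipe applied to a hypothetical plain class:

```python
import pickle

class PointDemo:
    # Hypothetical class illustrating the __reduce__ pattern: pickle as
    # (type, empty ctor args, state dict restored via __setstate__).
    def __init__(self):
        self.x = 0

    def __getstate__(self):
        return {'x': self.x}

    def __setstate__(self, state):
        self.x = state['x']

    def __reduce__(self):
        return (type(self), (), self.__getstate__())

p = PointDemo()
p.x = 5
restored = pickle.loads(pickle.dumps(p))
```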
+
+
+def _Clear(self):
+  # Clear fields.
+  self._fields = {}
+  self._unknown_fields = ()
+  self._oneofs = {}
+  self._Modified()
+
+
+def _DiscardUnknownFields(self):
+  self._unknown_fields = []
+  for field, value in self.ListFields():
+    if field.cpp_type == _FieldDescriptor.CPPTYPE_MESSAGE:
+      if field.label == _FieldDescriptor.LABEL_REPEATED:
+        for sub_message in value:
+          sub_message.DiscardUnknownFields()
+      else:
+        value.DiscardUnknownFields()
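_DiscardUnknownFields above clears the local unknown-field list and then recurses into both singular and repeated message children. The traversal shape, sketched on a hypothetical dict-based tree:

```python
def discard_unknown(node):
    # Hypothetical tree analogue of the recursion above: clear this
    # node's unknown fields, then recurse into every message child.
    node['unknown'] = []
    for child in node.get('children', []):
        discard_unknown(child)
    return node

tree = {'unknown': [b'x'], 'children': [{'unknown': [b'y'], 'children': []}]}
discard_unknown(tree)
```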
+
+
+def _SetListener(self, listener):
+  if listener is None:
+    self._listener = message_listener_mod.NullMessageListener()
+  else:
+    self._listener = listener
+
+
 def _AddMessageMethods(message_descriptor, cls):
   """Adds implementations of all Message methods to cls."""
   _AddListFieldsMethod(message_descriptor, cls)
@@ -1296,12 +1321,10 @@
   if message_descriptor.is_extendable:
     _AddClearExtensionMethod(cls)
     _AddHasExtensionMethod(cls)
-  _AddClearMethod(message_descriptor, cls)
   _AddEqualsMethod(message_descriptor, cls)
   _AddStrMethod(message_descriptor, cls)
   _AddReprMethod(message_descriptor, cls)
   _AddUnicodeMethod(message_descriptor, cls)
-  _AddSetListenerMethod(cls)
   _AddByteSizeMethod(message_descriptor, cls)
   _AddSerializeToStringMethod(message_descriptor, cls)
   _AddSerializePartialToStringMethod(message_descriptor, cls)
@@ -1309,6 +1332,11 @@
   _AddIsInitializedMethod(message_descriptor, cls)
   _AddMergeFromMethod(cls)
   _AddWhichOneofMethod(message_descriptor, cls)
+  _AddReduceMethod(cls)
+  # Adds methods which do not depend on cls.
+  cls.Clear = _Clear
+  cls.DiscardUnknownFields = _DiscardUnknownFields
+  cls._SetListener = _SetListener
 
 
 def _AddPrivateHelperMethods(message_descriptor, cls):
@@ -1518,3 +1546,14 @@
       Extension field descriptor.
     """
     return self._extended_message._extensions_by_name.get(name, None)
+
+  def _FindExtensionByNumber(self, number):
+    """Tries to find a known extension with the field number.
+
+    Args:
+      number: Extension field number.
+
+    Returns:
+      Extension field descriptor.
+    """
+    return self._extended_message._extensions_by_number.get(number, None)
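_FindExtensionByNumber above is a plain dictionary lookup that yields None for unregistered field numbers. A hypothetical standalone equivalent:

```python
# Hypothetical registry mirroring _extensions_by_number: field number
# mapped to an extension descriptor (a string stands in for one here).
extensions_by_number = {1: 'optional_int32_extension'}

def find_extension_by_number(number):
    # Unknown numbers fall through to None rather than raising.
    return extensions_by_number.get(number, None)

hit = find_extension_by_number(1)
miss = find_extension_by_number(999)
```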
diff --git a/python/google/protobuf/internal/python_protobuf.cc b/python/google/protobuf/internal/python_protobuf.cc
new file mode 100644
index 0000000..f90cc43
--- /dev/null
+++ b/python/google/protobuf/internal/python_protobuf.cc
@@ -0,0 +1,63 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// https://developers.google.com/protocol-buffers/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// Author: qrczak@google.com (Marcin Kowalczyk)
+
+#include <google/protobuf/python/python_protobuf.h>
+
+namespace google {
+namespace protobuf {
+namespace python {
+
+static const Message* GetCProtoInsidePyProtoStub(PyObject* msg) {
+  return NULL;
+}
+static Message* MutableCProtoInsidePyProtoStub(PyObject* msg) {
+  return NULL;
+}
+
+// This is initialized with a default, stub implementation.
+// If python-google.protobuf.cc is loaded, the function pointer is overridden
+// with a full implementation.
+const Message* (*GetCProtoInsidePyProtoPtr)(PyObject* msg) =
+    GetCProtoInsidePyProtoStub;
+Message* (*MutableCProtoInsidePyProtoPtr)(PyObject* msg) =
+    MutableCProtoInsidePyProtoStub;
+
+const Message* GetCProtoInsidePyProto(PyObject* msg) {
+  return GetCProtoInsidePyProtoPtr(msg);
+}
+Message* MutableCProtoInsidePyProto(PyObject* msg) {
+  return MutableCProtoInsidePyProtoPtr(msg);
+}
+
+}  // namespace python
+}  // namespace protobuf
+}  // namespace google
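The new python_protobuf.cc installs stub implementations behind overridable function pointers, which the full C++ module replaces when loaded. The same late-binding pattern, sketched in Python with hypothetical names:

```python
def _get_cproto_stub(msg):
    # Default stub: no C++ message available until a full implementation
    # replaces the module-level "function pointer" below.
    return None

_get_cproto_impl = _get_cproto_stub

def get_cproto(msg):
    # Public entry point always dispatches through the pointer, so
    # callers are unaffected when the implementation is swapped in.
    return _get_cproto_impl(msg)

before = get_cproto(object())
_get_cproto_impl = lambda msg: 'cpp-message'   # full implementation loaded
after = get_cproto(object())
```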
diff --git a/python/google/protobuf/internal/reflection_test.py b/python/google/protobuf/internal/reflection_test.py
index 752f2f5..0306ff4 100755
--- a/python/google/protobuf/internal/reflection_test.py
+++ b/python/google/protobuf/internal/reflection_test.py
@@ -42,9 +42,10 @@
 import struct
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import unittest_import_pb2
 from google.protobuf import unittest_mset_pb2
 from google.protobuf import unittest_pb2
@@ -59,9 +60,13 @@
 from google.protobuf.internal import message_set_extensions_pb2
 from google.protobuf.internal import wire_format
 from google.protobuf.internal import test_util
+from google.protobuf.internal import testing_refleaks
 from google.protobuf.internal import decoder
 
 
+BaseTestCase = testing_refleaks.BaseTestCase
+
+
 class _MiniDecoder(object):
   """Decodes a stream of values from a string.
 
@@ -94,12 +99,12 @@
     return wire_format.UnpackTag(self.ReadVarint())
 
   def ReadFloat(self):
-    result = struct.unpack("<f", self._bytes[self._pos:self._pos+4])[0]
+    result = struct.unpack('<f', self._bytes[self._pos:self._pos+4])[0]
     self._pos += 4
     return result
 
   def ReadDouble(self):
-    result = struct.unpack("<d", self._bytes[self._pos:self._pos+8])[0]
+    result = struct.unpack('<d', self._bytes[self._pos:self._pos+8])[0]
     self._pos += 8
     return result
 
@@ -107,7 +112,7 @@
     return self._pos == len(self._bytes)
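_MiniDecoder's ReadFloat and ReadDouble above decode little-endian fixed-width values with struct.unpack. A self-contained round-trip using those exact format codes:

```python
import struct

# Pack a float32 followed by a float64, then read them back the way
# the mini decoder does: 4 bytes for '<f', 8 bytes for '<d'.
buf = struct.pack('<f', 1.5) + struct.pack('<d', 2.25)
float_value = struct.unpack('<f', buf[0:4])[0]    # little-endian float32
double_value = struct.unpack('<d', buf[4:12])[0]  # little-endian float64
```

Both 1.5 and 2.25 are exactly representable, so the round-trip is lossless.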
 
 
-class ReflectionTest(unittest.TestCase):
+class ReflectionTest(BaseTestCase):
 
   def assertListsEqual(self, values, others):
     self.assertEqual(len(values), len(others))
@@ -119,11 +124,13 @@
     proto = unittest_pb2.TestAllTypes(
         optional_int32=24,
         optional_double=54.321,
-        optional_string='optional_string')
+        optional_string='optional_string',
+        optional_float=None)
 
     self.assertEqual(24, proto.optional_int32)
     self.assertEqual(54.321, proto.optional_double)
     self.assertEqual('optional_string', proto.optional_string)
+    self.assertFalse(proto.HasField("optional_float"))
 
   def testRepeatedScalarConstructor(self):
     # Constructor with only repeated scalar types should succeed.
@@ -131,12 +138,14 @@
         repeated_int32=[1, 2, 3, 4],
         repeated_double=[1.23, 54.321],
         repeated_bool=[True, False, False],
-        repeated_string=["optional_string"])
+        repeated_string=["optional_string"],
+        repeated_float=None)
 
     self.assertEqual([1, 2, 3, 4], list(proto.repeated_int32))
     self.assertEqual([1.23, 54.321], list(proto.repeated_double))
     self.assertEqual([True, False, False], list(proto.repeated_bool))
     self.assertEqual(["optional_string"], list(proto.repeated_string))
+    self.assertEqual([], list(proto.repeated_float))
 
   def testRepeatedCompositeConstructor(self):
     # Constructor with only repeated composite types should succeed.
@@ -187,7 +196,8 @@
         repeated_foreign_message=[
             unittest_pb2.ForeignMessage(c=-43),
             unittest_pb2.ForeignMessage(c=45324),
-            unittest_pb2.ForeignMessage(c=12)])
+            unittest_pb2.ForeignMessage(c=12)],
+        optional_nested_message=None)
 
     self.assertEqual(24, proto.optional_int32)
     self.assertEqual('optional_string', proto.optional_string)
@@ -204,6 +214,7 @@
          unittest_pb2.ForeignMessage(c=45324),
          unittest_pb2.ForeignMessage(c=12)],
         list(proto.repeated_foreign_message))
+    self.assertFalse(proto.HasField("optional_nested_message"))
 
   def testConstructorTypeError(self):
     self.assertRaises(
@@ -609,10 +620,24 @@
     self.assertRaises(TypeError, setattr, proto, 'optional_int32', 'foo')
     self.assertRaises(TypeError, setattr, proto, 'optional_string', 10)
     self.assertRaises(TypeError, setattr, proto, 'optional_bytes', 10)
+    self.assertRaises(TypeError, setattr, proto, 'optional_bool', 'foo')
+    self.assertRaises(TypeError, setattr, proto, 'optional_float', 'foo')
+    self.assertRaises(TypeError, setattr, proto, 'optional_double', 'foo')
+    # TODO(jieluo): Fix type checking difference for python and c extension
+    if api_implementation.Type() == 'python':
+      self.assertRaises(TypeError, setattr, proto, 'optional_bool', 1.1)
+    else:
+      proto.optional_bool = 1.1
 
-  def testIntegerTypes(self):
+  def assertIntegerTypes(self, integer_fn):
+    """Verifies setting of scalar integers.
+
+    Args:
+      integer_fn: A function to wrap the integers that will be assigned.
+    """
     def TestGetAndDeserialize(field_name, value, expected_type):
       proto = unittest_pb2.TestAllTypes()
+      value = integer_fn(value)
       setattr(proto, field_name, value)
       self.assertIsInstance(getattr(proto, field_name), expected_type)
       proto2 = unittest_pb2.TestAllTypes()
@@ -624,12 +649,12 @@
     TestGetAndDeserialize('optional_uint32', 1 << 30, int)
     try:
       integer_64 = long
-    except NameError: # Python3
+    except NameError:  # Python3
       integer_64 = int
     if struct.calcsize('L') == 4:
       # Python only has signed ints, so 32-bit python can't fit an uint32
       # in an int.
-      TestGetAndDeserialize('optional_uint32', 1 << 31, long)
+      TestGetAndDeserialize('optional_uint32', 1 << 31, integer_64)
     else:
       # 64-bit python can fit uint32 inside an int
       TestGetAndDeserialize('optional_uint32', 1 << 31, int)
@@ -638,25 +663,62 @@
     TestGetAndDeserialize('optional_uint64', 1 << 30, integer_64)
     TestGetAndDeserialize('optional_uint64', 1 << 60, integer_64)
 
-  def testSingleScalarBoundsChecking(self):
+  def testIntegerTypes(self):
+    self.assertIntegerTypes(lambda x: x)
+
+  def testNonStandardIntegerTypes(self):
+    self.assertIntegerTypes(test_util.NonStandardInteger)
+
+  def testIllegalValuesForIntegers(self):
+    pb = unittest_pb2.TestAllTypes()
+
+    # Strings are illegal, even when they represent an integer.
+    with self.assertRaises(TypeError):
+      pb.optional_uint64 = '2'
+
+    # The exact error should propagate with a poorly written custom integer.
+    with self.assertRaisesRegexp(RuntimeError, 'my_error'):
+      pb.optional_uint64 = test_util.NonStandardInteger(5, 'my_error')
+
+  def assetIntegerBoundsChecking(self, integer_fn):
+    """Verifies bounds checking for scalar integer fields.
+
+    Args:
+      integer_fn: A function to wrap the integers that will be assigned.
+    """
     def TestMinAndMaxIntegers(field_name, expected_min, expected_max):
       pb = unittest_pb2.TestAllTypes()
+      expected_min = integer_fn(expected_min)
+      expected_max = integer_fn(expected_max)
       setattr(pb, field_name, expected_min)
       self.assertEqual(expected_min, getattr(pb, field_name))
       setattr(pb, field_name, expected_max)
       self.assertEqual(expected_max, getattr(pb, field_name))
-      self.assertRaises(ValueError, setattr, pb, field_name, expected_min - 1)
-      self.assertRaises(ValueError, setattr, pb, field_name, expected_max + 1)
+      self.assertRaises((ValueError, TypeError), setattr, pb, field_name,
+                        expected_min - 1)
+      self.assertRaises((ValueError, TypeError), setattr, pb, field_name,
+                        expected_max + 1)
 
     TestMinAndMaxIntegers('optional_int32', -(1 << 31), (1 << 31) - 1)
     TestMinAndMaxIntegers('optional_uint32', 0, 0xffffffff)
     TestMinAndMaxIntegers('optional_int64', -(1 << 63), (1 << 63) - 1)
     TestMinAndMaxIntegers('optional_uint64', 0, 0xffffffffffffffff)
+    # A bit of white-box testing since -1 is an int and not a long in C++ and
+    # so goes down a different path.
+    pb = unittest_pb2.TestAllTypes()
+    with self.assertRaises((ValueError, TypeError)):
+      pb.optional_uint64 = integer_fn(-(1 << 63))
 
     pb = unittest_pb2.TestAllTypes()
-    pb.optional_nested_enum = 1
+    pb.optional_nested_enum = integer_fn(1)
     self.assertEqual(1, pb.optional_nested_enum)
 
+  def testSingleScalarBoundsChecking(self):
+    self.assetIntegerBoundsChecking(lambda x: x)
+
+  def testNonStandardSingleScalarBoundsChecking(self):
+    self.assetIntegerBoundsChecking(test_util.NonStandardInteger)
+
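The bounds-checking tests above exercise range validation on 32- and 64-bit integer fields. A hypothetical sketch of the int32 case:

```python
INT32_MIN = -(1 << 31)
INT32_MAX = (1 << 31) - 1

def check_int32(value):
    # Hypothetical range check of the kind these tests exercise: values
    # outside the signed 32-bit range are rejected with ValueError.
    if not (INT32_MIN <= value <= INT32_MAX):
        raise ValueError('Value out of range: %d' % value)
    return value

ok = check_int32(INT32_MAX)
try:
    check_int32(INT32_MAX + 1)
    overflowed = False
except ValueError:
    overflowed = True
```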
   def testRepeatedScalarTypeSafety(self):
     proto = unittest_pb2.TestAllTypes()
     self.assertRaises(TypeError, proto.repeated_int32.append, 1.1)
@@ -668,6 +730,12 @@
     proto.repeated_int32[0] = 23
     self.assertRaises(IndexError, proto.repeated_int32.__setitem__, 500, 23)
     self.assertRaises(TypeError, proto.repeated_int32.__setitem__, 0, 'abc')
+    self.assertRaises(TypeError, proto.repeated_int32.__setitem__, 0, [])
+    self.assertRaises(TypeError, proto.repeated_int32.__setitem__,
+                      'index', 23)
+
+    proto.repeated_string.append('2')
+    self.assertRaises(TypeError, proto.repeated_string.__setitem__, 0, 10)
 
     # Repeated enums tests.
     #proto.repeated_nested_enum.append(0)
@@ -955,6 +1023,14 @@
     self.assertEqual(4, len(proto.repeated_nested_message))
     self.assertEqual(n1, proto.repeated_nested_message[2])
     self.assertEqual(n2, proto.repeated_nested_message[3])
+    self.assertRaises(TypeError,
+                      proto.repeated_nested_message.extend, n1)
+    self.assertRaises(TypeError,
+                      proto.repeated_nested_message.extend, [0])
+    wrong_message_type = unittest_pb2.TestAllTypes()
+    self.assertRaises(TypeError,
+                      proto.repeated_nested_message.extend,
+                      [wrong_message_type])
 
     # Test clearing.
     proto.ClearField('repeated_nested_message')
@@ -965,6 +1041,9 @@
     proto.repeated_nested_message.add(bb=23)
     self.assertEqual(1, len(proto.repeated_nested_message))
     self.assertEqual(23, proto.repeated_nested_message[0].bb)
+    self.assertRaises(TypeError, proto.repeated_nested_message.add, 23)
+    with self.assertRaises(Exception):
+      proto.repeated_nested_message[0] = 23
 
   def testRepeatedCompositeRemove(self):
     proto = unittest_pb2.TestAllTypes()
@@ -1175,12 +1254,18 @@
     self.assertTrue(not extendee_proto.HasExtension(extension))
 
   def testRegisteredExtensions(self):
-    self.assertTrue('protobuf_unittest.optional_int32_extension' in
-                    unittest_pb2.TestAllExtensions._extensions_by_name)
-    self.assertTrue(1 in unittest_pb2.TestAllExtensions._extensions_by_number)
+    pool = unittest_pb2.DESCRIPTOR.pool
+    self.assertTrue(
+        pool.FindExtensionByNumber(
+            unittest_pb2.TestAllExtensions.DESCRIPTOR, 1))
+    self.assertIs(
+        pool.FindExtensionByName(
+            'protobuf_unittest.optional_int32_extension').containing_type,
+        unittest_pb2.TestAllExtensions.DESCRIPTOR)
     # Make sure extensions haven't been registered into types that shouldn't
     # have any.
-    self.assertEqual(0, len(unittest_pb2.TestAllTypes._extensions_by_name))
+    self.assertEqual(0, len(
+        pool.FindAllExtensions(unittest_pb2.TestAllTypes.DESCRIPTOR)))
 
   # If message A directly contains message B, and
   # a.HasField('b') is currently False, then mutating any
@@ -1492,7 +1577,14 @@
     container = copy.deepcopy(proto1.repeated_int32)
     self.assertEqual([2, 3], container)
 
-    # TODO(anuraag): Implement deepcopy for repeated composite / extension dict
+    message1 = proto1.repeated_nested_message.add()
+    message1.bb = 1
+    messages = copy.deepcopy(proto1.repeated_nested_message)
+    self.assertEqual(proto1.repeated_nested_message, messages)
+    message1.bb = 2
+    self.assertNotEqual(proto1.repeated_nested_message, messages)
+
+    # TODO(anuraag): Implement deepcopy for extension dict
 
   def testClear(self):
     proto = unittest_pb2.TestAllTypes()
@@ -1544,6 +1636,20 @@
     self.assertFalse(proto.HasField('optional_foreign_message'))
     self.assertEqual(0, proto.optional_foreign_message.c)
 
+  def testDisconnectingInOneof(self):
+    m = unittest_pb2.TestOneof2()  # This message has two messages in a oneof.
+    m.foo_message.qux_int = 5
+    sub_message = m.foo_message
+    # Accessing another message's field does not clear the first one
+    self.assertEqual(m.foo_lazy_message.qux_int, 0)
+    self.assertEqual(m.foo_message.qux_int, 5)
+    # But mutating another message in the oneof detaches the first one.
+    m.foo_lazy_message.qux_int = 6
+    self.assertEqual(m.foo_message.qux_int, 0)
+    # The reference we got above was detached and is still valid.
+    self.assertEqual(sub_message.qux_int, 5)
+    sub_message.qux_int = 7
+
   def testOneOf(self):
     proto = unittest_pb2.TestAllTypes()
     proto.oneof_uint32 = 10
@@ -1562,8 +1668,11 @@
     proto.SerializeToString()
     proto.SerializePartialToString()
 
-  def assertNotInitialized(self, proto):
+  def assertNotInitialized(self, proto, error_size=None):
+    errors = []
     self.assertFalse(proto.IsInitialized())
+    self.assertFalse(proto.IsInitialized(errors))
+    self.assertEqual(error_size, len(errors))
     self.assertRaises(message.EncodeError, proto.SerializeToString)
     # "Partial" serialization doesn't care if message is uninitialized.
     proto.SerializePartialToString()
@@ -1577,7 +1686,7 @@
 
     # The case of uninitialized required fields.
     proto = unittest_pb2.TestRequired()
-    self.assertNotInitialized(proto)
+    self.assertNotInitialized(proto, 3)
     proto.a = proto.b = proto.c = 2
     self.assertInitialized(proto)
 
@@ -1585,14 +1694,14 @@
     proto = unittest_pb2.TestRequiredForeign()
     self.assertInitialized(proto)
     proto.optional_message.a = 1
-    self.assertNotInitialized(proto)
+    self.assertNotInitialized(proto, 2)
     proto.optional_message.b = 0
     proto.optional_message.c = 0
     self.assertInitialized(proto)
 
     # Uninitialized repeated submessage.
     message1 = proto.repeated_message.add()
-    self.assertNotInitialized(proto)
+    self.assertNotInitialized(proto, 3)
     message1.a = message1.b = message1.c = 0
     self.assertInitialized(proto)
 
@@ -1601,11 +1710,11 @@
     extension = unittest_pb2.TestRequired.multi
     message1 = proto.Extensions[extension].add()
     message2 = proto.Extensions[extension].add()
-    self.assertNotInitialized(proto)
+    self.assertNotInitialized(proto, 6)
     message1.a = 1
     message1.b = 1
     message1.c = 1
-    self.assertNotInitialized(proto)
+    self.assertNotInitialized(proto, 3)
     message2.a = 2
     message2.b = 2
     message2.c = 2
@@ -1615,7 +1724,7 @@
     proto = unittest_pb2.TestAllExtensions()
     extension = unittest_pb2.TestRequired.single
     proto.Extensions[extension].a = 1
-    self.assertNotInitialized(proto)
+    self.assertNotInitialized(proto, 2)
     proto.Extensions[extension].b = 2
     proto.Extensions[extension].c = 3
     self.assertInitialized(proto)
@@ -1802,7 +1911,7 @@
 #  into separate TestCase classes.
 
 
-class TestAllTypesEqualityTest(unittest.TestCase):
+class TestAllTypesEqualityTest(BaseTestCase):
 
   def setUp(self):
     self.first_proto = unittest_pb2.TestAllTypes()
@@ -1818,7 +1927,7 @@
     self.assertEqual(self.first_proto, self.second_proto)
 
 
-class FullProtosEqualityTest(unittest.TestCase):
+class FullProtosEqualityTest(BaseTestCase):
 
   """Equality tests using completely-full protos as a starting point."""
 
@@ -1904,7 +2013,7 @@
     self.assertEqual(self.first_proto, self.second_proto)
 
 
-class ExtensionEqualityTest(unittest.TestCase):
+class ExtensionEqualityTest(BaseTestCase):
 
   def testExtensionEquality(self):
     first_proto = unittest_pb2.TestAllExtensions()
@@ -1937,7 +2046,7 @@
     self.assertEqual(first_proto, second_proto)
 
 
-class MutualRecursionEqualityTest(unittest.TestCase):
+class MutualRecursionEqualityTest(BaseTestCase):
 
   def testEqualityWithMutualRecursion(self):
     first_proto = unittest_pb2.TestMutualRecursionA()
@@ -1949,7 +2058,7 @@
     self.assertEqual(first_proto, second_proto)
 
 
-class ByteSizeTest(unittest.TestCase):
+class ByteSizeTest(BaseTestCase):
 
   def setUp(self):
     self.proto = unittest_pb2.TestAllTypes()
@@ -2074,6 +2183,8 @@
     foreign_message_1 = self.proto.repeated_nested_message.add()
     foreign_message_1.bb = 9
     self.assertEqual(2 + 1 + 2 + 1 + 1 + 1, self.Size())
+    repeated_nested_message = copy.deepcopy(
+        self.proto.repeated_nested_message)
 
     # 2 bytes tag plus 1 byte length plus 1 byte bb tag 1 byte int.
     del self.proto.repeated_nested_message[0]
@@ -2094,6 +2205,16 @@
     del self.proto.repeated_nested_message[0]
     self.assertEqual(0, self.Size())
 
+    self.assertEqual(2, len(repeated_nested_message))
+    del repeated_nested_message[0:1]
+    # TODO(jieluo): Fix cpp extension bug when delete repeated message.
+    if api_implementation.Type() == 'python':
+      self.assertEqual(1, len(repeated_nested_message))
+    del repeated_nested_message[-1]
+    # TODO(jieluo): Fix cpp extension bug when delete repeated message.
+    if api_implementation.Type() == 'python':
+      self.assertEqual(0, len(repeated_nested_message))
+
   def testRepeatedGroups(self):
     # 2-byte START_GROUP plus 2-byte END_GROUP.
     group_0 = self.proto.repeatedgroup.add()
@@ -2110,6 +2231,10 @@
     proto.Extensions[extension] = 23
     # 1 byte for tag, 1 byte for value.
     self.assertEqual(2, proto.ByteSize())
+    field = unittest_pb2.TestAllTypes.DESCRIPTOR.fields_by_name[
+        'optional_int32']
+    with self.assertRaises(KeyError):
+      proto.Extensions[field] = 23
 
   def testCacheInvalidationForNonrepeatedScalar(self):
     # Test non-extension.
@@ -2245,7 +2370,7 @@
 #   * Handling of empty submessages (with and without "has"
 #     bits set).
 
-class SerializationTest(unittest.TestCase):
+class SerializationTest(BaseTestCase):
 
   def testSerializeEmtpyMessage(self):
     first_proto = unittest_pb2.TestAllTypes()
@@ -2806,7 +2931,7 @@
     self.assertEqual(3, proto.repeated_int32[2])
 
 
-class OptionsTest(unittest.TestCase):
+class OptionsTest(BaseTestCase):
 
   def testMessageOptions(self):
     proto = message_set_extensions_pb2.TestMessageSet()
@@ -2833,7 +2958,7 @@
 
 
 
-class ClassAPITest(unittest.TestCase):
+class ClassAPITest(BaseTestCase):
 
   @unittest.skipIf(
       api_implementation.Type() == 'cpp' and api_implementation.Version() == 2,
@@ -2916,6 +3041,9 @@
     text_format.Merge(file_descriptor_str, file_descriptor)
     return file_descriptor.SerializeToString()
 
+  # This test can only run once; the second time, it raises errors about
+  # conflicting message descriptors.
+  @testing_refleaks.SkipReferenceLeakChecker('MakeDescriptor is not repeatable')
   def testParsingFlatClassWithExplicitClassDeclaration(self):
     """Test that the generated class can parse a flat message."""
    # TODO(xiaofeng): This test fails with cpp implementation in the call
@@ -2940,6 +3068,7 @@
     text_format.Merge(msg_str, msg)
     self.assertEqual(msg.flat, [0, 1, 2])
 
+  @testing_refleaks.SkipReferenceLeakChecker('MakeDescriptor is not repeatable')
   def testParsingFlatClass(self):
     """Test that the generated class can parse a flat message."""
     file_descriptor = descriptor_pb2.FileDescriptorProto()
@@ -2955,6 +3084,7 @@
     text_format.Merge(msg_str, msg)
     self.assertEqual(msg.flat, [0, 1, 2])
 
+  @testing_refleaks.SkipReferenceLeakChecker('MakeDescriptor is not repeatable')
   def testParsingNestedClass(self):
     """Test that the generated class can parse a nested message."""
     file_descriptor = descriptor_pb2.FileDescriptorProto()
diff --git a/python/google/protobuf/internal/service_reflection_test.py b/python/google/protobuf/internal/service_reflection_test.py
index 98614b7..77239f4 100755
--- a/python/google/protobuf/internal/service_reflection_test.py
+++ b/python/google/protobuf/internal/service_reflection_test.py
@@ -34,10 +34,12 @@
 
 __author__ = 'petar@google.com (Petar Petrov)'
 
+
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import unittest_pb2
 from google.protobuf import service_reflection
 from google.protobuf import service
@@ -80,6 +82,10 @@
     service_descriptor = unittest_pb2.TestService.GetDescriptor()
     srvc.CallMethod(service_descriptor.methods[1], rpc_controller,
                     unittest_pb2.BarRequest(), MyCallback)
+    self.assertTrue(srvc.GetRequestClass(service_descriptor.methods[1]) is
+                    unittest_pb2.BarRequest)
+    self.assertTrue(srvc.GetResponseClass(service_descriptor.methods[1]) is
+                    unittest_pb2.BarResponse)
     self.assertEqual('Method Bar not implemented.',
                      rpc_controller.failure_message)
     self.assertEqual(None, self.callback_response)
diff --git a/python/google/protobuf/internal/symbol_database_test.py b/python/google/protobuf/internal/symbol_database_test.py
index 0cb935a..af42681 100644
--- a/python/google/protobuf/internal/symbol_database_test.py
+++ b/python/google/protobuf/internal/symbol_database_test.py
@@ -33,31 +33,35 @@
 """Tests for google.protobuf.symbol_database."""
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import unittest_pb2
 from google.protobuf import descriptor
+from google.protobuf import descriptor_pool
 from google.protobuf import symbol_database
 
+
 class SymbolDatabaseTest(unittest.TestCase):
 
   def _Database(self):
-    # TODO(b/17734095): Remove this difference when the C++ implementation
-    # supports multiple databases.
     if descriptor._USE_C_DESCRIPTORS:
-      return symbol_database.Default()
+      # The C++ implementation does not allow mixing descriptors from
+      # different pools.
+      db = symbol_database.SymbolDatabase(pool=descriptor_pool.Default())
     else:
       db = symbol_database.SymbolDatabase()
-      # Register representative types from unittest_pb2.
-      db.RegisterFileDescriptor(unittest_pb2.DESCRIPTOR)
-      db.RegisterMessage(unittest_pb2.TestAllTypes)
-      db.RegisterMessage(unittest_pb2.TestAllTypes.NestedMessage)
-      db.RegisterMessage(unittest_pb2.TestAllTypes.OptionalGroup)
-      db.RegisterMessage(unittest_pb2.TestAllTypes.RepeatedGroup)
-      db.RegisterEnumDescriptor(unittest_pb2.ForeignEnum.DESCRIPTOR)
-      db.RegisterEnumDescriptor(unittest_pb2.TestAllTypes.NestedEnum.DESCRIPTOR)
-      return db
+    # Register representative types from unittest_pb2.
+    db.RegisterFileDescriptor(unittest_pb2.DESCRIPTOR)
+    db.RegisterMessage(unittest_pb2.TestAllTypes)
+    db.RegisterMessage(unittest_pb2.TestAllTypes.NestedMessage)
+    db.RegisterMessage(unittest_pb2.TestAllTypes.OptionalGroup)
+    db.RegisterMessage(unittest_pb2.TestAllTypes.RepeatedGroup)
+    db.RegisterEnumDescriptor(unittest_pb2.ForeignEnum.DESCRIPTOR)
+    db.RegisterEnumDescriptor(unittest_pb2.TestAllTypes.NestedEnum.DESCRIPTOR)
+    db.RegisterServiceDescriptor(unittest_pb2._TESTSERVICE)
+    return db
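The fixture above registers messages, enums, and a service into a SymbolDatabase keyed by fully qualified name. A hypothetical miniature of that registration pattern:

```python
class RegistryDemo:
    # Hypothetical miniature of the symbol-database registration above:
    # map fully qualified names to message classes.
    def __init__(self):
        self._messages = {}

    def register_message(self, full_name, cls):
        self._messages[full_name] = cls
        return cls

    def get_prototype(self, full_name):
        # Lookup by the same fully qualified name used at registration.
        return self._messages[full_name]

class FakeMessage(object):
    pass

db = RegistryDemo()
db.register_message('protobuf_unittest.TestAllTypes', FakeMessage)
proto_cls = db.get_prototype('protobuf_unittest.TestAllTypes')
```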
 
   def testGetPrototype(self):
     instance = self._Database().GetPrototype(
@@ -106,7 +110,13 @@
         self._Database().pool.FindMessageTypeByName(
             'protobuf_unittest.TestAllTypes.NestedMessage').full_name)
 
-  def testFindFindContainingSymbol(self):
+  def testFindServiceByName(self):
+    self.assertEqual(
+        'protobuf_unittest.TestService',
+        self._Database().pool.FindServiceByName(
+            'protobuf_unittest.TestService').full_name)
+
+  def testFindFileContainingSymbol(self):
     # Lookup based on either enum or message.
     self.assertEqual(
         'google/protobuf/unittest.proto',
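The hunk above registers messages, enums, and a service into a `SymbolDatabase` keyed by fully-qualified name, then looks them up via `pool.Find*ByName`. A stdlib-only sketch of that registry pattern (names here — `SymbolRegistry`, `register`, `find` — are illustrative, not the protobuf API):

```python
class SymbolRegistry:
    """Minimal sketch of a symbol database: map fully-qualified proto
    names to their Python classes, so callers can recover a class from
    a name alone (cf. GetPrototype / FindServiceByName above)."""

    def __init__(self):
        self._symbols = {}  # full_name -> registered object

    def register(self, full_name, obj):
        self._symbols[full_name] = obj
        return obj

    def find(self, full_name):
        # Mirrors lookups like FindMessageTypeByName / FindServiceByName.
        return self._symbols[full_name]


registry = SymbolRegistry()


class TestAllTypes:  # stand-in for a generated message class
    pass


registry.register('protobuf_unittest.TestAllTypes', TestAllTypes)
```

The real `SymbolDatabase` additionally shares a `DescriptorPool`, which is why the C++ path above must build on `descriptor_pool.Default()` rather than a fresh pool.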
diff --git a/python/google/protobuf/internal/test_util.py b/python/google/protobuf/internal/test_util.py
index ac88fa8..a6e34ef 100755
--- a/python/google/protobuf/internal/test_util.py
+++ b/python/google/protobuf/internal/test_util.py
@@ -36,11 +36,18 @@
 
 __author__ = 'robinson@google.com (Will Robinson)'
 
+import numbers
+import operator
 import os.path
 
 from google.protobuf import unittest_import_pb2
 from google.protobuf import unittest_pb2
-from google.protobuf import descriptor_pb2
+
+try:
+  long        # Python 2
+except NameError:
+  long = int  # Python 3
+
 
 # Tests whether the given TestAllTypes message is proto2 or not.
 # This is used to gate several fields/features that only exist
@@ -48,6 +55,7 @@
 def IsProto2(message):
   return message.DESCRIPTOR.syntax == "proto2"
 
+
 def SetAllNonLazyFields(message):
   """Sets every non-lazy field in the message to a unique value.
 
@@ -125,22 +133,37 @@
   message.repeated_string_piece.append(u'224')
   message.repeated_cord.append(u'225')
 
-  # Add a second one of each field.
-  message.repeated_int32.append(301)
-  message.repeated_int64.append(302)
-  message.repeated_uint32.append(303)
-  message.repeated_uint64.append(304)
-  message.repeated_sint32.append(305)
-  message.repeated_sint64.append(306)
-  message.repeated_fixed32.append(307)
-  message.repeated_fixed64.append(308)
-  message.repeated_sfixed32.append(309)
-  message.repeated_sfixed64.append(310)
-  message.repeated_float.append(311)
-  message.repeated_double.append(312)
-  message.repeated_bool.append(False)
-  message.repeated_string.append(u'315')
-  message.repeated_bytes.append(b'316')
+  # Add a second one of each field and set value by index.
+  message.repeated_int32.append(0)
+  message.repeated_int64.append(0)
+  message.repeated_uint32.append(0)
+  message.repeated_uint64.append(0)
+  message.repeated_sint32.append(0)
+  message.repeated_sint64.append(0)
+  message.repeated_fixed32.append(0)
+  message.repeated_fixed64.append(0)
+  message.repeated_sfixed32.append(0)
+  message.repeated_sfixed64.append(0)
+  message.repeated_float.append(0)
+  message.repeated_double.append(0)
+  message.repeated_bool.append(True)
+  message.repeated_string.append(u'0')
+  message.repeated_bytes.append(b'0')
+  message.repeated_int32[1] = 301
+  message.repeated_int64[1] = 302
+  message.repeated_uint32[1] = 303
+  message.repeated_uint64[1] = 304
+  message.repeated_sint32[1] = 305
+  message.repeated_sint64[1] = 306
+  message.repeated_fixed32[1] = 307
+  message.repeated_fixed64[1] = 308
+  message.repeated_sfixed32[1] = 309
+  message.repeated_sfixed64[1] = 310
+  message.repeated_float[1] = 311
+  message.repeated_double[1] = 312
+  message.repeated_bool[1] = False
+  message.repeated_string[1] = u'315'
+  message.repeated_bytes[1] = b'316'
 
   if IsProto2(message):
     message.repeatedgroup.add().a = 317
@@ -149,7 +172,8 @@
   message.repeated_import_message.add().d = 320
   message.repeated_lazy_message.add().bb = 327
 
-  message.repeated_nested_enum.append(unittest_pb2.TestAllTypes.BAZ)
+  message.repeated_nested_enum.append(unittest_pb2.TestAllTypes.BAR)
+  message.repeated_nested_enum[1] = unittest_pb2.TestAllTypes.BAZ
   message.repeated_foreign_enum.append(unittest_pb2.FOREIGN_BAZ)
   if IsProto2(message):
     message.repeated_import_enum.append(unittest_import_pb2.IMPORT_BAZ)
@@ -692,3 +716,153 @@
   message.unpacked_bool.extend([True, False])
   message.unpacked_enum.extend([unittest_pb2.FOREIGN_BAR,
                                 unittest_pb2.FOREIGN_BAZ])
+
+
+class NonStandardInteger(numbers.Integral):
+  """An integer object that does not subclass int.
+
+  This is used to verify that both C++ and regular proto systems can handle
+  integers other than int and long and that they handle them in predictable
+  ways.
+
+  NonStandardInteger is the minimal legal specification for a custom Integral.
+  As such, it does not support 0 < x < 5 and it is not hashable.
+
+  Note: This is added here instead of relying on numpy or a similar library
+  with custom integers to limit dependencies.
+  """
+
+  def __init__(self, val, error_string_on_conversion=None):
+    assert isinstance(val, numbers.Integral)
+    if isinstance(val, NonStandardInteger):
+      val = val.val
+    self.val = val
+    self.error_string_on_conversion = error_string_on_conversion
+
+  def __long__(self):
+    if self.error_string_on_conversion:
+      raise RuntimeError(self.error_string_on_conversion)
+    return long(self.val)
+
+  def __abs__(self):
+    return NonStandardInteger(operator.abs(self.val))
+
+  def __add__(self, y):
+    return NonStandardInteger(operator.add(self.val, y))
+
+  def __div__(self, y):
+    return NonStandardInteger(operator.div(self.val, y))
+
+  def __eq__(self, y):
+    return operator.eq(self.val, y)
+
+  def __floordiv__(self, y):
+    return NonStandardInteger(operator.floordiv(self.val, y))
+
+  def __truediv__(self, y):
+    return NonStandardInteger(operator.truediv(self.val, y))
+
+  def __invert__(self):
+    return NonStandardInteger(operator.invert(self.val))
+
+  def __mod__(self, y):
+    return NonStandardInteger(operator.mod(self.val, y))
+
+  def __mul__(self, y):
+    return NonStandardInteger(operator.mul(self.val, y))
+
+  def __neg__(self):
+    return NonStandardInteger(operator.neg(self.val))
+
+  def __pos__(self):
+    return NonStandardInteger(operator.pos(self.val))
+
+  def __pow__(self, y):
+    return NonStandardInteger(operator.pow(self.val, y))
+
+  def __trunc__(self):
+    return int(self.val)
+
+  def __radd__(self, y):
+    return NonStandardInteger(operator.add(y, self.val))
+
+  def __rdiv__(self, y):
+    return NonStandardInteger(operator.div(y, self.val))
+
+  def __rmod__(self, y):
+    return NonStandardInteger(operator.mod(y, self.val))
+
+  def __rmul__(self, y):
+    return NonStandardInteger(operator.mul(y, self.val))
+
+  def __rpow__(self, y):
+    return NonStandardInteger(operator.pow(y, self.val))
+
+  def __rfloordiv__(self, y):
+    return NonStandardInteger(operator.floordiv(y, self.val))
+
+  def __rtruediv__(self, y):
+    return NonStandardInteger(operator.truediv(y, self.val))
+
+  def __lshift__(self, y):
+    return NonStandardInteger(operator.lshift(self.val, y))
+
+  def __rshift__(self, y):
+    return NonStandardInteger(operator.rshift(self.val, y))
+
+  def __rlshift__(self, y):
+    return NonStandardInteger(operator.lshift(y, self.val))
+
+  def __rrshift__(self, y):
+    return NonStandardInteger(operator.rshift(y, self.val))
+
+  def __le__(self, y):
+    if isinstance(y, NonStandardInteger):
+      y = y.val
+    return operator.le(self.val, y)
+
+  def __lt__(self, y):
+    if isinstance(y, NonStandardInteger):
+      y = y.val
+    return operator.lt(self.val, y)
+
+  def __and__(self, y):
+    return NonStandardInteger(operator.and_(self.val, y))
+
+  def __or__(self, y):
+    return NonStandardInteger(operator.or_(self.val, y))
+
+  def __xor__(self, y):
+    return NonStandardInteger(operator.xor(self.val, y))
+
+  def __rand__(self, y):
+    return NonStandardInteger(operator.and_(y, self.val))
+
+  def __ror__(self, y):
+    return NonStandardInteger(operator.or_(y, self.val))
+
+  def __rxor__(self, y):
+    return NonStandardInteger(operator.xor(y, self.val))
+
+  def __bool__(self):
+    return self.val
+
+  def __nonzero__(self):
+    return self.val
+
+  def __ceil__(self):
+    return self
+
+  def __floor__(self):
+    return self
+
+  def __int__(self):
+    if self.error_string_on_conversion:
+      raise RuntimeError(self.error_string_on_conversion)
+    return int(self.val)
+
+  def __round__(self):
+    return self
+
+  def __repr__(self):
+    return 'NonStandardInteger(%s)' % self.val
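`NonStandardInteger` exists to verify that proto field setters accept any `numbers.Integral`, not just `int` and `long`. A minimal, hypothetical sketch of the coercion such a setter performs — convert first via `int()` (which routes through `__int__`, the hook `NonStandardInteger` can veto), then range-check — independent of protobuf itself:

```python
import numbers


def to_int32(value):
    """Hypothetical int32 setter helper, not part of protobuf:
    accept any numbers.Integral, convert, then range-check."""
    if not isinstance(value, numbers.Integral):
        raise TypeError('expected an integral value, got %r' % (value,))
    value = int(value)  # works for any Integral, may raise via __int__
    if not -2**31 <= value <= 2**31 - 1:
        raise ValueError('int32 out of range: %d' % value)
    return value
```

Note that `bool` is also an `Integral`, so `to_int32(True)` coerces to `1` — the same behavior the proto runtime relies on for repeated bool/int interplay.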
diff --git a/python/google/protobuf/internal/testing_refleaks.py b/python/google/protobuf/internal/testing_refleaks.py
new file mode 100644
index 0000000..8ce0651
--- /dev/null
+++ b/python/google/protobuf/internal/testing_refleaks.py
@@ -0,0 +1,126 @@
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# https://developers.google.com/protocol-buffers/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""A subclass of unittest.TestCase which checks for reference leaks.
+
+To use:
+- Use testing_refleaks.BaseTestCase instead of unittest.TestCase
+- Configure and compile Python with --with-pydebug
+
+If sys.gettotalrefcount() is not available (because Python was built without
+the Py_DEBUG option), then this module is a no-op and tests will run normally.
+"""
+
+import gc
+import sys
+
+try:
+  import copy_reg as copyreg  #PY26
+except ImportError:
+  import copyreg
+
+try:
+  import unittest2 as unittest  #PY26
+except ImportError:
+  import unittest
+
+
+class LocalTestResult(unittest.TestResult):
+  """A TestResult which forwards events to a parent object, except for Skips."""
+
+  def __init__(self, parent_result):
+    unittest.TestResult.__init__(self)
+    self.parent_result = parent_result
+
+  def addError(self, test, error):
+    self.parent_result.addError(test, error)
+
+  def addFailure(self, test, error):
+    self.parent_result.addFailure(test, error)
+
+  def addSkip(self, test, reason):
+    pass
+
+
+class ReferenceLeakCheckerTestCase(unittest.TestCase):
+  """A TestCase which runs tests multiple times, collecting reference counts."""
+
+  NB_RUNS = 3
+
+  def run(self, result=None):
+    # python_message.py registers all Message classes to some pickle global
+    # registry, which makes the classes immortal.
+    # We save a copy of this registry, and reset it before we count references.
+    self._saved_pickle_registry = copyreg.dispatch_table.copy()
+
+    # Run the test twice, to warm up the instance attributes.
+    super(ReferenceLeakCheckerTestCase, self).run(result=result)
+    super(ReferenceLeakCheckerTestCase, self).run(result=result)
+
+    oldrefcount = 0
+    local_result = LocalTestResult(result)
+
+    refcount_deltas = []
+    for _ in range(self.NB_RUNS):
+      oldrefcount = self._getRefcounts()
+      super(ReferenceLeakCheckerTestCase, self).run(result=local_result)
+      newrefcount = self._getRefcounts()
+      refcount_deltas.append(newrefcount - oldrefcount)
+    print(refcount_deltas, self)
+
+    try:
+      self.assertEqual(refcount_deltas, [0] * self.NB_RUNS)
+    except Exception:  # pylint: disable=broad-except
+      result.addError(self, sys.exc_info())
+
+  def _getRefcounts(self):
+    copyreg.dispatch_table.clear()
+    copyreg.dispatch_table.update(self._saved_pickle_registry)
+    # It is sometimes necessary to gc.collect() multiple times, to ensure
+    # that all objects can be collected.
+    gc.collect()
+    gc.collect()
+    gc.collect()
+    return sys.gettotalrefcount()
+
+
+if hasattr(sys, 'gettotalrefcount'):
+  BaseTestCase = ReferenceLeakCheckerTestCase
+  SkipReferenceLeakChecker = unittest.skip
+
+else:
+  # When PyDEBUG is not enabled, run the tests normally.
+  BaseTestCase = unittest.TestCase
+
+  def SkipReferenceLeakChecker(reason):
+    del reason  # Don't skip, so don't need a reason.
+    def Same(func):
+      return func
+    return Same
diff --git a/python/google/protobuf/internal/text_encoding_test.py b/python/google/protobuf/internal/text_encoding_test.py
index 338a287..c7d182c 100755
--- a/python/google/protobuf/internal/text_encoding_test.py
+++ b/python/google/protobuf/internal/text_encoding_test.py
@@ -33,9 +33,10 @@
 """Tests for google.protobuf.text_encoding."""
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import text_encoding
 
 TEST_VALUES = [
diff --git a/python/google/protobuf/internal/text_format_test.py b/python/google/protobuf/internal/text_format_test.py
index 0e14556..237a2d5 100755
--- a/python/google/protobuf/internal/text_format_test.py
+++ b/python/google/protobuf/internal/text_format_test.py
@@ -1,4 +1,5 @@
 #! /usr/bin/env python
+# -*- coding: utf-8 -*-
 #
 # Protocol Buffers - Google's data interchange format
 # Copyright 2008 Google Inc.  All rights reserved.
@@ -35,23 +36,29 @@
 __author__ = 'kenton@google.com (Kenton Varda)'
 
 
+import math
 import re
 import six
 import string
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  # PY26, pylint: disable=g-import-not-at-top
 except ImportError:
-  import unittest
+  import unittest  # pylint: disable=g-import-not-at-top
+
 from google.protobuf.internal import _parameterized
 
+from google.protobuf import any_pb2
+from google.protobuf import any_test_pb2
 from google.protobuf import map_unittest_pb2
 from google.protobuf import unittest_mset_pb2
 from google.protobuf import unittest_pb2
 from google.protobuf import unittest_proto3_arena_pb2
 from google.protobuf.internal import api_implementation
-from google.protobuf.internal import test_util
+from google.protobuf.internal import any_test_pb2 as test_extend_any
 from google.protobuf.internal import message_set_extensions_pb2
+from google.protobuf.internal import test_util
+from google.protobuf import descriptor_pool
 from google.protobuf import text_format
 
 
@@ -62,7 +69,7 @@
   # expects single characters.  Therefore it's an error (in addition to being
   # non-sensical in the first place) to try to specify a "quote mark" that is
   # more than one character.
-  def TestQuoteMarksAreSingleChars(self):
+  def testQuoteMarksAreSingleChars(self):
     for quote in text_format._QUOTES:
       self.assertEqual(1, len(quote))
 
@@ -89,13 +96,11 @@
                .replace('e-0','e-').replace('e-0','e-')
     # Floating point fields are printed with .0 suffix even if they are
     # actually integer numbers.
-    text = re.compile('\.0$', re.MULTILINE).sub('', text)
+    text = re.compile(r'\.0$', re.MULTILINE).sub('', text)
     return text
 
 
-@_parameterized.Parameters(
-    (unittest_pb2),
-    (unittest_proto3_arena_pb2))
+@_parameterized.parameters((unittest_pb2), (unittest_proto3_arena_pb2))
 class TextFormatTest(TextFormatBase):
 
   def testPrintExotic(self, message_module):
@@ -119,8 +124,10 @@
         'repeated_string: "\\303\\274\\352\\234\\237"\n')
 
   def testPrintExoticUnicodeSubclass(self, message_module):
+
     class UnicodeSub(six.text_type):
       pass
+
     message = message_module.TestAllTypes()
     message.repeated_string.append(UnicodeSub(u'\u00fc\ua71f'))
     self.CompareToGoldenText(
@@ -164,8 +171,8 @@
     message.repeated_string.append('\000\001\a\b\f\n\r\t\v\\\'"')
     message.repeated_string.append(u'\u00fc\ua71f')
     self.CompareToGoldenText(
-        self.RemoveRedundantZeros(
-            text_format.MessageToString(message, as_one_line=True)),
+        self.RemoveRedundantZeros(text_format.MessageToString(
+            message, as_one_line=True)),
         'repeated_int64: -9223372036854775808'
         ' repeated_uint64: 18446744073709551615'
         ' repeated_double: 123.456'
@@ -186,21 +193,23 @@
     message.repeated_string.append(u'\u00fc\ua71f')
 
     # Test as_utf8 = False.
-    wire_text = text_format.MessageToString(
-        message, as_one_line=True, as_utf8=False)
+    wire_text = text_format.MessageToString(message,
+                                            as_one_line=True,
+                                            as_utf8=False)
     parsed_message = message_module.TestAllTypes()
     r = text_format.Parse(wire_text, parsed_message)
     self.assertIs(r, parsed_message)
     self.assertEqual(message, parsed_message)
 
     # Test as_utf8 = True.
-    wire_text = text_format.MessageToString(
-        message, as_one_line=True, as_utf8=True)
+    wire_text = text_format.MessageToString(message,
+                                            as_one_line=True,
+                                            as_utf8=True)
     parsed_message = message_module.TestAllTypes()
     r = text_format.Parse(wire_text, parsed_message)
     self.assertIs(r, parsed_message)
     self.assertEqual(message, parsed_message,
-                      '\n%s != %s' % (message, parsed_message))
+                     '\n%s != %s' % (message, parsed_message))
 
   def testPrintRawUtf8String(self, message_module):
     message = message_module.TestAllTypes()
@@ -210,7 +219,7 @@
     parsed_message = message_module.TestAllTypes()
     text_format.Parse(text, parsed_message)
     self.assertEqual(message, parsed_message,
-                      '\n%s != %s' % (message, parsed_message))
+                     '\n%s != %s' % (message, parsed_message))
 
   def testPrintFloatFormat(self, message_module):
     # Check that float_format argument is passed to sub-message formatting.
@@ -231,14 +240,15 @@
     message.payload.repeated_double.append(.000078900)
     formatted_fields = ['optional_float: 1.25',
                         'optional_double: -3.45678901234568e-6',
-                        'repeated_float: -5642',
-                        'repeated_double: 7.89e-5']
+                        'repeated_float: -5642', 'repeated_double: 7.89e-5']
     text_message = text_format.MessageToString(message, float_format='.15g')
     self.CompareToGoldenText(
         self.RemoveRedundantZeros(text_message),
-        'payload {{\n  {0}\n  {1}\n  {2}\n  {3}\n}}\n'.format(*formatted_fields))
+        'payload {{\n  {0}\n  {1}\n  {2}\n  {3}\n}}\n'.format(
+            *formatted_fields))
     # as_one_line=True is a separate code branch where float_format is passed.
-    text_message = text_format.MessageToString(message, as_one_line=True,
+    text_message = text_format.MessageToString(message,
+                                               as_one_line=True,
                                                float_format='.15g')
     self.CompareToGoldenText(
         self.RemoveRedundantZeros(text_message),
@@ -249,6 +259,36 @@
     message.c = 123
     self.assertEqual('c: 123\n', str(message))
 
+  def testPrintField(self, message_module):
+    message = message_module.TestAllTypes()
+    field = message.DESCRIPTOR.fields_by_name['optional_float']
+    value = message.optional_float
+    out = text_format.TextWriter(False)
+    text_format.PrintField(field, value, out)
+    self.assertEqual('optional_float: 0.0\n', out.getvalue())
+    out.close()
+    # Test Printer
+    out = text_format.TextWriter(False)
+    printer = text_format._Printer(out)
+    printer.PrintField(field, value)
+    self.assertEqual('optional_float: 0.0\n', out.getvalue())
+    out.close()
+
+  def testPrintFieldValue(self, message_module):
+    message = message_module.TestAllTypes()
+    field = message.DESCRIPTOR.fields_by_name['optional_float']
+    value = message.optional_float
+    out = text_format.TextWriter(False)
+    text_format.PrintFieldValue(field, value, out)
+    self.assertEqual('0.0', out.getvalue())
+    out.close()
+    # Test Printer
+    out = text_format.TextWriter(False)
+    printer = text_format._Printer(out)
+    printer.PrintFieldValue(field, value)
+    self.assertEqual('0.0', out.getvalue())
+    out.close()
+
   def testParseAllFields(self, message_module):
     message = message_module.TestAllTypes()
     test_util.SetAllFields(message)
@@ -260,6 +300,33 @@
     if message_module is unittest_pb2:
       test_util.ExpectAllFieldsSet(self, message)
 
+  def testParseAndMergeUtf8(self, message_module):
+    message = message_module.TestAllTypes()
+    test_util.SetAllFields(message)
+    ascii_text = text_format.MessageToString(message)
+    ascii_text = ascii_text.encode('utf-8')
+
+    parsed_message = message_module.TestAllTypes()
+    text_format.Parse(ascii_text, parsed_message)
+    self.assertEqual(message, parsed_message)
+    if message_module is unittest_pb2:
+      test_util.ExpectAllFieldsSet(self, message)
+
+    parsed_message.Clear()
+    text_format.Merge(ascii_text, parsed_message)
+    self.assertEqual(message, parsed_message)
+    if message_module is unittest_pb2:
+      test_util.ExpectAllFieldsSet(self, message)
+
+    if six.PY2:
+      msg2 = message_module.TestAllTypes()
+      text = (u'optional_string: "café"')
+      text_format.Merge(text, msg2)
+      self.assertEqual(msg2.optional_string, u'café')
+      msg2.Clear()
+      text_format.Parse(text, msg2)
+      self.assertEqual(msg2.optional_string, u'café')
+
   def testParseExotic(self, message_module):
     message = message_module.TestAllTypes()
     text = ('repeated_int64: -9223372036854775808\n'
@@ -280,8 +347,7 @@
     self.assertEqual(123.456, message.repeated_double[0])
     self.assertEqual(1.23e22, message.repeated_double[1])
     self.assertEqual(1.23e-18, message.repeated_double[2])
-    self.assertEqual(
-        '\000\001\a\b\f\n\r\t\v\\\'"', message.repeated_string[0])
+    self.assertEqual('\000\001\a\b\f\n\r\t\v\\\'"', message.repeated_string[0])
     self.assertEqual('foocorgegrault', message.repeated_string[1])
     self.assertEqual(u'\u00fc\ua71f', message.repeated_string[2])
     self.assertEqual(u'\u00fc', message.repeated_string[3])
@@ -304,6 +370,7 @@
   def testParseRepeatedScalarShortFormat(self, message_module):
     message = message_module.TestAllTypes()
     text = ('repeated_int64: [100, 200];\n'
+            'repeated_int64: []\n'
             'repeated_int64: 300,\n'
             'repeated_string: ["one", "two"];\n')
     text_format.Parse(text, message)
@@ -314,6 +381,18 @@
     self.assertEqual(u'one', message.repeated_string[0])
     self.assertEqual(u'two', message.repeated_string[1])
 
+  def testParseRepeatedMessageShortFormat(self, message_module):
+    message = message_module.TestAllTypes()
+    text = ('repeated_nested_message: [{bb: 100}, {bb: 200}],\n'
+            'repeated_nested_message: {bb: 300}\n'
+            'repeated_nested_message [{bb: 400}];\n')
+    text_format.Parse(text, message)
+
+    self.assertEqual(100, message.repeated_nested_message[0].bb)
+    self.assertEqual(200, message.repeated_nested_message[1].bb)
+    self.assertEqual(300, message.repeated_nested_message[2].bb)
+    self.assertEqual(400, message.repeated_nested_message[3].bb)
+
   def testParseEmptyText(self, message_module):
     message = message_module.TestAllTypes()
     text = ''
@@ -323,50 +402,39 @@
   def testParseInvalidUtf8(self, message_module):
     message = message_module.TestAllTypes()
     text = 'repeated_string: "\\xc3\\xc3"'
-    self.assertRaises(text_format.ParseError, text_format.Parse, text, message)
+    with self.assertRaises(text_format.ParseError) as e:
+      text_format.Parse(text, message)
+    self.assertEqual(e.exception.GetLine(), 1)
+    self.assertEqual(e.exception.GetColumn(), 28)
 
   def testParseSingleWord(self, message_module):
     message = message_module.TestAllTypes()
     text = 'foo'
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        (r'1:1 : Message type "\w+.TestAllTypes" has no field named '
-         r'"foo".'),
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError, (
+        r'1:1 : Message type "\w+.TestAllTypes" has no field named '
+        r'"foo".'), text_format.Parse, text, message)
 
   def testParseUnknownField(self, message_module):
     message = message_module.TestAllTypes()
     text = 'unknown_field: 8\n'
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        (r'1:1 : Message type "\w+.TestAllTypes" has no field named '
-         r'"unknown_field".'),
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError, (
+        r'1:1 : Message type "\w+.TestAllTypes" has no field named '
+        r'"unknown_field".'), text_format.Parse, text, message)
 
   def testParseBadEnumValue(self, message_module):
     message = message_module.TestAllTypes()
     text = 'optional_nested_enum: BARR'
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        (r'1:23 : Enum type "\w+.TestAllTypes.NestedEnum" '
-         r'has no value named BARR.'),
-        text_format.Parse, text, message)
-
-    message = message_module.TestAllTypes()
-    text = 'optional_nested_enum: 100'
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        (r'1:23 : Enum type "\w+.TestAllTypes.NestedEnum" '
-         r'has no value with number 100.'),
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError,
+                          (r'1:23 : Enum type "\w+.TestAllTypes.NestedEnum" '
+                           r'has no value named BARR.'), text_format.Parse,
+                          text, message)
 
   def testParseBadIntValue(self, message_module):
     message = message_module.TestAllTypes()
     text = 'optional_int32: bork'
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        ('1:17 : Couldn\'t parse integer: bork'),
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError,
+                          ('1:17 : Couldn\'t parse integer: bork'),
+                          text_format.Parse, text, message)
 
   def testParseStringFieldUnescape(self, message_module):
     message = message_module.TestAllTypes()
@@ -376,6 +444,7 @@
                repeated_string: "\\\\xf\\\\x62"
                repeated_string: "\\\\\xf\\\\\x62"
                repeated_string: "\x5cx20"'''
+
     text_format.Parse(text, message)
 
     SLASH = '\\'
@@ -390,8 +459,7 @@
 
   def testMergeDuplicateScalars(self, message_module):
     message = message_module.TestAllTypes()
-    text = ('optional_int32: 42 '
-            'optional_int32: 67')
+    text = ('optional_int32: 42 ' 'optional_int32: 67')
     r = text_format.Merge(text, message)
     self.assertIs(r, message)
     self.assertEqual(67, message.optional_int32)
@@ -411,6 +479,19 @@
     text_format.Parse(text_format.MessageToString(m), m2)
     self.assertEqual('oneof_uint32', m2.WhichOneof('oneof_field'))
 
+  def testMergeMultipleOneof(self, message_module):
+    m_string = '\n'.join(['oneof_uint32: 11', 'oneof_string: "foo"'])
+    m2 = message_module.TestAllTypes()
+    text_format.Merge(m_string, m2)
+    self.assertEqual('oneof_string', m2.WhichOneof('oneof_field'))
+
+  def testParseMultipleOneof(self, message_module):
+    m_string = '\n'.join(['oneof_uint32: 11', 'oneof_string: "foo"'])
+    m2 = message_module.TestAllTypes()
+    with self.assertRaisesRegexp(text_format.ParseError,
+                                 ' is specified along with field '):
+      text_format.Parse(m_string, m2)
+
 
 # These are tests that aren't fundamentally specific to proto2, but are at
 # the moment because of differences between the proto2 and proto3 test schemas.
@@ -421,12 +502,13 @@
     message = unittest_pb2.TestAllTypes()
     test_util.SetAllFields(message)
     self.CompareToGoldenFile(
-        self.RemoveRedundantZeros(
-            text_format.MessageToString(message, pointy_brackets=True)),
+        self.RemoveRedundantZeros(text_format.MessageToString(
+            message, pointy_brackets=True)),
         'text_format_unittest_data_pointy_oneof.txt')
 
   def testParseGolden(self):
-    golden_text = '\n'.join(self.ReadGolden('text_format_unittest_data.txt'))
+    golden_text = '\n'.join(self.ReadGolden(
+        'text_format_unittest_data_oneof_implemented.txt'))
     parsed_message = unittest_pb2.TestAllTypes()
     r = text_format.Parse(golden_text, parsed_message)
     self.assertIs(r, parsed_message)
@@ -442,34 +524,73 @@
         self.RemoveRedundantZeros(text_format.MessageToString(message)),
         'text_format_unittest_data_oneof_implemented.txt')
 
-  def testPrintAllFieldsPointy(self):
-    message = unittest_pb2.TestAllTypes()
-    test_util.SetAllFields(message)
-    self.CompareToGoldenFile(
-        self.RemoveRedundantZeros(
-            text_format.MessageToString(message, pointy_brackets=True)),
-        'text_format_unittest_data_pointy_oneof.txt')
-
   def testPrintInIndexOrder(self):
     message = unittest_pb2.TestFieldOrderings()
-    message.my_string = '115'
+    # Fields are listed in index order instead of field number.
+    message.my_string = 'str'
     message.my_int = 101
     message.my_float = 111
     message.optional_nested_message.oo = 0
     message.optional_nested_message.bb = 1
+    message.Extensions[unittest_pb2.my_extension_string] = 'ext_str0'
+    # Extensions are listed based on the order of extension number.
+    # Extension number 12.
+    message.Extensions[unittest_pb2.TestExtensionOrderings2.
+                       test_ext_orderings2].my_string = 'ext_str2'
+    # Extension number 13.
+    message.Extensions[unittest_pb2.TestExtensionOrderings1.
+                       test_ext_orderings1].my_string = 'ext_str1'
+    # Extension number 14.
+    message.Extensions[
+        unittest_pb2.TestExtensionOrderings2.TestExtensionOrderings3.
+        test_ext_orderings3].my_string = 'ext_str3'
+
+    # Print in index order.
     self.CompareToGoldenText(
-        self.RemoveRedundantZeros(text_format.MessageToString(
-            message, use_index_order=True)),
-        'my_string: \"115\"\nmy_int: 101\nmy_float: 111\n'
-        'optional_nested_message {\n  oo: 0\n  bb: 1\n}\n')
+        self.RemoveRedundantZeros(
+            text_format.MessageToString(message, use_index_order=True)),
+        'my_string: "str"\n'
+        'my_int: 101\n'
+        'my_float: 111\n'
+        'optional_nested_message {\n'
+        '  oo: 0\n'
+        '  bb: 1\n'
+        '}\n'
+        '[protobuf_unittest.TestExtensionOrderings2.test_ext_orderings2] {\n'
+        '  my_string: "ext_str2"\n'
+        '}\n'
+        '[protobuf_unittest.TestExtensionOrderings1.test_ext_orderings1] {\n'
+        '  my_string: "ext_str1"\n'
+        '}\n'
+        '[protobuf_unittest.TestExtensionOrderings2.TestExtensionOrderings3'
+        '.test_ext_orderings3] {\n'
+        '  my_string: "ext_str3"\n'
+        '}\n'
+        '[protobuf_unittest.my_extension_string]: "ext_str0"\n')
+    # By default, print in field number order.
     self.CompareToGoldenText(
-        self.RemoveRedundantZeros(text_format.MessageToString(
-            message)),
-        'my_int: 101\nmy_string: \"115\"\nmy_float: 111\n'
-        'optional_nested_message {\n  bb: 1\n  oo: 0\n}\n')
+        self.RemoveRedundantZeros(text_format.MessageToString(message)),
+        'my_int: 101\n'
+        'my_string: "str"\n'
+        '[protobuf_unittest.TestExtensionOrderings2.test_ext_orderings2] {\n'
+        '  my_string: "ext_str2"\n'
+        '}\n'
+        '[protobuf_unittest.TestExtensionOrderings1.test_ext_orderings1] {\n'
+        '  my_string: "ext_str1"\n'
+        '}\n'
+        '[protobuf_unittest.TestExtensionOrderings2.TestExtensionOrderings3'
+        '.test_ext_orderings3] {\n'
+        '  my_string: "ext_str3"\n'
+        '}\n'
+        '[protobuf_unittest.my_extension_string]: "ext_str0"\n'
+        'my_float: 111\n'
+        'optional_nested_message {\n'
+        '  bb: 1\n'
+        '  oo: 0\n'
+        '}\n')
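The two goldens above assert text_format's two field orderings: `use_index_order=True` prints fields in declaration order, while the default sorts by field number. A minimal standalone sketch of that distinction (the field numbers and declaration indices here are hypothetical, not taken from `unittest_pb2`):

```python
# Hypothetical (name, field_number, declaration_index) triples. protobuf's
# text_format sorts by declaration index when use_index_order=True and by
# field number otherwise.
fields = [('my_string', 11, 0), ('my_int', 1, 1), ('my_float', 12, 2)]

# Declaration order: sort on the third element (declaration index).
by_index = [name for name, _, _ in sorted(fields, key=lambda f: f[2])]
# Field-number order: sort on the second element (field number).
by_number = [name for name, _, _ in sorted(fields, key=lambda f: f[1])]

print(by_index)   # declaration order
print(by_number)  # field-number order
```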
 
   def testMergeLinesGolden(self):
-    opened = self.ReadGolden('text_format_unittest_data.txt')
+    opened = self.ReadGolden('text_format_unittest_data_oneof_implemented.txt')
     parsed_message = unittest_pb2.TestAllTypes()
     r = text_format.MergeLines(opened, parsed_message)
     self.assertIs(r, parsed_message)
@@ -479,7 +600,7 @@
     self.assertEqual(message, parsed_message)
 
   def testParseLinesGolden(self):
-    opened = self.ReadGolden('text_format_unittest_data.txt')
+    opened = self.ReadGolden('text_format_unittest_data_oneof_implemented.txt')
     parsed_message = unittest_pb2.TestAllTypes()
     r = text_format.ParseLines(opened, parsed_message)
     self.assertIs(r, parsed_message)
@@ -495,14 +616,13 @@
     message.map_int64_int64[-2**33] = -2**34
     message.map_uint32_uint32[123] = 456
     message.map_uint64_uint64[2**33] = 2**34
-    message.map_string_string["abc"] = "123"
+    message.map_string_string['abc'] = '123'
     message.map_int32_foreign_message[111].c = 5
 
     # Maps are serialized to text format using their underlying repeated
     # representation.
     self.CompareToGoldenText(
-        text_format.MessageToString(message),
-        'map_int32_int32 {\n'
+        text_format.MessageToString(message), 'map_int32_int32 {\n'
         '  key: -123\n'
         '  value: -456\n'
         '}\n'
@@ -535,29 +655,24 @@
       message.map_string_string[letter] = 'dummy'
     for letter in reversed(string.ascii_uppercase[0:13]):
       message.map_string_string[letter] = 'dummy'
-    golden = ''.join((
-        'map_string_string {\n  key: "%c"\n  value: "dummy"\n}\n' % (letter,)
-        for letter in string.ascii_uppercase))
+    golden = ''.join(('map_string_string {\n  key: "%c"\n  value: "dummy"\n}\n'
+                      % (letter,) for letter in string.ascii_uppercase))
     self.CompareToGoldenText(text_format.MessageToString(message), golden)
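The golden string built by the generator expression above can be reproduced standalone; this sketch checks its shape (one four-line `map_string_string` block per uppercase letter, in sorted key order):

```python
import string

# Mirror the generator expression from the test above: one text-format map
# entry per uppercase letter.
golden = ''.join(('map_string_string {\n  key: "%c"\n  value: "dummy"\n}\n'
                  % (letter,) for letter in string.ascii_uppercase))

print(golden[:golden.index('}') + 2])  # first entry
```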
 
-  def testMapOrderSemantics(self):
-    golden_lines = self.ReadGolden('map_test_data.txt')
-    # The C++ implementation emits defaulted-value fields, while the Python
-    # implementation does not.  Adjusting for this is awkward, but it is
-    # valuable to test against a common golden file.
-    line_blacklist = ('  key: 0\n',
-                      '  value: 0\n',
-                      '  key: false\n',
-                      '  value: false\n')
-    golden_lines = [line for line in golden_lines if line not in line_blacklist]
+  # TODO(teboring): c/137553523 fixes the serialization of default values for
+  # map entry messages. This test must be disabled in order to submit that CL;
+  # re-enable it once c/137553523 has been submitted.
+  # def testMapOrderSemantics(self):
+  #   golden_lines = self.ReadGolden('map_test_data.txt')
 
-    message = map_unittest_pb2.TestMap()
-    text_format.ParseLines(golden_lines, message)
-    candidate = text_format.MessageToString(message)
-    # The Python implementation emits "1.0" for the double value that the C++
-    # implementation emits as "1".
-    candidate = candidate.replace('1.0', '1', 2)
-    self.assertMultiLineEqual(candidate, ''.join(golden_lines))
+  #   message = map_unittest_pb2.TestMap()
+  #   text_format.ParseLines(golden_lines, message)
+  #   candidate = text_format.MessageToString(message)
+  #   # The Python implementation emits "1.0" for the double value that the C++
+  #   # implementation emits as "1".
+  #   candidate = candidate.replace('1.0', '1', 2)
+  #   candidate = candidate.replace('0.0', '0', 2)
+  #   self.assertMultiLineEqual(candidate, ''.join(golden_lines))
 
 
 # Tests of proto2-only features (MessageSet, extensions, etc.).
@@ -570,8 +685,7 @@
     message.message_set.Extensions[ext1].i = 23
     message.message_set.Extensions[ext2].str = 'foo'
     self.CompareToGoldenText(
-        text_format.MessageToString(message),
-        'message_set {\n'
+        text_format.MessageToString(message), 'message_set {\n'
         '  [protobuf_unittest.TestMessageSetExtension1] {\n'
         '    i: 23\n'
         '  }\n'
@@ -589,6 +703,24 @@
         '  text: \"bar\"\n'
         '}\n')
 
+  def testPrintMessageSetByFieldNumber(self):
+    out = text_format.TextWriter(False)
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    message.message_set.Extensions[ext1].i = 23
+    message.message_set.Extensions[ext2].str = 'foo'
+    text_format.PrintMessage(message, out, use_field_number=True)
+    self.CompareToGoldenText(out.getvalue(), '1 {\n'
+                             '  1545008 {\n'
+                             '    15: 23\n'
+                             '  }\n'
+                             '  1547769 {\n'
+                             '    25: \"foo\"\n'
+                             '  }\n'
+                             '}\n')
+    out.close()
+
   def testPrintMessageSetAsOneLine(self):
     message = unittest_mset_pb2.TestMessageSetContainer()
     ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
@@ -608,8 +740,7 @@
 
   def testParseMessageSet(self):
     message = unittest_pb2.TestAllTypes()
-    text = ('repeated_uint64: 1\n'
-            'repeated_uint64: 2\n')
+    text = ('repeated_uint64: 1\n' 'repeated_uint64: 2\n')
     text_format.Parse(text, message)
     self.assertEqual(1, message.repeated_uint64[0])
     self.assertEqual(2, message.repeated_uint64[1])
@@ -629,6 +760,62 @@
     self.assertEqual(23, message.message_set.Extensions[ext1].i)
     self.assertEqual('foo', message.message_set.Extensions[ext2].str)
 
+  def testExtensionInsideAnyMessage(self):
+    message = test_extend_any.TestAny()
+    text = ('value {\n'
+            '  [type.googleapis.com/google.protobuf.internal.TestAny] {\n'
+            '    [google.protobuf.internal.TestAnyExtension1.extension1] {\n'
+            '      i: 10\n'
+            '    }\n'
+            '  }\n'
+            '}\n')
+    text_format.Merge(text, message, descriptor_pool=descriptor_pool.Default())
+    self.CompareToGoldenText(
+        text_format.MessageToString(
+            message, descriptor_pool=descriptor_pool.Default()),
+        text)
+
+  def testParseMessageByFieldNumber(self):
+    message = unittest_pb2.TestAllTypes()
+    text = ('34: 1\n' 'repeated_uint64: 2\n')
+    text_format.Parse(text, message, allow_field_number=True)
+    self.assertEqual(1, message.repeated_uint64[0])
+    self.assertEqual(2, message.repeated_uint64[1])
+
+    message = unittest_mset_pb2.TestMessageSetContainer()
+    text = ('1 {\n'
+            '  1545008 {\n'
+            '    15: 23\n'
+            '  }\n'
+            '  1547769 {\n'
+            '    25: \"foo\"\n'
+            '  }\n'
+            '}\n')
+    text_format.Parse(text, message, allow_field_number=True)
+    ext1 = unittest_mset_pb2.TestMessageSetExtension1.message_set_extension
+    ext2 = unittest_mset_pb2.TestMessageSetExtension2.message_set_extension
+    self.assertEqual(23, message.message_set.Extensions[ext1].i)
+    self.assertEqual('foo', message.message_set.Extensions[ext2].str)
+
+    # Can't parse a field number without setting allow_field_number=True.
+    message = unittest_pb2.TestAllTypes()
+    text = '34:1\n'
+    six.assertRaisesRegex(self, text_format.ParseError, (
+        r'1:1 : Message type "\w+.TestAllTypes" has no field named '
+        r'"34".'), text_format.Parse, text, message)
+
+    # Fails if the field number is unknown, even with allow_field_number=True.
+    text = '1234:1\n'
+    six.assertRaisesRegex(
+        self,
+        text_format.ParseError,
+        (r'1:1 : Message type "\w+.TestAllTypes" has no field named '
+         r'"1234".'),
+        text_format.Parse,
+        text,
+        message,
+        allow_field_number=True)
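The error-message patterns passed to `assertRaisesRegex` above are ordinary `re` patterns, with `\w+` absorbing the package prefix. A standalone check that such a pattern matches the kind of message the test expects (the concrete sample message is an assumption based on the regex, not captured from a real run):

```python
import re

# Pattern from the test above; '\w+' matches any package name.
pattern = r'1:1 : Message type "\w+.TestAllTypes" has no field named "34".'

# A plausible concrete error message (assumed, for illustration only).
sample = ('1:1 : Message type "protobuf_unittest.TestAllTypes" '
          'has no field named "34".')

match = re.search(pattern, sample)
```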
+
   def testPrintAllExtensions(self):
     message = unittest_pb2.TestAllExtensions()
     test_util.SetAllExtensions(message)
@@ -669,15 +856,17 @@
     text = ('message_set {\n'
             '  [unknown_extension] {\n'
             '    i: 23\n'
+            '    bin: "\xe0"'
             '    [nested_unknown_ext]: {\n'
             '      i: 23\n'
+            '      x: x\n'
             '      test: "test_string"\n'
             '      floaty_float: -0.315\n'
             '      num: -inf\n'
             '      multiline_str: "abc"\n'
             '          "def"\n'
             '          "xyz."\n'
-            '      [nested_unknown_ext]: <\n'
+            '      [nested_unknown_ext.ext]: <\n'
             '        i: 23\n'
             '        i: 24\n'
             '        pointfloat: .3\n'
@@ -689,6 +878,10 @@
             '    }\n'
             '  }\n'
             '  [unknown_extension]: 5\n'
+            '  [unknown_extension_with_number_field] {\n'
+            '    1: "some_field"\n'
+            '    2: -0.451\n'
+            '  }\n'
             '}\n')
     text_format.Parse(text, message, allow_unknown_extension=True)
     golden = 'message_set {\n}\n'
@@ -704,7 +897,9 @@
     six.assertRaisesRegex(self,
                           text_format.ParseError,
                           'Invalid field value: }',
-                          text_format.Parse, malformed, message,
+                          text_format.Parse,
+                          malformed,
+                          message,
                           allow_unknown_extension=True)
 
     message = unittest_mset_pb2.TestMessageSetContainer()
@@ -716,7 +911,9 @@
     six.assertRaisesRegex(self,
                           text_format.ParseError,
                           'Invalid field value: "',
-                          text_format.Parse, malformed, message,
+                          text_format.Parse,
+                          malformed,
+                          message,
                           allow_unknown_extension=True)
 
     message = unittest_mset_pb2.TestMessageSetContainer()
@@ -728,7 +925,9 @@
     six.assertRaisesRegex(self,
                           text_format.ParseError,
                           'Invalid field value: "',
-                          text_format.Parse, malformed, message,
+                          text_format.Parse,
+                          malformed,
+                          message,
                           allow_unknown_extension=True)
 
     message = unittest_mset_pb2.TestMessageSetContainer()
@@ -740,24 +939,27 @@
     six.assertRaisesRegex(self,
                           text_format.ParseError,
                           '5:1 : Expected ">".',
-                          text_format.Parse, malformed, message,
+                          text_format.Parse,
+                          malformed,
+                          message,
                           allow_unknown_extension=True)
 
     # Don't allow unknown fields with allow_unknown_extension=True.
     message = unittest_mset_pb2.TestMessageSetContainer()
     malformed = ('message_set {\n'
                  '  unknown_field: true\n'
-                 '  \n'  # Missing '>' here.
                  '}\n')
     six.assertRaisesRegex(self,
                           text_format.ParseError,
                           ('2:3 : Message type '
                            '"proto2_wireformat_unittest.TestMessageSet" has no'
                            ' field named "unknown_field".'),
-                          text_format.Parse, malformed, message,
+                          text_format.Parse,
+                          malformed,
+                          message,
                           allow_unknown_extension=True)
 
-    # Parse known extension correcty.
+    # Parse known extension correctly.
     message = unittest_mset_pb2.TestMessageSetContainer()
     text = ('message_set {\n'
             '  [protobuf_unittest.TestMessageSetExtension1] {\n'
@@ -773,70 +975,87 @@
     self.assertEqual(23, message.message_set.Extensions[ext1].i)
     self.assertEqual('foo', message.message_set.Extensions[ext2].str)
 
+  def testParseBadIdentifier(self):
+    message = unittest_pb2.TestAllTypes()
+    text = ('optional_nested_message { "bb": 1 }')
+    with self.assertRaises(text_format.ParseError) as e:
+      text_format.Parse(text, message)
+    self.assertEqual(str(e.exception),
+                     '1:27 : Expected identifier or number, got "bb".')
+
   def testParseBadExtension(self):
     message = unittest_pb2.TestAllExtensions()
     text = '[unknown_extension]: 8\n'
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        '1:2 : Extension "unknown_extension" not registered.',
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError,
+                          '1:2 : Extension "unknown_extension" not registered.',
+                          text_format.Parse, text, message)
     message = unittest_pb2.TestAllTypes()
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        ('1:2 : Message type "protobuf_unittest.TestAllTypes" does not have '
-         'extensions.'),
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError, (
+        '1:2 : Message type "protobuf_unittest.TestAllTypes" does not have '
+        'extensions.'), text_format.Parse, text, message)
+
+  def testParseNumericUnknownEnum(self):
+    message = unittest_pb2.TestAllTypes()
+    text = 'optional_nested_enum: 100'
+    six.assertRaisesRegex(self, text_format.ParseError,
+                          (r'1:23 : Enum type "\w+.TestAllTypes.NestedEnum" '
+                           r'has no value with number 100.'), text_format.Parse,
+                          text, message)
 
   def testMergeDuplicateExtensionScalars(self):
     message = unittest_pb2.TestAllExtensions()
     text = ('[protobuf_unittest.optional_int32_extension]: 42 '
             '[protobuf_unittest.optional_int32_extension]: 67')
     text_format.Merge(text, message)
-    self.assertEqual(
-        67,
-        message.Extensions[unittest_pb2.optional_int32_extension])
+    self.assertEqual(67,
+                     message.Extensions[unittest_pb2.optional_int32_extension])
 
   def testParseDuplicateExtensionScalars(self):
     message = unittest_pb2.TestAllExtensions()
     text = ('[protobuf_unittest.optional_int32_extension]: 42 '
             '[protobuf_unittest.optional_int32_extension]: 67')
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        ('1:96 : Message type "protobuf_unittest.TestAllExtensions" '
-         'should not have multiple '
-         '"protobuf_unittest.optional_int32_extension" extensions.'),
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError, (
+        '1:96 : Message type "protobuf_unittest.TestAllExtensions" '
+        'should not have multiple '
+        '"protobuf_unittest.optional_int32_extension" extensions.'),
+                          text_format.Parse, text, message)
 
-  def testParseDuplicateNestedMessageScalars(self):
+  def testParseDuplicateMessages(self):
     message = unittest_pb2.TestAllTypes()
     text = ('optional_nested_message { bb: 1 } '
             'optional_nested_message { bb: 2 }')
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        ('1:65 : Message type "protobuf_unittest.TestAllTypes.NestedMessage" '
-         'should not have multiple "bb" fields.'),
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError, (
+        '1:59 : Message type "protobuf_unittest.TestAllTypes" '
+        'should not have multiple "optional_nested_message" fields.'),
+                          text_format.Parse, text,
+                          message)
+
+  def testParseDuplicateExtensionMessages(self):
+    message = unittest_pb2.TestAllExtensions()
+    text = ('[protobuf_unittest.optional_nested_message_extension]: {} '
+            '[protobuf_unittest.optional_nested_message_extension]: {}')
+    six.assertRaisesRegex(self, text_format.ParseError, (
+        '1:114 : Message type "protobuf_unittest.TestAllExtensions" '
+        'should not have multiple '
+        '"protobuf_unittest.optional_nested_message_extension" extensions.'),
+                          text_format.Parse, text, message)
 
   def testParseDuplicateScalars(self):
     message = unittest_pb2.TestAllTypes()
-    text = ('optional_int32: 42 '
-            'optional_int32: 67')
-    six.assertRaisesRegex(self,
-        text_format.ParseError,
-        ('1:36 : Message type "protobuf_unittest.TestAllTypes" should not '
-         'have multiple "optional_int32" fields.'),
-        text_format.Parse, text, message)
+    text = ('optional_int32: 42 ' 'optional_int32: 67')
+    six.assertRaisesRegex(self, text_format.ParseError, (
+        '1:36 : Message type "protobuf_unittest.TestAllTypes" should not '
+        'have multiple "optional_int32" fields.'), text_format.Parse, text,
+                          message)
 
   def testParseGroupNotClosed(self):
     message = unittest_pb2.TestAllTypes()
     text = 'RepeatedGroup: <'
-    six.assertRaisesRegex(self,
-        text_format.ParseError, '1:16 : Expected ">".',
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError, '1:16 : Expected ">".',
+                          text_format.Parse, text, message)
     text = 'RepeatedGroup: {'
-    six.assertRaisesRegex(self,
-        text_format.ParseError, '1:16 : Expected "}".',
-        text_format.Parse, text, message)
+    six.assertRaisesRegex(self, text_format.ParseError, '1:16 : Expected "}".',
+                          text_format.Parse, text, message)
 
   def testParseEmptyGroup(self):
     message = unittest_pb2.TestAllTypes()
@@ -887,10 +1106,212 @@
     self.assertEqual(-2**34, message.map_int64_int64[-2**33])
     self.assertEqual(456, message.map_uint32_uint32[123])
     self.assertEqual(2**34, message.map_uint64_uint64[2**33])
-    self.assertEqual("123", message.map_string_string["abc"])
+    self.assertEqual('123', message.map_string_string['abc'])
     self.assertEqual(5, message.map_int32_foreign_message[111].c)
 
 
+class Proto3Tests(unittest.TestCase):
+
+  def testPrintMessageExpandAny(self):
+    packed_message = unittest_pb2.OneString()
+    packed_message.data = 'string'
+    message = any_test_pb2.TestAny()
+    message.any_value.Pack(packed_message)
+    self.assertEqual(
+        text_format.MessageToString(message,
+                                    descriptor_pool=descriptor_pool.Default()),
+        'any_value {\n'
+        '  [type.googleapis.com/protobuf_unittest.OneString] {\n'
+        '    data: "string"\n'
+        '  }\n'
+        '}\n')
+
+  def testTopAnyMessage(self):
+    packed_msg = unittest_pb2.OneString()
+    msg = any_pb2.Any()
+    msg.Pack(packed_msg)
+    text = text_format.MessageToString(msg)
+    other_msg = text_format.Parse(text, any_pb2.Any())
+    self.assertEqual(msg, other_msg)
+
+  def testPrintMessageExpandAnyRepeated(self):
+    packed_message = unittest_pb2.OneString()
+    message = any_test_pb2.TestAny()
+    packed_message.data = 'string0'
+    message.repeated_any_value.add().Pack(packed_message)
+    packed_message.data = 'string1'
+    message.repeated_any_value.add().Pack(packed_message)
+    self.assertEqual(
+        text_format.MessageToString(message),
+        'repeated_any_value {\n'
+        '  [type.googleapis.com/protobuf_unittest.OneString] {\n'
+        '    data: "string0"\n'
+        '  }\n'
+        '}\n'
+        'repeated_any_value {\n'
+        '  [type.googleapis.com/protobuf_unittest.OneString] {\n'
+        '    data: "string1"\n'
+        '  }\n'
+        '}\n')
+
+  def testPrintMessageExpandAnyDescriptorPoolMissingType(self):
+    packed_message = unittest_pb2.OneString()
+    packed_message.data = 'string'
+    message = any_test_pb2.TestAny()
+    message.any_value.Pack(packed_message)
+    empty_pool = descriptor_pool.DescriptorPool()
+    self.assertEqual(
+        text_format.MessageToString(message, descriptor_pool=empty_pool),
+        'any_value {\n'
+        '  type_url: "type.googleapis.com/protobuf_unittest.OneString"\n'
+        '  value: "\\n\\006string"\n'
+        '}\n')
+
+  def testPrintMessageExpandAnyPointyBrackets(self):
+    packed_message = unittest_pb2.OneString()
+    packed_message.data = 'string'
+    message = any_test_pb2.TestAny()
+    message.any_value.Pack(packed_message)
+    self.assertEqual(
+        text_format.MessageToString(message,
+                                    pointy_brackets=True),
+        'any_value <\n'
+        '  [type.googleapis.com/protobuf_unittest.OneString] <\n'
+        '    data: "string"\n'
+        '  >\n'
+        '>\n')
+
+  def testPrintMessageExpandAnyAsOneLine(self):
+    packed_message = unittest_pb2.OneString()
+    packed_message.data = 'string'
+    message = any_test_pb2.TestAny()
+    message.any_value.Pack(packed_message)
+    self.assertEqual(
+        text_format.MessageToString(message,
+                                    as_one_line=True),
+        'any_value {'
+        ' [type.googleapis.com/protobuf_unittest.OneString]'
+        ' { data: "string" } '
+        '}')
+
+  def testPrintMessageExpandAnyAsOneLinePointyBrackets(self):
+    packed_message = unittest_pb2.OneString()
+    packed_message.data = 'string'
+    message = any_test_pb2.TestAny()
+    message.any_value.Pack(packed_message)
+    self.assertEqual(
+        text_format.MessageToString(message,
+                                    as_one_line=True,
+                                    pointy_brackets=True,
+                                    descriptor_pool=descriptor_pool.Default()),
+        'any_value <'
+        ' [type.googleapis.com/protobuf_unittest.OneString]'
+        ' < data: "string" > '
+        '>')
+
+  def testUnknownEnums(self):
+    message = unittest_proto3_arena_pb2.TestAllTypes()
+    message2 = unittest_proto3_arena_pb2.TestAllTypes()
+    message.optional_nested_enum = 999
+    text_string = text_format.MessageToString(message)
+    text_format.Parse(text_string, message2)
+    self.assertEqual(999, message2.optional_nested_enum)
+
+  def testMergeExpandedAny(self):
+    message = any_test_pb2.TestAny()
+    text = ('any_value {\n'
+            '  [type.googleapis.com/protobuf_unittest.OneString] {\n'
+            '    data: "string"\n'
+            '  }\n'
+            '}\n')
+    text_format.Merge(text, message)
+    packed_message = unittest_pb2.OneString()
+    message.any_value.Unpack(packed_message)
+    self.assertEqual('string', packed_message.data)
+    message.Clear()
+    text_format.Parse(text, message)
+    packed_message = unittest_pb2.OneString()
+    message.any_value.Unpack(packed_message)
+    self.assertEqual('string', packed_message.data)
+
+  def testMergeExpandedAnyRepeated(self):
+    message = any_test_pb2.TestAny()
+    text = ('repeated_any_value {\n'
+            '  [type.googleapis.com/protobuf_unittest.OneString] {\n'
+            '    data: "string0"\n'
+            '  }\n'
+            '}\n'
+            'repeated_any_value {\n'
+            '  [type.googleapis.com/protobuf_unittest.OneString] {\n'
+            '    data: "string1"\n'
+            '  }\n'
+            '}\n')
+    text_format.Merge(text, message)
+    packed_message = unittest_pb2.OneString()
+    message.repeated_any_value[0].Unpack(packed_message)
+    self.assertEqual('string0', packed_message.data)
+    message.repeated_any_value[1].Unpack(packed_message)
+    self.assertEqual('string1', packed_message.data)
+
+  def testMergeExpandedAnyPointyBrackets(self):
+    message = any_test_pb2.TestAny()
+    text = ('any_value {\n'
+            '  [type.googleapis.com/protobuf_unittest.OneString] <\n'
+            '    data: "string"\n'
+            '  >\n'
+            '}\n')
+    text_format.Merge(text, message)
+    packed_message = unittest_pb2.OneString()
+    message.any_value.Unpack(packed_message)
+    self.assertEqual('string', packed_message.data)
+
+  def testMergeAlternativeUrl(self):
+    message = any_test_pb2.TestAny()
+    text = ('any_value {\n'
+            '  [type.otherapi.com/protobuf_unittest.OneString] {\n'
+            '    data: "string"\n'
+            '  }\n'
+            '}\n')
+    text_format.Merge(text, message)
+    packed_message = unittest_pb2.OneString()
+    self.assertEqual('type.otherapi.com/protobuf_unittest.OneString',
+                     message.any_value.type_url)
+
+  def testMergeExpandedAnyDescriptorPoolMissingType(self):
+    message = any_test_pb2.TestAny()
+    text = ('any_value {\n'
+            '  [type.googleapis.com/protobuf_unittest.OneString] {\n'
+            '    data: "string"\n'
+            '  }\n'
+            '}\n')
+    with self.assertRaises(text_format.ParseError) as e:
+      empty_pool = descriptor_pool.DescriptorPool()
+      text_format.Merge(text, message, descriptor_pool=empty_pool)
+    self.assertEqual(
+        str(e.exception),
+        'Type protobuf_unittest.OneString not found in descriptor pool')
+
+  def testMergeUnexpandedAny(self):
+    text = ('any_value {\n'
+            '  type_url: "type.googleapis.com/protobuf_unittest.OneString"\n'
+            '  value: "\\n\\006string"\n'
+            '}\n')
+    message = any_test_pb2.TestAny()
+    text_format.Merge(text, message)
+    packed_message = unittest_pb2.OneString()
+    message.any_value.Unpack(packed_message)
+    self.assertEqual('string', packed_message.data)
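The unexpanded Any above carries the packed `OneString` bytes verbatim: `"\n\006string"` is the proto wire encoding of string field 1 holding `"string"` (`\n` is the tag byte 0x0A, `\006` the payload length). A standalone decode of those bytes, using only the general varint-free single-byte case:

```python
value = b'\n\x06string'  # the Any "value" bytes from the test above

tag = value[0]
field_number = tag >> 3   # upper bits of the tag byte: field number
wire_type = tag & 0x7     # low 3 bits: wire type (2 = length-delimited)
length = value[1]         # length prefix of the string payload
payload = value[2:2 + length]
```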
+
+  def testMergeMissingAnyEndToken(self):
+    message = any_test_pb2.TestAny()
+    text = ('any_value {\n'
+            '  [type.googleapis.com/protobuf_unittest.OneString] {\n'
+            '    data: "string"\n')
+    with self.assertRaises(text_format.ParseError) as e:
+      text_format.Merge(text, message)
+    self.assertEqual(str(e.exception), '3:11 : Expected "}".')
+
+
 class TokenizerTest(unittest.TestCase):
 
   def testSimpleTokenCases(self):
@@ -900,140 +1321,336 @@
             'ID7 : "aa\\"bb"\n\n\n\n ID8: {A:inf B:-inf C:true D:false}\n'
             'ID9: 22 ID10: -111111111111111111 ID11: -22\n'
             'ID12: 2222222222222222222 ID13: 1.23456f ID14: 1.2e+2f '
-            'false_bool:  0 true_BOOL:t \n true_bool1:  1 false_BOOL1:f ')
-    tokenizer = text_format._Tokenizer(text.splitlines())
-    methods = [(tokenizer.ConsumeIdentifier, 'identifier1'),
-               ':',
+            'false_bool:  0 true_BOOL:t \n true_bool1:  1 false_BOOL1:f '
+            'False_bool: False True_bool: True X:iNf Y:-inF Z:nAN')
+    tokenizer = text_format.Tokenizer(text.splitlines())
+    methods = [(tokenizer.ConsumeIdentifier, 'identifier1'), ':',
                (tokenizer.ConsumeString, 'string1'),
-               (tokenizer.ConsumeIdentifier, 'identifier2'),
-               ':',
-               (tokenizer.ConsumeInt32, 123),
-               (tokenizer.ConsumeIdentifier, 'identifier3'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'identifier2'), ':',
+               (tokenizer.ConsumeInteger, 123),
+               (tokenizer.ConsumeIdentifier, 'identifier3'), ':',
                (tokenizer.ConsumeString, 'string'),
-               (tokenizer.ConsumeIdentifier, 'identifiER_4'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'identifiER_4'), ':',
                (tokenizer.ConsumeFloat, 1.1e+2),
-               (tokenizer.ConsumeIdentifier, 'ID5'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'ID5'), ':',
                (tokenizer.ConsumeFloat, -0.23),
-               (tokenizer.ConsumeIdentifier, 'ID6'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'ID6'), ':',
                (tokenizer.ConsumeString, 'aaaa\'bbbb'),
-               (tokenizer.ConsumeIdentifier, 'ID7'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'ID7'), ':',
                (tokenizer.ConsumeString, 'aa\"bb'),
-               (tokenizer.ConsumeIdentifier, 'ID8'),
-               ':',
-               '{',
-               (tokenizer.ConsumeIdentifier, 'A'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'ID8'), ':', '{',
+               (tokenizer.ConsumeIdentifier, 'A'), ':',
                (tokenizer.ConsumeFloat, float('inf')),
-               (tokenizer.ConsumeIdentifier, 'B'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'B'), ':',
                (tokenizer.ConsumeFloat, -float('inf')),
-               (tokenizer.ConsumeIdentifier, 'C'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'C'), ':',
                (tokenizer.ConsumeBool, True),
-               (tokenizer.ConsumeIdentifier, 'D'),
-               ':',
-               (tokenizer.ConsumeBool, False),
-               '}',
-               (tokenizer.ConsumeIdentifier, 'ID9'),
-               ':',
-               (tokenizer.ConsumeUint32, 22),
-               (tokenizer.ConsumeIdentifier, 'ID10'),
-               ':',
-               (tokenizer.ConsumeInt64, -111111111111111111),
-               (tokenizer.ConsumeIdentifier, 'ID11'),
-               ':',
-               (tokenizer.ConsumeInt32, -22),
-               (tokenizer.ConsumeIdentifier, 'ID12'),
-               ':',
-               (tokenizer.ConsumeUint64, 2222222222222222222),
-               (tokenizer.ConsumeIdentifier, 'ID13'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'D'), ':',
+               (tokenizer.ConsumeBool, False), '}',
+               (tokenizer.ConsumeIdentifier, 'ID9'), ':',
+               (tokenizer.ConsumeInteger, 22),
+               (tokenizer.ConsumeIdentifier, 'ID10'), ':',
+               (tokenizer.ConsumeInteger, -111111111111111111),
+               (tokenizer.ConsumeIdentifier, 'ID11'), ':',
+               (tokenizer.ConsumeInteger, -22),
+               (tokenizer.ConsumeIdentifier, 'ID12'), ':',
+               (tokenizer.ConsumeInteger, 2222222222222222222),
+               (tokenizer.ConsumeIdentifier, 'ID13'), ':',
                (tokenizer.ConsumeFloat, 1.23456),
-               (tokenizer.ConsumeIdentifier, 'ID14'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'ID14'), ':',
                (tokenizer.ConsumeFloat, 1.2e+2),
-               (tokenizer.ConsumeIdentifier, 'false_bool'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'false_bool'), ':',
                (tokenizer.ConsumeBool, False),
-               (tokenizer.ConsumeIdentifier, 'true_BOOL'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'true_BOOL'), ':',
                (tokenizer.ConsumeBool, True),
-               (tokenizer.ConsumeIdentifier, 'true_bool1'),
-               ':',
+               (tokenizer.ConsumeIdentifier, 'true_bool1'), ':',
                (tokenizer.ConsumeBool, True),
-               (tokenizer.ConsumeIdentifier, 'false_BOOL1'),
-               ':',
-               (tokenizer.ConsumeBool, False)]
+               (tokenizer.ConsumeIdentifier, 'false_BOOL1'), ':',
+               (tokenizer.ConsumeBool, False),
+               (tokenizer.ConsumeIdentifier, 'False_bool'), ':',
+               (tokenizer.ConsumeBool, False),
+               (tokenizer.ConsumeIdentifier, 'True_bool'), ':',
+               (tokenizer.ConsumeBool, True),
+               (tokenizer.ConsumeIdentifier, 'X'), ':',
+               (tokenizer.ConsumeFloat, float('inf')),
+               (tokenizer.ConsumeIdentifier, 'Y'), ':',
+               (tokenizer.ConsumeFloat, float('-inf')),
+               (tokenizer.ConsumeIdentifier, 'Z'), ':',
+               (tokenizer.ConsumeFloat, float('nan'))]
 
     i = 0
     while not tokenizer.AtEnd():
       m = methods[i]
-      if type(m) == str:
+      if isinstance(m, str):
         token = tokenizer.token
         self.assertEqual(token, m)
         tokenizer.NextToken()
+      elif isinstance(m[1], float) and math.isnan(m[1]):
+        self.assertTrue(math.isnan(m[0]()))
       else:
         self.assertEqual(m[1], m[0]())
       i += 1
 
+  def testConsumeAbstractIntegers(self):
+    # Tests ConsumeInteger() on values outside the 32-bit and 64-bit
+    # signed/unsigned ranges, as well as the '0' special cases.
+    int64_max = (1 << 63) - 1
+    uint32_max = (1 << 32) - 1
+    text = '-1 %d %d' % (uint32_max + 1, int64_max + 1)
+    tokenizer = text_format.Tokenizer(text.splitlines())
+    self.assertEqual(-1, tokenizer.ConsumeInteger())
+
+    self.assertEqual(uint32_max + 1, tokenizer.ConsumeInteger())
+
+    self.assertEqual(int64_max + 1, tokenizer.ConsumeInteger())
+    self.assertTrue(tokenizer.AtEnd())
+
+    text = '-0 0 0 1.2'
+    tokenizer = text_format.Tokenizer(text.splitlines())
+    self.assertEqual(0, tokenizer.ConsumeInteger())
+    self.assertEqual(0, tokenizer.ConsumeInteger())
+    self.assertEqual(True, tokenizer.TryConsumeInteger())
+    self.assertEqual(False, tokenizer.TryConsumeInteger())
+    with self.assertRaises(text_format.ParseError):
+      tokenizer.ConsumeInteger()
+    self.assertEqual(1.2, tokenizer.ConsumeFloat())
+    self.assertTrue(tokenizer.AtEnd())
+
   def testConsumeIntegers(self):
     # This test only tests the failures in the integer parsing methods as well
     # as the '0' special cases.
     int64_max = (1 << 63) - 1
     uint32_max = (1 << 32) - 1
     text = '-1 %d %d' % (uint32_max + 1, int64_max + 1)
-    tokenizer = text_format._Tokenizer(text.splitlines())
-    self.assertRaises(text_format.ParseError, tokenizer.ConsumeUint32)
-    self.assertRaises(text_format.ParseError, tokenizer.ConsumeUint64)
-    self.assertEqual(-1, tokenizer.ConsumeInt32())
+    tokenizer = text_format.Tokenizer(text.splitlines())
+    self.assertRaises(text_format.ParseError,
+                      text_format._ConsumeUint32, tokenizer)
+    self.assertRaises(text_format.ParseError,
+                      text_format._ConsumeUint64, tokenizer)
+    self.assertEqual(-1, text_format._ConsumeInt32(tokenizer))
 
-    self.assertRaises(text_format.ParseError, tokenizer.ConsumeUint32)
-    self.assertRaises(text_format.ParseError, tokenizer.ConsumeInt32)
-    self.assertEqual(uint32_max + 1, tokenizer.ConsumeInt64())
+    self.assertRaises(text_format.ParseError,
+                      text_format._ConsumeUint32, tokenizer)
+    self.assertRaises(text_format.ParseError,
+                      text_format._ConsumeInt32, tokenizer)
+    self.assertEqual(uint32_max + 1, text_format._ConsumeInt64(tokenizer))
 
-    self.assertRaises(text_format.ParseError, tokenizer.ConsumeInt64)
-    self.assertEqual(int64_max + 1, tokenizer.ConsumeUint64())
+    self.assertRaises(text_format.ParseError,
+                      text_format._ConsumeInt64, tokenizer)
+    self.assertEqual(int64_max + 1, text_format._ConsumeUint64(tokenizer))
     self.assertTrue(tokenizer.AtEnd())
 
     text = '-0 -0 0 0'
-    tokenizer = text_format._Tokenizer(text.splitlines())
-    self.assertEqual(0, tokenizer.ConsumeUint32())
-    self.assertEqual(0, tokenizer.ConsumeUint64())
-    self.assertEqual(0, tokenizer.ConsumeUint32())
-    self.assertEqual(0, tokenizer.ConsumeUint64())
+    tokenizer = text_format.Tokenizer(text.splitlines())
+    self.assertEqual(0, text_format._ConsumeUint32(tokenizer))
+    self.assertEqual(0, text_format._ConsumeUint64(tokenizer))
+    self.assertEqual(0, text_format._ConsumeUint32(tokenizer))
+    self.assertEqual(0, text_format._ConsumeUint64(tokenizer))
     self.assertTrue(tokenizer.AtEnd())
 
   def testConsumeByteString(self):
     text = '"string1\''
-    tokenizer = text_format._Tokenizer(text.splitlines())
+    tokenizer = text_format.Tokenizer(text.splitlines())
     self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
 
     text = 'string1"'
-    tokenizer = text_format._Tokenizer(text.splitlines())
+    tokenizer = text_format.Tokenizer(text.splitlines())
     self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
 
     text = '\n"\\xt"'
-    tokenizer = text_format._Tokenizer(text.splitlines())
+    tokenizer = text_format.Tokenizer(text.splitlines())
     self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
 
     text = '\n"\\"'
-    tokenizer = text_format._Tokenizer(text.splitlines())
+    tokenizer = text_format.Tokenizer(text.splitlines())
     self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
 
     text = '\n"\\x"'
-    tokenizer = text_format._Tokenizer(text.splitlines())
+    tokenizer = text_format.Tokenizer(text.splitlines())
     self.assertRaises(text_format.ParseError, tokenizer.ConsumeByteString)
 
   def testConsumeBool(self):
     text = 'not-a-bool'
-    tokenizer = text_format._Tokenizer(text.splitlines())
+    tokenizer = text_format.Tokenizer(text.splitlines())
     self.assertRaises(text_format.ParseError, tokenizer.ConsumeBool)
 
+  def testSkipComment(self):
+    tokenizer = text_format.Tokenizer('# some comment'.splitlines())
+    self.assertTrue(tokenizer.AtEnd())
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeComment)
+
+  def testConsumeComment(self):
+    tokenizer = text_format.Tokenizer('# some comment'.splitlines(),
+                                      skip_comments=False)
+    self.assertFalse(tokenizer.AtEnd())
+    self.assertEqual('# some comment', tokenizer.ConsumeComment())
+    self.assertTrue(tokenizer.AtEnd())
+
+  def testConsumeTwoComments(self):
+    text = '# some comment\n# another comment'
+    tokenizer = text_format.Tokenizer(text.splitlines(), skip_comments=False)
+    self.assertEqual('# some comment', tokenizer.ConsumeComment())
+    self.assertFalse(tokenizer.AtEnd())
+    self.assertEqual('# another comment', tokenizer.ConsumeComment())
+    self.assertTrue(tokenizer.AtEnd())
+
+  def testConsumeTrailingComment(self):
+    text = 'some_number: 4\n# some comment'
+    tokenizer = text_format.Tokenizer(text.splitlines(), skip_comments=False)
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeComment)
+
+    self.assertEqual('some_number', tokenizer.ConsumeIdentifier())
+    self.assertEqual(tokenizer.token, ':')
+    tokenizer.NextToken()
+    self.assertRaises(text_format.ParseError, tokenizer.ConsumeComment)
+    self.assertEqual(4, tokenizer.ConsumeInteger())
+    self.assertFalse(tokenizer.AtEnd())
+
+    self.assertEqual('# some comment', tokenizer.ConsumeComment())
+    self.assertTrue(tokenizer.AtEnd())
+
+  def testConsumeLineComment(self):
+    tokenizer = text_format.Tokenizer('# some comment'.splitlines(),
+                                      skip_comments=False)
+    self.assertFalse(tokenizer.AtEnd())
+    self.assertEqual((False, '# some comment'),
+                     tokenizer.ConsumeCommentOrTrailingComment())
+    self.assertTrue(tokenizer.AtEnd())
+
+  def testConsumeTwoLineComments(self):
+    text = '# some comment\n# another comment'
+    tokenizer = text_format.Tokenizer(text.splitlines(), skip_comments=False)
+    self.assertEqual((False, '# some comment'),
+                     tokenizer.ConsumeCommentOrTrailingComment())
+    self.assertFalse(tokenizer.AtEnd())
+    self.assertEqual((False, '# another comment'),
+                     tokenizer.ConsumeCommentOrTrailingComment())
+    self.assertTrue(tokenizer.AtEnd())
+
+  def testConsumeAndCheckTrailingComment(self):
+    text = 'some_number: 4  # some comment'  # trailing comment on the same line
+    tokenizer = text_format.Tokenizer(text.splitlines(), skip_comments=False)
+    self.assertRaises(text_format.ParseError,
+                      tokenizer.ConsumeCommentOrTrailingComment)
+
+    self.assertEqual('some_number', tokenizer.ConsumeIdentifier())
+    self.assertEqual(tokenizer.token, ':')
+    tokenizer.NextToken()
+    self.assertRaises(text_format.ParseError,
+                      tokenizer.ConsumeCommentOrTrailingComment)
+    self.assertEqual(4, tokenizer.ConsumeInteger())
+    self.assertFalse(tokenizer.AtEnd())
+
+    self.assertEqual((True, '# some comment'),
+                     tokenizer.ConsumeCommentOrTrailingComment())
+    self.assertTrue(tokenizer.AtEnd())
+
+  def testHashinComment(self):
+    text = 'some_number: 4  # some comment # not a new comment'
+    tokenizer = text_format.Tokenizer(text.splitlines(), skip_comments=False)
+    self.assertEqual('some_number', tokenizer.ConsumeIdentifier())
+    self.assertEqual(tokenizer.token, ':')
+    tokenizer.NextToken()
+    self.assertEqual(4, tokenizer.ConsumeInteger())
+    self.assertEqual((True, '# some comment # not a new comment'),
+                     tokenizer.ConsumeCommentOrTrailingComment())
+    self.assertTrue(tokenizer.AtEnd())
+
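The comment tests above hinge on one distinction: a standalone comment line versus a comment trailing other tokens on the same line (the `(is_trailing, text)` tuples returned by `ConsumeCommentOrTrailingComment`). A minimal pure-Python sketch of that classification (a hypothetical helper, not the protobuf tokenizer, and it ignores `#` inside quoted strings, which the real tokenizer handles):

```python
def classify_comments(lines):
    # For each line containing a comment, report (is_trailing, comment_text).
    # "Trailing" means the comment shares its line with non-comment tokens.
    results = []
    for line in lines:
        pos = line.find('#')
        if pos == -1:
            continue  # no comment on this line
        is_trailing = bool(line[:pos].strip())
        results.append((is_trailing, line[pos:]))
    return results
```

For example, `classify_comments(['# a', 'x: 1  # b'])` classifies the first comment as standalone and the second as trailing, matching the distinction the tests above assert on.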
+
+# Tests for pretty printer functionality.
+@_parameterized.parameters((unittest_pb2), (unittest_proto3_arena_pb2))
+class PrettyPrinterTest(TextFormatBase):
+
+  def testPrettyPrintNoMatch(self, message_module):
+
+    def printer(message, indent, as_one_line):
+      del message, indent, as_one_line
+      return None
+
+    message = message_module.TestAllTypes()
+    msg = message.repeated_nested_message.add()
+    msg.bb = 42
+    self.CompareToGoldenText(
+        text_format.MessageToString(
+            message, as_one_line=True, message_formatter=printer),
+        'repeated_nested_message { bb: 42 }')
+
+  def testPrettyPrintOneLine(self, message_module):
+
+    def printer(m, indent, as_one_line):
+      del indent, as_one_line
+      if m.DESCRIPTOR == message_module.TestAllTypes.NestedMessage.DESCRIPTOR:
+        return 'My lucky number is %s' % m.bb
+
+    message = message_module.TestAllTypes()
+    msg = message.repeated_nested_message.add()
+    msg.bb = 42
+    self.CompareToGoldenText(
+        text_format.MessageToString(
+            message, as_one_line=True, message_formatter=printer),
+        'repeated_nested_message { My lucky number is 42 }')
+
+  def testPrettyPrintMultiLine(self, message_module):
+
+    def printer(m, indent, as_one_line):
+      if m.DESCRIPTOR == message_module.TestAllTypes.NestedMessage.DESCRIPTOR:
+        line_delimiter = (' ' if as_one_line else '\n') + ' ' * indent
+        return 'My lucky number is:%s%s' % (line_delimiter, m.bb)
+      return None
+
+    message = message_module.TestAllTypes()
+    msg = message.repeated_nested_message.add()
+    msg.bb = 42
+    self.CompareToGoldenText(
+        text_format.MessageToString(
+            message, as_one_line=True, message_formatter=printer),
+        'repeated_nested_message { My lucky number is: 42 }')
+    self.CompareToGoldenText(
+        text_format.MessageToString(
+            message, as_one_line=False, message_formatter=printer),
+        'repeated_nested_message {\n  My lucky number is:\n  42\n}\n')
+
+  def testPrettyPrintEntireMessage(self, message_module):
+
+    def printer(m, indent, as_one_line):
+      del indent, as_one_line
+      if m.DESCRIPTOR == message_module.TestAllTypes.DESCRIPTOR:
+        return 'This is the message!'
+      return None
+
+    message = message_module.TestAllTypes()
+    self.CompareToGoldenText(
+        text_format.MessageToString(
+            message, as_one_line=False, message_formatter=printer),
+        'This is the message!\n')
+    self.CompareToGoldenText(
+        text_format.MessageToString(
+            message, as_one_line=True, message_formatter=printer),
+        'This is the message!')
+
+  def testPrettyPrintMultipleParts(self, message_module):
+
+    def printer(m, indent, as_one_line):
+      del indent, as_one_line
+      if m.DESCRIPTOR == message_module.TestAllTypes.NestedMessage.DESCRIPTOR:
+        return 'My lucky number is %s' % m.bb
+      return None
+
+    message = message_module.TestAllTypes()
+    message.optional_int32 = 61
+    msg = message.repeated_nested_message.add()
+    msg.bb = 42
+    msg = message.repeated_nested_message.add()
+    msg.bb = 99
+    msg = message.optional_nested_message
+    msg.bb = 1
+    self.CompareToGoldenText(
+        text_format.MessageToString(
+            message, as_one_line=True, message_formatter=printer),
+        ('optional_int32: 61 '
+         'optional_nested_message { My lucky number is 1 } '
+         'repeated_nested_message { My lucky number is 42 } '
+         'repeated_nested_message { My lucky number is 99 }'))
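The `message_formatter` contract these tests exercise is simple: the callable receives a message (plus indent and one-line flags) and may return a replacement string, or `None` to fall back to the default field-by-field rendering. A hedged sketch of that dispatch (a hypothetical `render` helper; the real printer additionally handles indentation, recursion, and field ordering):

```python
def render(message, indent, as_one_line, formatter, default_render):
    # Try the custom formatter first; a None result means "no match",
    # so the default rendering is used instead.
    text = formatter(message, indent, as_one_line)
    if text is not None:
        return text
    return default_render(message)
```

This mirrors why `testPrettyPrintNoMatch` above sees the standard output: its printer returns `None` for every message.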
 
 if __name__ == '__main__':
   unittest.main()
diff --git a/python/google/protobuf/internal/type_checkers.py b/python/google/protobuf/internal/type_checkers.py
index f30ca6a..4a76cd4 100755
--- a/python/google/protobuf/internal/type_checkers.py
+++ b/python/google/protobuf/internal/type_checkers.py
@@ -45,6 +45,7 @@
 
 __author__ = 'robinson@google.com (Will Robinson)'
 
+import numbers
 import six
 
 if six.PY3:
@@ -109,6 +110,16 @@
     return proposed_value
 
 
+class TypeCheckerWithDefault(TypeChecker):
+
+  def __init__(self, default_value, *acceptable_types):
+    TypeChecker.__init__(self, acceptable_types)
+    self._default_value = default_value
+
+  def DefaultValue(self):
+    return self._default_value
+
+
 # IntValueChecker and its subclasses perform integer type-checks
 # and bounds-checks.
 class IntValueChecker(object):
@@ -116,11 +127,11 @@
   """Checker used for integer fields.  Performs type-check and range check."""
 
   def CheckValue(self, proposed_value):
-    if not isinstance(proposed_value, six.integer_types):
+    if not isinstance(proposed_value, numbers.Integral):
       message = ('%.1024r has type %s, but expected one of: %s' %
                  (proposed_value, type(proposed_value), six.integer_types))
       raise TypeError(message)
-    if not self._MIN <= proposed_value <= self._MAX:
+    if not self._MIN <= int(proposed_value) <= self._MAX:
       raise ValueError('Value out of range: %d' % proposed_value)
     # We force 32-bit values to int and 64-bit values to long to make
     # alternate implementations where the distinction is more significant
@@ -140,11 +151,11 @@
     self._enum_type = enum_type
 
   def CheckValue(self, proposed_value):
-    if not isinstance(proposed_value, six.integer_types):
+    if not isinstance(proposed_value, numbers.Integral):
       message = ('%.1024r has type %s, but expected one of: %s' %
                  (proposed_value, type(proposed_value), six.integer_types))
       raise TypeError(message)
-    if proposed_value not in self._enum_type.values_by_number:
+    if int(proposed_value) not in self._enum_type.values_by_number:
       raise ValueError('Unknown enum value: %d' % proposed_value)
     return proposed_value
 
@@ -212,12 +223,13 @@
     _FieldDescriptor.CPPTYPE_INT64: Int64ValueChecker(),
     _FieldDescriptor.CPPTYPE_UINT32: Uint32ValueChecker(),
     _FieldDescriptor.CPPTYPE_UINT64: Uint64ValueChecker(),
-    _FieldDescriptor.CPPTYPE_DOUBLE: TypeChecker(
-        float, int, long),
-    _FieldDescriptor.CPPTYPE_FLOAT: TypeChecker(
-        float, int, long),
-    _FieldDescriptor.CPPTYPE_BOOL: TypeChecker(bool, int),
-    _FieldDescriptor.CPPTYPE_STRING: TypeChecker(bytes),
+    _FieldDescriptor.CPPTYPE_DOUBLE: TypeCheckerWithDefault(
+        0.0, numbers.Real),
+    _FieldDescriptor.CPPTYPE_FLOAT: TypeCheckerWithDefault(
+        0.0, numbers.Real),
+    _FieldDescriptor.CPPTYPE_BOOL: TypeCheckerWithDefault(
+        False, bool, numbers.Integral),
+    _FieldDescriptor.CPPTYPE_STRING: TypeCheckerWithDefault(b'', bytes),
     }
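The switch from `six.integer_types` to `numbers.Integral` in this hunk widens what the checkers accept (for example, integer-like scalar types from other libraries), while the `int(...)` cast normalizes the value before the range check. A simplified sketch under those assumptions (a hypothetical `check_int32`, not the library code):

```python
import numbers

_INT32_MIN, _INT32_MAX = -(1 << 31), (1 << 31) - 1

def check_int32(value):
    # Accept any integral type via the numbers ABC, mirroring the
    # isinstance(value, numbers.Integral) change above.
    if not isinstance(value, numbers.Integral):
        raise TypeError('%r has type %s, but expected an integral type'
                        % (value, type(value)))
    # Normalize to a plain int before bounds-checking.
    if not _INT32_MIN <= int(value) <= _INT32_MAX:
        raise ValueError('Value out of range: %d' % value)
    return int(value)
```

Note that `bool` is an `Integral` subtype, so `check_int32(True)` passes and normalizes to `1`, consistent with the BOOL checker above also accepting `numbers.Integral`.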
 
 
diff --git a/python/google/protobuf/internal/unknown_fields_test.py b/python/google/protobuf/internal/unknown_fields_test.py
index 9685b8b..8b7de2e 100755
--- a/python/google/protobuf/internal/unknown_fields_test.py
+++ b/python/google/protobuf/internal/unknown_fields_test.py
@@ -36,7 +36,7 @@
 __author__ = 'bohdank@google.com (Bohdan Koval)'
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
 from google.protobuf import unittest_mset_pb2
@@ -47,16 +47,23 @@
 from google.protobuf.internal import message_set_extensions_pb2
 from google.protobuf.internal import missing_enum_values_pb2
 from google.protobuf.internal import test_util
+from google.protobuf.internal import testing_refleaks
 from google.protobuf.internal import type_checkers
 
 
-def SkipIfCppImplementation(func):
+BaseTestCase = testing_refleaks.BaseTestCase
+
+
+# CheckUnknownField() cannot be used by the C++ implementation because
+# it accesses protected members. This is not a behavior difference
+# between the Python and C++ implementations.
+def SkipCheckUnknownFieldIfCppImplementation(func):
   return unittest.skipIf(
       api_implementation.Type() == 'cpp' and api_implementation.Version() == 2,
-      'C++ implementation does not expose unknown fields to Python')(func)
+      'Additional test for pure Python that accesses protected members')(func)
 
 
-class UnknownFieldsTest(unittest.TestCase):
+class UnknownFieldsTest(BaseTestCase):
 
   def setUp(self):
     self.descriptor = unittest_pb2.TestAllTypes.DESCRIPTOR
@@ -73,11 +80,23 @@
     # stdout.
     self.assertTrue(data == self.all_fields_data)
 
-  def testSerializeProto3(self):
-    # Verify that proto3 doesn't preserve unknown fields.
+  def expectSerializeProto3(self, preserve):
     message = unittest_proto3_arena_pb2.TestEmptyMessage()
     message.ParseFromString(self.all_fields_data)
-    self.assertEqual(0, len(message.SerializeToString()))
+    if preserve:
+      self.assertEqual(self.all_fields_data, message.SerializeToString())
+    else:
+      self.assertEqual(0, len(message.SerializeToString()))
+
+  def testSerializeProto3(self):
+    # Verify the proto3 unknown-fields preservation behavior.
+    default_preserve = (api_implementation
+                        .GetPythonProto3PreserveUnknownsDefault())
+    self.expectSerializeProto3(default_preserve)
+    api_implementation.SetPythonProto3PreserveUnknownsDefault(
+        not default_preserve)
+    self.expectSerializeProto3(not default_preserve)
+    api_implementation.SetPythonProto3PreserveUnknownsDefault(default_preserve)
 
   def testByteSize(self):
     self.assertEqual(self.all_fields.ByteSize(), self.empty_message.ByteSize())
@@ -119,8 +138,28 @@
     message.ParseFromString(self.all_fields.SerializeToString())
     self.assertNotEqual(self.empty_message, message)
 
+  def testDiscardUnknownFields(self):
+    self.empty_message.DiscardUnknownFields()
+    self.assertEqual(b'', self.empty_message.SerializeToString())
+    # Test message field and repeated message field.
+    message = unittest_pb2.TestAllTypes()
+    other_message = unittest_pb2.TestAllTypes()
+    other_message.optional_string = 'discard'
+    message.optional_nested_message.ParseFromString(
+        other_message.SerializeToString())
+    message.repeated_nested_message.add().ParseFromString(
+        other_message.SerializeToString())
+    self.assertNotEqual(
+        b'', message.optional_nested_message.SerializeToString())
+    self.assertNotEqual(
+        b'', message.repeated_nested_message[0].SerializeToString())
+    message.DiscardUnknownFields()
+    self.assertEqual(b'', message.optional_nested_message.SerializeToString())
+    self.assertEqual(
+        b'', message.repeated_nested_message[0].SerializeToString())
 
-class UnknownFieldsAccessorsTest(unittest.TestCase):
+
+class UnknownFieldsAccessorsTest(BaseTestCase):
 
   def setUp(self):
     self.descriptor = unittest_pb2.TestAllTypes.DESCRIPTOR
@@ -129,60 +168,51 @@
     self.all_fields_data = self.all_fields.SerializeToString()
     self.empty_message = unittest_pb2.TestEmptyMessage()
     self.empty_message.ParseFromString(self.all_fields_data)
-    if api_implementation.Type() != 'cpp':
-      # _unknown_fields is an implementation detail.
-      self.unknown_fields = self.empty_message._unknown_fields
 
-  # All the tests that use GetField() check an implementation detail of the
-  # Python implementation, which stores unknown fields as serialized strings.
-  # These tests are skipped by the C++ implementation: it's enough to check that
-  # the message is correctly serialized.
+  # CheckUnknownField() is an additional pure-Python check that inspects
+  # a detail of unknown fields. It cannot be used by the C++
+  # implementation because it accesses protected members.
+  # The test is kept for historical reasons; it is not strictly necessary
+  # since the serialized string is already checked.
 
-  def GetField(self, name):
+  def CheckUnknownField(self, name, expected_value):
     field_descriptor = self.descriptor.fields_by_name[name]
     wire_type = type_checkers.FIELD_TYPE_TO_WIRE_TYPE[field_descriptor.type]
     field_tag = encoder.TagBytes(field_descriptor.number, wire_type)
     result_dict = {}
-    for tag_bytes, value in self.unknown_fields:
+    for tag_bytes, value in self.empty_message._unknown_fields:
       if tag_bytes == field_tag:
         decoder = unittest_pb2.TestAllTypes._decoders_by_tag[tag_bytes][0]
         decoder(value, 0, len(value), self.all_fields, result_dict)
-    return result_dict[field_descriptor]
+    self.assertEqual(expected_value, result_dict[field_descriptor])
 
-  @SkipIfCppImplementation
-  def testEnum(self):
-    value = self.GetField('optional_nested_enum')
-    self.assertEqual(self.all_fields.optional_nested_enum, value)
+  @SkipCheckUnknownFieldIfCppImplementation
+  def testCheckUnknownFieldValue(self):
+    # Test enum.
+    self.CheckUnknownField('optional_nested_enum',
+                           self.all_fields.optional_nested_enum)
+    # Test repeated enum.
+    self.CheckUnknownField('repeated_nested_enum',
+                           self.all_fields.repeated_nested_enum)
 
-  @SkipIfCppImplementation
-  def testRepeatedEnum(self):
-    value = self.GetField('repeated_nested_enum')
-    self.assertEqual(self.all_fields.repeated_nested_enum, value)
+    # Test varint.
+    self.CheckUnknownField('optional_int32',
+                           self.all_fields.optional_int32)
+    # Test fixed32.
+    self.CheckUnknownField('optional_fixed32',
+                           self.all_fields.optional_fixed32)
 
-  @SkipIfCppImplementation
-  def testVarint(self):
-    value = self.GetField('optional_int32')
-    self.assertEqual(self.all_fields.optional_int32, value)
+    # Test fixed64.
+    self.CheckUnknownField('optional_fixed64',
+                           self.all_fields.optional_fixed64)
 
-  @SkipIfCppImplementation
-  def testFixed32(self):
-    value = self.GetField('optional_fixed32')
-    self.assertEqual(self.all_fields.optional_fixed32, value)
+    # Test length-delimited.
+    self.CheckUnknownField('optional_string',
+                           self.all_fields.optional_string)
 
-  @SkipIfCppImplementation
-  def testFixed64(self):
-    value = self.GetField('optional_fixed64')
-    self.assertEqual(self.all_fields.optional_fixed64, value)
-
-  @SkipIfCppImplementation
-  def testLengthDelimited(self):
-    value = self.GetField('optional_string')
-    self.assertEqual(self.all_fields.optional_string, value)
-
-  @SkipIfCppImplementation
-  def testGroup(self):
-    value = self.GetField('optionalgroup')
-    self.assertEqual(self.all_fields.optionalgroup, value)
+    # Test group.
+    self.CheckUnknownField('optionalgroup',
+                           self.all_fields.optionalgroup)
 
   def testCopyFrom(self):
     message = unittest_pb2.TestEmptyMessage()
@@ -221,45 +251,44 @@
     self.assertEqual(message.SerializeToString(), self.all_fields_data)
 
 
-class UnknownEnumValuesTest(unittest.TestCase):
+class UnknownEnumValuesTest(BaseTestCase):
 
   def setUp(self):
     self.descriptor = missing_enum_values_pb2.TestEnumValues.DESCRIPTOR
 
     self.message = missing_enum_values_pb2.TestEnumValues()
+    # TestEnumValues.ZERO = 0, but the value does not exist in the other NestedEnum.
     self.message.optional_nested_enum = (
-      missing_enum_values_pb2.TestEnumValues.ZERO)
+        missing_enum_values_pb2.TestEnumValues.ZERO)
     self.message.repeated_nested_enum.extend([
-      missing_enum_values_pb2.TestEnumValues.ZERO,
-      missing_enum_values_pb2.TestEnumValues.ONE,
-      ])
+        missing_enum_values_pb2.TestEnumValues.ZERO,
+        missing_enum_values_pb2.TestEnumValues.ONE,
+        ])
     self.message.packed_nested_enum.extend([
-      missing_enum_values_pb2.TestEnumValues.ZERO,
-      missing_enum_values_pb2.TestEnumValues.ONE,
-      ])
+        missing_enum_values_pb2.TestEnumValues.ZERO,
+        missing_enum_values_pb2.TestEnumValues.ONE,
+        ])
     self.message_data = self.message.SerializeToString()
     self.missing_message = missing_enum_values_pb2.TestMissingEnumValues()
     self.missing_message.ParseFromString(self.message_data)
-    if api_implementation.Type() != 'cpp':
-      # _unknown_fields is an implementation detail.
-      self.unknown_fields = self.missing_message._unknown_fields
 
-  # All the tests that use GetField() check an implementation detail of the
-  # Python implementation, which stores unknown fields as serialized strings.
-  # These tests are skipped by the C++ implementation: it's enough to check that
-  # the message is correctly serialized.
+  # CheckUnknownField() is an additional pure-Python check that inspects
+  # a detail of unknown fields. It cannot be used by the C++
+  # implementation because it accesses protected members.
+  # The test is kept for historical reasons; it is not strictly necessary
+  # since the serialized string is already checked.
 
-  def GetField(self, name):
+  def CheckUnknownField(self, name, expected_value):
     field_descriptor = self.descriptor.fields_by_name[name]
     wire_type = type_checkers.FIELD_TYPE_TO_WIRE_TYPE[field_descriptor.type]
     field_tag = encoder.TagBytes(field_descriptor.number, wire_type)
     result_dict = {}
-    for tag_bytes, value in self.unknown_fields:
+    for tag_bytes, value in self.missing_message._unknown_fields:
       if tag_bytes == field_tag:
         decoder = missing_enum_values_pb2.TestEnumValues._decoders_by_tag[
-          tag_bytes][0]
+            tag_bytes][0]
         decoder(value, 0, len(value), self.message, result_dict)
-    return result_dict[field_descriptor]
+    self.assertEqual(expected_value, result_dict[field_descriptor])
 
   def testUnknownParseMismatchEnumValue(self):
     just_string = missing_enum_values_pb2.JustString()
@@ -274,21 +303,28 @@
     # default value.
     self.assertEqual(missing.optional_nested_enum, 0)
 
-  @SkipIfCppImplementation
   def testUnknownEnumValue(self):
     self.assertFalse(self.missing_message.HasField('optional_nested_enum'))
-    value = self.GetField('optional_nested_enum')
-    self.assertEqual(self.message.optional_nested_enum, value)
+    self.assertEqual(self.missing_message.optional_nested_enum, 2)
+    # Clear does not do anything.
+    serialized = self.missing_message.SerializeToString()
+    self.missing_message.ClearField('optional_nested_enum')
+    self.assertEqual(self.missing_message.SerializeToString(), serialized)
 
-  @SkipIfCppImplementation
   def testUnknownRepeatedEnumValue(self):
-    value = self.GetField('repeated_nested_enum')
-    self.assertEqual(self.message.repeated_nested_enum, value)
+    self.assertEqual([], self.missing_message.repeated_nested_enum)
 
-  @SkipIfCppImplementation
   def testUnknownPackedEnumValue(self):
-    value = self.GetField('packed_nested_enum')
-    self.assertEqual(self.message.packed_nested_enum, value)
+    self.assertEqual([], self.missing_message.packed_nested_enum)
+
+  @SkipCheckUnknownFieldIfCppImplementation
+  def testCheckUnknownFieldValueForEnum(self):
+    self.CheckUnknownField('optional_nested_enum',
+                           self.message.optional_nested_enum)
+    self.CheckUnknownField('repeated_nested_enum',
+                           self.message.repeated_nested_enum)
+    self.CheckUnknownField('packed_nested_enum',
+                           self.message.packed_nested_enum)
 
   def testRoundTrip(self):
     new_message = missing_enum_values_pb2.TestEnumValues()
diff --git a/python/google/protobuf/internal/well_known_types.py b/python/google/protobuf/internal/well_known_types.py
index d35fcc5..37a65cf 100644
--- a/python/google/protobuf/internal/well_known_types.py
+++ b/python/google/protobuf/internal/well_known_types.py
@@ -40,6 +40,7 @@
 
 __author__ = 'jieluo@google.com (Jie Luo)'
 
+import collections
 from datetime import datetime
 from datetime import timedelta
 import six
@@ -53,6 +54,7 @@
 _MILLIS_PER_SECOND = 1000
 _MICROS_PER_SECOND = 1000000
 _SECONDS_PER_DAY = 24 * 3600
+_DURATION_SECONDS_MAX = 315576000000
 
 
 class Error(Exception):
@@ -66,13 +68,14 @@
 class Any(object):
   """Class for Any Message type."""
 
-  def Pack(self, msg, type_url_prefix='type.googleapis.com/'):
+  def Pack(self, msg, type_url_prefix='type.googleapis.com/',
+           deterministic=None):
     """Packs the specified message into current Any message."""
     if len(type_url_prefix) < 1 or type_url_prefix[-1] != '/':
       self.type_url = '%s/%s' % (type_url_prefix, msg.DESCRIPTOR.full_name)
     else:
       self.type_url = '%s%s' % (type_url_prefix, msg.DESCRIPTOR.full_name)
-    self.value = msg.SerializeToString()
+    self.value = msg.SerializeToString(deterministic=deterministic)
 
   def Unpack(self, msg):
     """Unpacks the current Any message into specified message."""
@@ -82,10 +85,14 @@
     msg.ParseFromString(self.value)
     return True
 
+  def TypeName(self):
+    """Returns the protobuf type name of the inner message."""
+    # Only last part is to be used: b/25630112
+    return self.type_url.split('/')[-1]
+
   def Is(self, descriptor):
     """Checks if this Any represents the given protobuf type."""
-    # Only last part is to be used: b/25630112
-    return self.type_url.split('/')[-1] == descriptor.full_name
+    return self.TypeName() == descriptor.full_name
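`TypeName()` simply takes the last `/`-separated component of the type URL, and `Is()` compares it to a descriptor's full name. A trivial standalone sketch of that rule (hypothetical helpers, not the `Any` class itself):

```python
def type_name(type_url):
    # Only the part after the last '/' identifies the packed message type.
    return type_url.split('/')[-1]

def any_is(type_url, full_name):
    # Equivalent to Any.Is(): compare the extracted name to the
    # descriptor's full name.
    return type_name(type_url) == full_name
```

A URL with no `/` at all degenerates to the whole string, since `split('/')[-1]` returns the only element.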
 
 
 class Timestamp(object):
@@ -243,6 +250,7 @@
       represent the exact Duration value. For example: "1s", "1.010s",
       "1.000000100s", "-3.100s"
     """
+    _CheckDurationValid(self.seconds, self.nanos)
     if self.seconds < 0 or self.nanos < 0:
       result = '-'
       seconds = - self.seconds + int((0 - self.nanos) // 1e9)
@@ -282,14 +290,17 @@
     try:
       pos = value.find('.')
       if pos == -1:
-        self.seconds = int(value[:-1])
-        self.nanos = 0
+        seconds = int(value[:-1])
+        nanos = 0
       else:
-        self.seconds = int(value[:pos])
+        seconds = int(value[:pos])
         if value[0] == '-':
-          self.nanos = int(round(float('-0{0}'.format(value[pos: -1])) *1e9))
+          nanos = int(round(float('-0{0}'.format(value[pos: -1])) *1e9))
         else:
-          self.nanos = int(round(float('0{0}'.format(value[pos: -1])) *1e9))
+          nanos = int(round(float('0{0}'.format(value[pos: -1])) *1e9))
+      _CheckDurationValid(seconds, nanos)
+      self.seconds = seconds
+      self.nanos = nanos
     except ValueError:
       raise ParseError(
           'Couldn\'t parse duration: {0}.'.format(value))
@@ -341,12 +352,12 @@
             self.nanos, _NANOS_PER_MICROSECOND))
 
   def FromTimedelta(self, td):
-    """Convertd timedelta to Duration."""
+    """Converts timedelta to Duration."""
     self._NormalizeDuration(td.seconds + td.days * _SECONDS_PER_DAY,
                             td.microseconds * _NANOS_PER_MICROSECOND)
 
   def _NormalizeDuration(self, seconds, nanos):
-    """Set Duration by seconds and nonas."""
+    """Set Duration by seconds and nanos."""
     # Force nanos to be negative if the duration is negative.
     if seconds < 0 and nanos > 0:
       seconds += 1
@@ -355,6 +366,20 @@
     self.nanos = nanos
 
 
+def _CheckDurationValid(seconds, nanos):
+  if seconds < -_DURATION_SECONDS_MAX or seconds > _DURATION_SECONDS_MAX:
+    raise Error(
+        'Duration is not valid: Seconds {0} must be in range '
+        '[-315576000000, 315576000000].'.format(seconds))
+  if nanos <= -_NANOS_PER_SECOND or nanos >= _NANOS_PER_SECOND:
+    raise Error(
+        'Duration is not valid: Nanos {0} must be in range '
+        '[-999999999, 999999999].'.format(nanos))
+  if (nanos < 0 and seconds > 0) or (nanos > 0 and seconds < 0):
+    raise Error(
+        'Duration is not valid: Sign mismatch.')
+
+
 def _RoundTowardZero(value, divider):
   """Truncates the remainder part after division."""
   # For some languages, the sign of the remainder is implementation
@@ -375,13 +400,16 @@
 
   def ToJsonString(self):
     """Converts FieldMask to string according to proto3 JSON spec."""
-    return ','.join(self.paths)
+    camelcase_paths = []
+    for path in self.paths:
+      camelcase_paths.append(_SnakeCaseToCamelCase(path))
+    return ','.join(camelcase_paths)
 
   def FromJsonString(self, value):
     """Converts string to FieldMask according to proto3 JSON spec."""
     self.Clear()
     for path in value.split(','):
-      self.paths.append(path)
+      self.paths.append(_CamelCaseToSnakeCase(path))
 
   def IsValidForDescriptor(self, message_descriptor):
     """Checks whether the FieldMask is valid for Message Descriptor."""
@@ -450,7 +478,7 @@
   parts = path.split('.')
   last = parts.pop()
   for name in parts:
-    field = message_descriptor.fields_by_name[name]
+    field = message_descriptor.fields_by_name.get(name)
     if (field is None or
         field.label == FieldDescriptor.LABEL_REPEATED or
         field.type != FieldDescriptor.TYPE_MESSAGE):
@@ -468,6 +496,48 @@
         message_descriptor.full_name))
 
 
+def _SnakeCaseToCamelCase(path_name):
+  """Converts a path name from snake_case to camelCase."""
+  result = []
+  after_underscore = False
+  for c in path_name:
+    if c.isupper():
+      raise Error('Fail to print FieldMask to Json string: Path name '
+                  '{0} must not contain uppercase letters.'.format(path_name))
+    if after_underscore:
+      if c.islower():
+        result.append(c.upper())
+        after_underscore = False
+      else:
+        raise Error('Fail to print FieldMask to Json string: The '
+                    'character after a "_" must be a lowercase letter '
+                    'in path name {0}.'.format(path_name))
+    elif c == '_':
+      after_underscore = True
+    else:
+      result += c
+
+  if after_underscore:
+    raise Error('Fail to print FieldMask to Json string: Trailing "_" '
+                'in path name {0}.'.format(path_name))
+  return ''.join(result)
+
+
+def _CamelCaseToSnakeCase(path_name):
+  """Converts a field name from camelCase to snake_case."""
+  result = []
+  for c in path_name:
+    if c == '_':
+      raise ParseError('Fail to parse FieldMask: Path name '
+                       '{0} must not contain "_"s.'.format(path_name))
+    if c.isupper():
+      result += '_'
+      result += c.lower()
+    else:
+      result += c
+  return ''.join(result)
+
+
 class _FieldMaskTree(object):
   """Represents a FieldMask in a tree structure.
 
@@ -582,9 +652,10 @@
         raise ValueError('Error: Field {0} in message {1} is not a singular '
                          'message field and cannot have sub-fields.'.format(
                              name, source_descriptor.full_name))
-      _MergeMessage(
-          child, getattr(source, name), getattr(destination, name),
-          replace_message, replace_repeated)
+      if source.HasField(name):
+        _MergeMessage(
+            child, getattr(source, name), getattr(destination, name),
+            replace_message, replace_repeated)
       continue
     if field.label == FieldDescriptor.LABEL_REPEATED:
       if replace_repeated:
@@ -633,6 +704,12 @@
     struct_value.string_value = value
   elif isinstance(value, _INT_OR_FLOAT):
     struct_value.number_value = value
+  elif isinstance(value, dict):
+    struct_value.struct_value.Clear()
+    struct_value.struct_value.update(value)
+  elif isinstance(value, list):
+    struct_value.list_value.Clear()
+    struct_value.list_value.extend(value)
   else:
     raise ValueError('Unexpected type')
 
@@ -663,18 +740,49 @@
   def __getitem__(self, key):
     return _GetStructValue(self.fields[key])
 
+  def __contains__(self, item):
+    return item in self.fields
+
   def __setitem__(self, key, value):
     _SetStructValue(self.fields[key], value)
 
+  def __delitem__(self, key):
+    del self.fields[key]
+
+  def __len__(self):
+    return len(self.fields)
+
+  def __iter__(self):
+    return iter(self.fields)
+
+  def keys(self):  # pylint: disable=invalid-name
+    return self.fields.keys()
+
+  def values(self):  # pylint: disable=invalid-name
+    return [self[key] for key in self]
+
+  def items(self):  # pylint: disable=invalid-name
+    return [(key, self[key]) for key in self]
+
   def get_or_create_list(self, key):
     """Returns a list for this key, creating if it didn't exist already."""
+    if not self.fields[key].HasField('list_value'):
+      # Clear marks list_value as set, which creates the (empty) list.
+      self.fields[key].list_value.Clear()
     return self.fields[key].list_value
 
   def get_or_create_struct(self, key):
     """Returns a struct for this key, creating if it didn't exist already."""
+    if not self.fields[key].HasField('struct_value'):
+      # Clear marks struct_value as set, which creates the (empty) struct.
+      self.fields[key].struct_value.Clear()
     return self.fields[key].struct_value
 
-  # TODO(haberman): allow constructing/merging from dict.
+  def update(self, dictionary):  # pylint: disable=invalid-name
+    for key, value in dictionary.items():
+      _SetStructValue(self.fields[key], value)
+
+collections.MutableMapping.register(Struct)
 
 
 class ListValue(object):
@@ -697,17 +805,28 @@
   def __setitem__(self, index, value):
     _SetStructValue(self.values.__getitem__(index), value)
 
+  def __delitem__(self, key):
+    del self.values[key]
+
   def items(self):
     for i in range(len(self)):
       yield self[i]
 
   def add_struct(self):
     """Appends and returns a struct value as the next value in the list."""
-    return self.values.add().struct_value
+    struct_value = self.values.add().struct_value
+    # Clear marks struct_value as set, which creates the (empty) struct.
+    struct_value.Clear()
+    return struct_value
 
   def add_list(self):
     """Appends and returns a list value as the next value in the list."""
-    return self.values.add().list_value
+    list_value = self.values.add().list_value
+    # Clear marks list_value as set, which creates the (empty) list.
+    list_value.Clear()
+    return list_value
+
+collections.MutableSequence.register(ListValue)
 
 
 WKTBASES = {
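The FieldMask changes in this hunk round-trip path names between snake_case and lowerCamelCase for proto3 JSON. A minimal standalone sketch of that conversion logic (hypothetical helper names; the real implementations are `_SnakeCaseToCamelCase` and `_CamelCaseToSnakeCase` added above, which raise the module's own `Error`/`ParseError` types instead of `ValueError`):

```python
def snake_to_camel(path_name):
    # snake_case -> lowerCamelCase: each "_" is dropped and the following
    # lowercase letter is uppercased; uppercase input letters are rejected,
    # as are "_" followed by anything other than a lowercase letter.
    result = []
    after_underscore = False
    for c in path_name:
        if c.isupper():
            raise ValueError('uppercase letter in %s' % path_name)
        if after_underscore:
            if not c.islower():
                raise ValueError('"_" must be followed by a lowercase letter')
            result.append(c.upper())
            after_underscore = False
        elif c == '_':
            after_underscore = True
        else:
            result.append(c)
    if after_underscore:
        raise ValueError('trailing "_" in %s' % path_name)
    return ''.join(result)


def camel_to_snake(path_name):
    # Inverse direction, used when parsing FieldMask JSON: each uppercase
    # letter becomes "_" plus its lowercase form; literal "_" is rejected.
    result = []
    for c in path_name:
        if c == '_':
            raise ValueError('"_" not allowed in %s' % path_name)
        if c.isupper():
            result.append('_')
            result.append(c.lower())
        else:
            result.append(c)
    return ''.join(result)
```

Note the asymmetry the tests below rely on: `camel_to_snake('FooBar')` yields `'_foo_bar'`, so a leading underscore survives a round trip but an initial capital does not.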
diff --git a/python/google/protobuf/internal/well_known_types_test.py b/python/google/protobuf/internal/well_known_types_test.py
index 6acbee2..965940b 100644
--- a/python/google/protobuf/internal/well_known_types_test.py
+++ b/python/google/protobuf/internal/well_known_types_test.py
@@ -34,10 +34,11 @@
 
 __author__ = 'jieluo@google.com (Jie Luo)'
 
+import collections
 from datetime import datetime
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
 
@@ -100,11 +101,15 @@
     message.FromJsonString('1970-01-01T00:00:00.1Z')
     self.assertEqual(0, message.seconds)
     self.assertEqual(100000000, message.nanos)
-    # Parsing accpets offsets.
+    # Parsing accepts offsets.
     message.FromJsonString('1970-01-01T00:00:00-08:00')
     self.assertEqual(8 * 3600, message.seconds)
     self.assertEqual(0, message.nanos)
 
+    # The exact current time cannot be asserted; exercise the call for coverage.
+    message.GetCurrentTime()
+    self.assertNotEqual(8 * 3600, message.seconds)
+
   def testDurationSerializeAndParse(self):
     message = duration_pb2.Duration()
     # Generated output should contain 3, 6, or 9 fractional digits.
@@ -268,6 +273,17 @@
   def testInvalidTimestamp(self):
     message = timestamp_pb2.Timestamp()
     self.assertRaisesRegexp(
+        well_known_types.ParseError,
+        'Failed to parse timestamp: missing valid timezone offset.',
+        message.FromJsonString,
+        '')
+    self.assertRaisesRegexp(
+        well_known_types.ParseError,
+        'Failed to parse timestamp: invalid trailing data '
+        '1970-01-01T00:00:01Ztrail.',
+        message.FromJsonString,
+        '1970-01-01T00:00:01Ztrail')
+    self.assertRaisesRegexp(
         ValueError,
         'time data \'10000-01-01T00:00:00\' does not match'
         ' format \'%Y-%m-%dT%H:%M:%S\'',
@@ -284,7 +300,7 @@
         '1972-01-01T01:00:00.01+08',)
     self.assertRaisesRegexp(
         ValueError,
-        'year is out of range',
+        'year (0 )?is out of range',
         message.FromJsonString,
         '0000-01-01T00:00:00Z')
     message.seconds = 253402300800
@@ -303,6 +319,38 @@
         well_known_types.ParseError,
         'Couldn\'t parse duration: 1...2s.',
         message.FromJsonString, '1...2s')
+    text = '-315576000001.000000000s'
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        r'Duration is not valid\: Seconds -315576000001 must be in range'
+        r' \[-315576000000\, 315576000000\].',
+        message.FromJsonString, text)
+    text = '315576000001.000000000s'
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        r'Duration is not valid\: Seconds 315576000001 must be in range'
+        r' \[-315576000000\, 315576000000\].',
+        message.FromJsonString, text)
+    message.seconds = -315576000001
+    message.nanos = 0
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        r'Duration is not valid\: Seconds -315576000001 must be in range'
+        r' \[-315576000000\, 315576000000\].',
+        message.ToJsonString)
+    message.seconds = 0
+    message.nanos = 999999999 + 1
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        r'Duration is not valid\: Nanos 1000000000 must be in range'
+        r' \[-999999999\, 999999999\].',
+        message.ToJsonString)
+    message.seconds = -1
+    message.nanos = 1
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        r'Duration is not valid\: Sign mismatch.',
+        message.ToJsonString)
 
 
 class FieldMaskTest(unittest.TestCase):
@@ -322,6 +370,20 @@
     mask.FromJsonString('foo,bar')
     self.assertEqual(['foo', 'bar'], mask.paths)
 
+    # Test camel case
+    mask.Clear()
+    mask.paths.append('foo_bar')
+    self.assertEqual('fooBar', mask.ToJsonString())
+    mask.paths.append('bar_quz')
+    self.assertEqual('fooBar,barQuz', mask.ToJsonString())
+
+    mask.FromJsonString('')
+    self.assertEqual('', mask.ToJsonString())
+    mask.FromJsonString('fooBar')
+    self.assertEqual(['foo_bar'], mask.paths)
+    mask.FromJsonString('fooBar,barQuz')
+    self.assertEqual(['foo_bar', 'bar_quz'], mask.paths)
+
   def testDescriptorToFieldMask(self):
     mask = field_mask_pb2.FieldMask()
     msg_descriptor = unittest_pb2.TestAllTypes.DESCRIPTOR
@@ -330,10 +392,37 @@
     self.assertTrue(mask.IsValidForDescriptor(msg_descriptor))
     for field in msg_descriptor.fields:
       self.assertTrue(field.name in mask.paths)
+
+  def testIsValidForDescriptor(self):
+    msg_descriptor = unittest_pb2.TestAllTypes.DESCRIPTOR
+    # Empty mask
+    mask = field_mask_pb2.FieldMask()
+    self.assertTrue(mask.IsValidForDescriptor(msg_descriptor))
+    # All fields from descriptor
+    mask.AllFieldsFromDescriptor(msg_descriptor)
+    self.assertTrue(mask.IsValidForDescriptor(msg_descriptor))
+    # Child under optional message
     mask.paths.append('optional_nested_message.bb')
     self.assertTrue(mask.IsValidForDescriptor(msg_descriptor))
+    # Repeated field is only allowed in the last position of path
     mask.paths.append('repeated_nested_message.bb')
     self.assertFalse(mask.IsValidForDescriptor(msg_descriptor))
+    # Invalid top level field
+    mask = field_mask_pb2.FieldMask()
+    mask.paths.append('xxx')
+    self.assertFalse(mask.IsValidForDescriptor(msg_descriptor))
+    # Invalid field in root
+    mask = field_mask_pb2.FieldMask()
+    mask.paths.append('xxx.zzz')
+    self.assertFalse(mask.IsValidForDescriptor(msg_descriptor))
+    # Invalid field in internal node
+    mask = field_mask_pb2.FieldMask()
+    mask.paths.append('optional_nested_message.xxx.zzz')
+    self.assertFalse(mask.IsValidForDescriptor(msg_descriptor))
+    # Invalid field in leaf
+    mask = field_mask_pb2.FieldMask()
+    mask.paths.append('optional_nested_message.xxx')
+    self.assertFalse(mask.IsValidForDescriptor(msg_descriptor))
 
   def testCanonicalFrom(self):
     mask = field_mask_pb2.FieldMask()
@@ -389,6 +478,9 @@
     mask2.FromJsonString('foo.bar,bar')
     out_mask.Union(mask1, mask2)
     self.assertEqual('bar,foo.bar,quz', out_mask.ToJsonString())
+    src = unittest_pb2.TestAllTypes()
+    with self.assertRaises(ValueError):
+      out_mask.Union(src, mask2)
 
   def testIntersect(self):
     mask1 = field_mask_pb2.FieldMask()
@@ -502,22 +594,98 @@
     nested_src.payload.repeated_int32.append(1234)
     nested_dst.payload.repeated_int32.append(5678)
     # Repeated fields will be appended by default.
-    mask.FromJsonString('payload.repeated_int32')
+    mask.FromJsonString('payload.repeatedInt32')
     mask.MergeMessage(nested_src, nested_dst)
     self.assertEqual(2, len(nested_dst.payload.repeated_int32))
     self.assertEqual(5678, nested_dst.payload.repeated_int32[0])
     self.assertEqual(1234, nested_dst.payload.repeated_int32[1])
     # Change the behavior to replace repeated fields.
-    mask.FromJsonString('payload.repeated_int32')
+    mask.FromJsonString('payload.repeatedInt32')
     mask.MergeMessage(nested_src, nested_dst, False, True)
     self.assertEqual(1, len(nested_dst.payload.repeated_int32))
     self.assertEqual(1234, nested_dst.payload.repeated_int32[0])
 
+    # Test Merge oneof field.
+    new_msg = unittest_pb2.TestOneof2()
+    dst = unittest_pb2.TestOneof2()
+    dst.foo_message.qux_int = 1
+    mask = field_mask_pb2.FieldMask()
+    mask.FromJsonString('fooMessage,fooLazyMessage.quxInt')
+    mask.MergeMessage(new_msg, dst)
+    self.assertTrue(dst.HasField('foo_message'))
+    self.assertFalse(dst.HasField('foo_lazy_message'))
+
+  def testMergeErrors(self):
+    src = unittest_pb2.TestAllTypes()
+    dst = unittest_pb2.TestAllTypes()
+    mask = field_mask_pb2.FieldMask()
+    test_util.SetAllFields(src)
+    mask.FromJsonString('optionalInt32.field')
+    with self.assertRaises(ValueError) as e:
+      mask.MergeMessage(src, dst)
+    self.assertEqual('Error: Field optional_int32 in message '
+                     'protobuf_unittest.TestAllTypes is not a singular '
+                     'message field and cannot have sub-fields.',
+                     str(e.exception))
+
+  def testSnakeCaseToCamelCase(self):
+    self.assertEqual('fooBar',
+                     well_known_types._SnakeCaseToCamelCase('foo_bar'))
+    self.assertEqual('FooBar',
+                     well_known_types._SnakeCaseToCamelCase('_foo_bar'))
+    self.assertEqual('foo3Bar',
+                     well_known_types._SnakeCaseToCamelCase('foo3_bar'))
+
+    # No uppercase letter is allowed.
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        'Fail to print FieldMask to Json string: Path name Foo must '
+        'not contain uppercase letters.',
+        well_known_types._SnakeCaseToCamelCase,
+        'Foo')
+    # Any character after a "_" must be a lowercase letter.
+    #   1. "_" cannot be followed by another "_".
+    #   2. "_" cannot be followed by a digit.
+    #   3. "_" cannot appear as the last character.
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        'Fail to print FieldMask to Json string: The character after a '
+        '"_" must be a lowercase letter in path name foo__bar.',
+        well_known_types._SnakeCaseToCamelCase,
+        'foo__bar')
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        'Fail to print FieldMask to Json string: The character after a '
+        '"_" must be a lowercase letter in path name foo_3bar.',
+        well_known_types._SnakeCaseToCamelCase,
+        'foo_3bar')
+    self.assertRaisesRegexp(
+        well_known_types.Error,
+        'Fail to print FieldMask to Json string: Trailing "_" in path '
+        'name foo_bar_.',
+        well_known_types._SnakeCaseToCamelCase,
+        'foo_bar_')
+
+  def testCamelCaseToSnakeCase(self):
+    self.assertEqual('foo_bar',
+                     well_known_types._CamelCaseToSnakeCase('fooBar'))
+    self.assertEqual('_foo_bar',
+                     well_known_types._CamelCaseToSnakeCase('FooBar'))
+    self.assertEqual('foo3_bar',
+                     well_known_types._CamelCaseToSnakeCase('foo3Bar'))
+    self.assertRaisesRegexp(
+        well_known_types.ParseError,
+        'Fail to parse FieldMask: Path name foo_bar must not contain "_"s.',
+        well_known_types._CamelCaseToSnakeCase,
+        'foo_bar')
+
 
 class StructTest(unittest.TestCase):
 
   def testStruct(self):
     struct = struct_pb2.Struct()
+    self.assertIsInstance(struct, collections.Mapping)
+    self.assertEqual(0, len(struct))
     struct_class = struct.__class__
 
     struct['key1'] = 5
@@ -525,56 +693,157 @@
     struct['key3'] = True
     struct.get_or_create_struct('key4')['subkey'] = 11.0
     struct_list = struct.get_or_create_list('key5')
+    self.assertIsInstance(struct_list, collections.Sequence)
     struct_list.extend([6, 'seven', True, False, None])
     struct_list.add_struct()['subkey2'] = 9
+    struct['key6'] = {'subkey': {}}
+    struct['key7'] = [2, False]
 
+    self.assertEqual(7, len(struct))
     self.assertTrue(isinstance(struct, well_known_types.Struct))
-    self.assertEquals(5, struct['key1'])
-    self.assertEquals('abc', struct['key2'])
+    self.assertEqual(5, struct['key1'])
+    self.assertEqual('abc', struct['key2'])
     self.assertIs(True, struct['key3'])
-    self.assertEquals(11, struct['key4']['subkey'])
+    self.assertEqual(11, struct['key4']['subkey'])
     inner_struct = struct_class()
     inner_struct['subkey2'] = 9
-    self.assertEquals([6, 'seven', True, False, None, inner_struct],
-                      list(struct['key5'].items()))
+    self.assertEqual([6, 'seven', True, False, None, inner_struct],
+                     list(struct['key5'].items()))
+    self.assertEqual({}, dict(struct['key6']['subkey'].fields))
+    self.assertEqual([2, False], list(struct['key7'].items()))
 
     serialized = struct.SerializeToString()
-
     struct2 = struct_pb2.Struct()
     struct2.ParseFromString(serialized)
 
-    self.assertEquals(struct, struct2)
+    self.assertEqual(struct, struct2)
+    for key, value in struct.items():
+      self.assertIn(key, struct)
+      self.assertIn(key, struct2)
+      self.assertEqual(value, struct2[key])
+
+    self.assertEqual(7, len(struct.keys()))
+    self.assertEqual(7, len(struct.values()))
+    for key in struct.keys():
+      self.assertIn(key, struct)
+      self.assertIn(key, struct2)
+      self.assertEqual(struct[key], struct2[key])
+
+    item = (next(iter(struct.keys())), next(iter(struct.values())))
+    self.assertEqual(item, next(iter(struct.items())))
 
     self.assertTrue(isinstance(struct2, well_known_types.Struct))
-    self.assertEquals(5, struct2['key1'])
-    self.assertEquals('abc', struct2['key2'])
+    self.assertEqual(5, struct2['key1'])
+    self.assertEqual('abc', struct2['key2'])
     self.assertIs(True, struct2['key3'])
-    self.assertEquals(11, struct2['key4']['subkey'])
-    self.assertEquals([6, 'seven', True, False, None, inner_struct],
-                      list(struct2['key5'].items()))
+    self.assertEqual(11, struct2['key4']['subkey'])
+    self.assertEqual([6, 'seven', True, False, None, inner_struct],
+                     list(struct2['key5'].items()))
 
     struct_list = struct2['key5']
-    self.assertEquals(6, struct_list[0])
-    self.assertEquals('seven', struct_list[1])
-    self.assertEquals(True, struct_list[2])
-    self.assertEquals(False, struct_list[3])
-    self.assertEquals(None, struct_list[4])
-    self.assertEquals(inner_struct, struct_list[5])
+    self.assertEqual(6, struct_list[0])
+    self.assertEqual('seven', struct_list[1])
+    self.assertEqual(True, struct_list[2])
+    self.assertEqual(False, struct_list[3])
+    self.assertEqual(None, struct_list[4])
+    self.assertEqual(inner_struct, struct_list[5])
 
     struct_list[1] = 7
-    self.assertEquals(7, struct_list[1])
+    self.assertEqual(7, struct_list[1])
 
     struct_list.add_list().extend([1, 'two', True, False, None])
-    self.assertEquals([1, 'two', True, False, None],
-                      list(struct_list[6].items()))
+    self.assertEqual([1, 'two', True, False, None],
+                     list(struct_list[6].items()))
+    struct_list.extend([{'nested_struct': 30}, ['nested_list', 99], {}, []])
+    self.assertEqual(11, len(struct_list.values))
+    self.assertEqual(30, struct_list[7]['nested_struct'])
+    self.assertEqual('nested_list', struct_list[8][0])
+    self.assertEqual(99, struct_list[8][1])
+    self.assertEqual({}, dict(struct_list[9].fields))
+    self.assertEqual([], list(struct_list[10].items()))
+    struct_list[0] = {'replace': 'set'}
+    struct_list[1] = ['replace', 'set']
+    self.assertEqual('set', struct_list[0]['replace'])
+    self.assertEqual(['replace', 'set'], list(struct_list[1].items()))
 
     text_serialized = str(struct)
     struct3 = struct_pb2.Struct()
     text_format.Merge(text_serialized, struct3)
-    self.assertEquals(struct, struct3)
+    self.assertEqual(struct, struct3)
 
     struct.get_or_create_struct('key3')['replace'] = 12
-    self.assertEquals(12, struct['key3']['replace'])
+    self.assertEqual(12, struct['key3']['replace'])
+
+    # Tests empty list.
+    struct.get_or_create_list('empty_list')
+    empty_list = struct['empty_list']
+    self.assertEqual([], list(empty_list.items()))
+    list2 = struct_pb2.ListValue()
+    list2.add_list()
+    empty_list = list2[0]
+    self.assertEqual([], list(empty_list.items()))
+
+    # Tests empty struct.
+    struct.get_or_create_struct('empty_struct')
+    empty_struct = struct['empty_struct']
+    self.assertEqual({}, dict(empty_struct.fields))
+    list2.add_struct()
+    empty_struct = list2[1]
+    self.assertEqual({}, dict(empty_struct.fields))
+
+    self.assertEqual(9, len(struct))
+    del struct['key3']
+    del struct['key4']
+    self.assertEqual(7, len(struct))
+    self.assertEqual(6, len(struct['key5']))
+    del struct['key5'][1]
+    self.assertEqual(5, len(struct['key5']))
+    self.assertEqual([6, True, False, None, inner_struct],
+                     list(struct['key5'].items()))
+
+  def testMergeFrom(self):
+    struct = struct_pb2.Struct()
+    struct_class = struct.__class__
+
+    dictionary = {
+        'key1': 5,
+        'key2': 'abc',
+        'key3': True,
+        'key4': {'subkey': 11.0},
+        'key5': [6, 'seven', True, False, None, {'subkey2': 9}],
+        'key6': [['nested_list', True]],
+        'empty_struct': {},
+        'empty_list': []
+    }
+    struct.update(dictionary)
+    self.assertEqual(5, struct['key1'])
+    self.assertEqual('abc', struct['key2'])
+    self.assertIs(True, struct['key3'])
+    self.assertEqual(11, struct['key4']['subkey'])
+    inner_struct = struct_class()
+    inner_struct['subkey2'] = 9
+    self.assertEqual([6, 'seven', True, False, None, inner_struct],
+                     list(struct['key5'].items()))
+    self.assertEqual(2, len(struct['key6'][0].values))
+    self.assertEqual('nested_list', struct['key6'][0][0])
+    self.assertEqual(True, struct['key6'][0][1])
+    empty_list = struct['empty_list']
+    self.assertEqual([], list(empty_list.items()))
+    empty_struct = struct['empty_struct']
+    self.assertEqual({}, dict(empty_struct.fields))
+
+    # According to documentation: "When parsing from the wire or when merging,
+    # if there are duplicate map keys the last key seen is used".
+    duplicate = {
+        'key4': {'replace': 20},
+        'key5': [[False, 5]]
+    }
+    struct.update(duplicate)
+    self.assertEqual(1, len(struct['key4'].fields))
+    self.assertEqual(20, struct['key4']['replace'])
+    self.assertEqual(1, len(struct['key5'].values))
+    self.assertEqual(False, struct['key5'][0][0])
+    self.assertEqual(5, struct['key5'][0][1])
 
 
 class AnyTest(unittest.TestCase):
@@ -610,6 +879,14 @@
       raise AttributeError('%s should not have Pack method.' %
                            msg_descriptor.full_name)
 
+  def testMessageName(self):
+    # Creates and sets message.
+    submessage = any_test_pb2.TestAny()
+    submessage.int_value = 12345
+    msg = any_pb2.Any()
+    msg.Pack(submessage)
+    self.assertEqual(msg.TypeName(), 'google.protobuf.internal.TestAny')
+
   def testPackWithCustomTypeUrl(self):
     submessage = any_test_pb2.TestAny()
     submessage.int_value = 12345
@@ -631,6 +908,20 @@
     self.assertTrue(msg.Unpack(unpacked_message))
     self.assertEqual(submessage, unpacked_message)
 
+  def testPackDeterministic(self):
+    submessage = any_test_pb2.TestAny()
+    for i in range(10):
+      submessage.map_value[str(i)] = i * 2
+    msg = any_pb2.Any()
+    msg.Pack(submessage, deterministic=True)
+    serialized = msg.SerializeToString(deterministic=True)
+    golden = (b'\n4type.googleapis.com/google.protobuf.internal.TestAny\x12F'
+              b'\x1a\x05\n\x010\x10\x00\x1a\x05\n\x011\x10\x02\x1a\x05\n\x01'
+              b'2\x10\x04\x1a\x05\n\x013\x10\x06\x1a\x05\n\x014\x10\x08\x1a'
+              b'\x05\n\x015\x10\n\x1a\x05\n\x016\x10\x0c\x1a\x05\n\x017\x10'
+              b'\x0e\x1a\x05\n\x018\x10\x10\x1a\x05\n\x019\x10\x12')
+    self.assertEqual(golden, serialized)
+
 
 if __name__ == '__main__':
   unittest.main()
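The new `_CheckDurationValid` enforces the proto3 Duration bounds that the tests above exercise: seconds within roughly ±10,000 years, nanos strictly less than one second in magnitude, and matching signs. A rough standalone sketch of the same three checks (constant values come from the Duration spec; the helper name here is illustrative, and the real code raises the module's `Error` rather than `ValueError`):

```python
_DURATION_SECONDS_MAX = 315576000000  # ~10,000 years, per the Duration spec
_NANOS_PER_SECOND = 1000000000


def check_duration_valid(seconds, nanos):
    # Seconds must lie in [-315576000000, 315576000000].
    if not -_DURATION_SECONDS_MAX <= seconds <= _DURATION_SECONDS_MAX:
        raise ValueError('seconds %d out of range' % seconds)
    # Nanos must be a strict fraction of one second: (-1e9, 1e9).
    if not -_NANOS_PER_SECOND < nanos < _NANOS_PER_SECOND:
        raise ValueError('nanos %d out of range' % nanos)
    # Seconds and nanos must agree in sign (zero is compatible with either).
    if (nanos < 0 < seconds) or (seconds < 0 < nanos):
        raise ValueError('sign mismatch between seconds and nanos')
```

Because the diff calls this from both `ToJsonString` and `FromJsonString`, an out-of-range Duration now fails on serialization as well as parsing, which is what the new `message.seconds = -315576000001` test cases verify.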
diff --git a/python/google/protobuf/internal/wire_format_test.py b/python/google/protobuf/internal/wire_format_test.py
index f659d18..da120f3 100755
--- a/python/google/protobuf/internal/wire_format_test.py
+++ b/python/google/protobuf/internal/wire_format_test.py
@@ -35,9 +35,10 @@
 __author__ = 'robinson@google.com (Will Robinson)'
 
 try:
-  import unittest2 as unittest
+  import unittest2 as unittest  #PY26
 except ImportError:
   import unittest
+
 from google.protobuf import message
 from google.protobuf.internal import wire_format
 
diff --git a/python/google/protobuf/json_format.py b/python/google/protobuf/json_format.py
index 23382bd..8d338d3 100644
--- a/python/google/protobuf/json_format.py
+++ b/python/google/protobuf/json_format.py
@@ -42,15 +42,28 @@
 
 __author__ = 'jieluo@google.com (Jie Luo)'
 
+# pylint: disable=g-statement-before-imports,g-import-not-at-top
+try:
+  from collections import OrderedDict
+except ImportError:
+  from ordereddict import OrderedDict  # PY26
+# pylint: enable=g-statement-before-imports,g-import-not-at-top
+
 import base64
 import json
 import math
-import six
+
+from operator import methodcaller
+
+import re
 import sys
 
+import six
+
 from google.protobuf import descriptor
 from google.protobuf import symbol_database
 
+
 _TIMESTAMPFOMAT = '%Y-%m-%dT%H:%M:%S'
 _INT_TYPES = frozenset([descriptor.FieldDescriptor.CPPTYPE_INT32,
                         descriptor.FieldDescriptor.CPPTYPE_UINT32,
@@ -64,6 +77,12 @@
 _NEG_INFINITY = '-Infinity'
 _NAN = 'NaN'
 
+_UNPAIRED_SURROGATE_PATTERN = re.compile(six.u(
+    r'[\ud800-\udbff](?![\udc00-\udfff])|(?<![\ud800-\udbff])[\udc00-\udfff]'
+))
+
+_VALID_EXTENSION_NAME = re.compile(r'\[[a-zA-Z0-9\._]*\]$')
+
 
 class Error(Exception):
   """Top-level module error for json_format."""
@@ -77,7 +96,12 @@
   """Thrown in case of parsing error."""
 
 
-def MessageToJson(message, including_default_value_fields=False):
+def MessageToJson(message,
+                  including_default_value_fields=False,
+                  preserving_proto_field_name=False,
+                  indent=2,
+                  sort_keys=False,
+                  use_integers_for_enums=False):
   """Converts protobuf message to JSON format.
 
   Args:
@@ -86,26 +110,50 @@
         repeated fields, and map fields will always be serialized.  If
         False, only serialize non-empty fields.  Singular message fields
         and oneof fields are not affected by this option.
+    preserving_proto_field_name: If True, use the original proto field
+        names as defined in the .proto file. If False, convert the field
+        names to lowerCamelCase.
+    indent: The JSON object will be pretty-printed with this indent level.
+        An indent level of 0 or negative will only insert newlines.
+    sort_keys: If True, then the output will be sorted by field names.
+    use_integers_for_enums: If true, print integers instead of enum names.
 
   Returns:
     A string containing the JSON formatted protocol buffer message.
   """
-  js = _MessageToJsonObject(message, including_default_value_fields)
-  return json.dumps(js, indent=2)
+  printer = _Printer(including_default_value_fields,
+                     preserving_proto_field_name,
+                     use_integers_for_enums)
+  return printer.ToJsonString(message, indent, sort_keys)
 
 
-def _MessageToJsonObject(message, including_default_value_fields):
-  """Converts message to an object according to Proto3 JSON Specification."""
-  message_descriptor = message.DESCRIPTOR
-  full_name = message_descriptor.full_name
-  if _IsWrapperMessage(message_descriptor):
-    return _WrapperMessageToJsonObject(message)
-  if full_name in _WKTJSONMETHODS:
-    return _WKTJSONMETHODS[full_name][0](
-        message, including_default_value_fields)
-  js = {}
-  return _RegularMessageToJsonObject(
-      message, js, including_default_value_fields)
+def MessageToDict(message,
+                  including_default_value_fields=False,
+                  preserving_proto_field_name=False,
+                  use_integers_for_enums=False):
+  """Converts protobuf message to a dictionary.
+
+  When the dictionary is encoded to JSON, it conforms to proto3 JSON spec.
+
+  Args:
+    message: The protocol buffers message instance to serialize.
+    including_default_value_fields: If True, singular primitive fields,
+        repeated fields, and map fields will always be serialized.  If
+        False, only serialize non-empty fields.  Singular message fields
+        and oneof fields are not affected by this option.
+    preserving_proto_field_name: If True, use the original proto field
+        names as defined in the .proto file. If False, convert the field
+        names to lowerCamelCase.
+    use_integers_for_enums: If true, print integers instead of enum names.
+
+  Returns:
+    A dict representation of the protocol buffer message.
+  """
+  printer = _Printer(including_default_value_fields,
+                     preserving_proto_field_name,
+                     use_integers_for_enums)
+  # pylint: disable=protected-access
+  return printer._MessageToJsonObject(message)
 
 
 def _IsMapEntry(field):
@@ -114,114 +162,208 @@
           field.message_type.GetOptions().map_entry)
 
 
-def _RegularMessageToJsonObject(message, js, including_default_value_fields):
-  """Converts normal message according to Proto3 JSON Specification."""
-  fields = message.ListFields()
-  include_default = including_default_value_fields
+class _Printer(object):
+  """JSON format printer for protocol message."""
 
-  try:
-    for field, value in fields:
-      name = field.camelcase_name
-      if _IsMapEntry(field):
-        # Convert a map field.
-        v_field = field.message_type.fields_by_name['value']
-        js_map = {}
-        for key in value:
-          if isinstance(key, bool):
-            if key:
-              recorded_key = 'true'
-            else:
-              recorded_key = 'false'
-          else:
-            recorded_key = key
-          js_map[recorded_key] = _FieldToJsonObject(
-              v_field, value[key], including_default_value_fields)
-        js[name] = js_map
-      elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
-        # Convert a repeated field.
-        js[name] = [_FieldToJsonObject(field, k, include_default)
-                    for k in value]
-      else:
-        js[name] = _FieldToJsonObject(field, value, include_default)
+  def __init__(self,
+               including_default_value_fields=False,
+               preserving_proto_field_name=False,
+               use_integers_for_enums=False):
+    self.including_default_value_fields = including_default_value_fields
+    self.preserving_proto_field_name = preserving_proto_field_name
+    self.use_integers_for_enums = use_integers_for_enums
 
-    # Serialize default value if including_default_value_fields is True.
-    if including_default_value_fields:
-      message_descriptor = message.DESCRIPTOR
-      for field in message_descriptor.fields:
-        # Singular message fields and oneof fields will not be affected.
-        if ((field.label != descriptor.FieldDescriptor.LABEL_REPEATED and
-             field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE) or
-            field.containing_oneof):
-          continue
-        name = field.camelcase_name
-        if name in js:
-          # Skip the field which has been serailized already.
-          continue
-        if _IsMapEntry(field):
-          js[name] = {}
-        elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
-          js[name] = []
+  def ToJsonString(self, message, indent, sort_keys):
+    js = self._MessageToJsonObject(message)
+    return json.dumps(js, indent=indent, sort_keys=sort_keys)
+
+  def _MessageToJsonObject(self, message):
+    """Converts message to an object according to Proto3 JSON Specification."""
+    message_descriptor = message.DESCRIPTOR
+    full_name = message_descriptor.full_name
+    if _IsWrapperMessage(message_descriptor):
+      return self._WrapperMessageToJsonObject(message)
+    if full_name in _WKTJSONMETHODS:
+      return methodcaller(_WKTJSONMETHODS[full_name][0], message)(self)
+    js = {}
+    return self._RegularMessageToJsonObject(message, js)
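The `_WKTJSONMETHODS` table now stores method *names* rather than functions, so dispatch goes through `operator.methodcaller`: arguments are bound first, the receiver (`self`) last. A minimal standalone illustration, with `Greeter` as a made-up class:

```python
from operator import methodcaller

class Greeter(object):
    def hello(self, name):
        return 'hello ' + name

# methodcaller('hello', 'world')(obj) calls obj.hello('world'); this is the
# shape of methodcaller(_WKTJSONMETHODS[full_name][0], message)(self) above.
call = methodcaller('hello', 'world')
```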
+
+  def _RegularMessageToJsonObject(self, message, js):
+    """Converts normal message according to Proto3 JSON Specification."""
+    fields = message.ListFields()
+
+    try:
+      for field, value in fields:
+        if self.preserving_proto_field_name:
+          name = field.name
         else:
-          js[name] = _FieldToJsonObject(field, field.default_value)
+          name = field.json_name
+        if _IsMapEntry(field):
+          # Convert a map field.
+          v_field = field.message_type.fields_by_name['value']
+          js_map = {}
+          for key in value:
+            if isinstance(key, bool):
+              if key:
+                recorded_key = 'true'
+              else:
+                recorded_key = 'false'
+            else:
+              recorded_key = key
+            js_map[recorded_key] = self._FieldToJsonObject(
+                v_field, value[key])
+          js[name] = js_map
+        elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
+          # Convert a repeated field.
+          js[name] = [self._FieldToJsonObject(field, k)
+                      for k in value]
+        elif field.is_extension:
+          f = field
+          if (f.containing_type.GetOptions().message_set_wire_format and
+              f.type == descriptor.FieldDescriptor.TYPE_MESSAGE and
+              f.label == descriptor.FieldDescriptor.LABEL_OPTIONAL):
+            f = f.message_type
+          name = '[%s.%s]' % (f.full_name, name)
+          js[name] = self._FieldToJsonObject(field, value)
+        else:
+          js[name] = self._FieldToJsonObject(field, value)
 
-  except ValueError as e:
-    raise SerializeToJsonError(
-        'Failed to serialize {0} field: {1}.'.format(field.name, e))
+      # Serialize default value if including_default_value_fields is True.
+      if self.including_default_value_fields:
+        message_descriptor = message.DESCRIPTOR
+        for field in message_descriptor.fields:
+          # Singular message fields and oneof fields will not be affected.
+          if ((field.label != descriptor.FieldDescriptor.LABEL_REPEATED and
+               field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE) or
+              field.containing_oneof):
+            continue
+          if self.preserving_proto_field_name:
+            name = field.name
+          else:
+            name = field.json_name
+          if name in js:
+            # Skip fields that have already been serialized.
+            continue
+          if _IsMapEntry(field):
+            js[name] = {}
+          elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
+            js[name] = []
+          else:
+            js[name] = self._FieldToJsonObject(field, field.default_value)
 
-  return js
+    except ValueError as e:
+      raise SerializeToJsonError(
+          'Failed to serialize {0} field: {1}.'.format(field.name, e))
 
+    return js
 
-def _FieldToJsonObject(
-    field, value, including_default_value_fields=False):
-  """Converts field value according to Proto3 JSON Specification."""
-  if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
-    return _MessageToJsonObject(value, including_default_value_fields)
-  elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_ENUM:
-    enum_value = field.enum_type.values_by_number.get(value, None)
-    if enum_value is not None:
-      return enum_value.name
-    else:
-      raise SerializeToJsonError('Enum field contains an integer value '
-                                 'which can not mapped to an enum value.')
-  elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_STRING:
-    if field.type == descriptor.FieldDescriptor.TYPE_BYTES:
-      # Use base64 Data encoding for bytes
-      return base64.b64encode(value).decode('utf-8')
-    else:
-      return value
-  elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_BOOL:
-    return bool(value)
-  elif field.cpp_type in _INT64_TYPES:
-    return str(value)
-  elif field.cpp_type in _FLOAT_TYPES:
-    if math.isinf(value):
-      if value < 0.0:
-        return _NEG_INFINITY
+  def _FieldToJsonObject(self, field, value):
+    """Converts field value according to Proto3 JSON Specification."""
+    if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+      return self._MessageToJsonObject(value)
+    elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_ENUM:
+      if self.use_integers_for_enums:
+        return value
+      enum_value = field.enum_type.values_by_number.get(value, None)
+      if enum_value is not None:
+        return enum_value.name
       else:
-        return _INFINITY
-    if math.isnan(value):
-      return _NAN
-  return value
+        if field.file.syntax == 'proto3':
+          return value
+        raise SerializeToJsonError('Enum field contains an integer value '
+                                   'which cannot be mapped to an enum value.')
+    elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_STRING:
+      if field.type == descriptor.FieldDescriptor.TYPE_BYTES:
+        # Use base64 Data encoding for bytes
+        return base64.b64encode(value).decode('utf-8')
+      else:
+        return value
+    elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_BOOL:
+      return bool(value)
+    elif field.cpp_type in _INT64_TYPES:
+      return str(value)
+    elif field.cpp_type in _FLOAT_TYPES:
+      if math.isinf(value):
+        if value < 0.0:
+          return _NEG_INFINITY
+        else:
+          return _INFINITY
+      if math.isnan(value):
+        return _NAN
+    return value
+
+  def _AnyMessageToJsonObject(self, message):
+    """Converts Any message according to Proto3 JSON Specification."""
+    if not message.ListFields():
+      return {}
+    # '@type' must be printed first, so use OrderedDict instead of {}.
+    js = OrderedDict()
+    type_url = message.type_url
+    js['@type'] = type_url
+    sub_message = _CreateMessageFromTypeUrl(type_url)
+    sub_message.ParseFromString(message.value)
+    message_descriptor = sub_message.DESCRIPTOR
+    full_name = message_descriptor.full_name
+    if _IsWrapperMessage(message_descriptor):
+      js['value'] = self._WrapperMessageToJsonObject(sub_message)
+      return js
+    if full_name in _WKTJSONMETHODS:
+      js['value'] = methodcaller(_WKTJSONMETHODS[full_name][0],
+                                 sub_message)(self)
+      return js
+    return self._RegularMessageToJsonObject(sub_message, js)
+
+  def _GenericMessageToJsonObject(self, message):
+    """Converts message according to Proto3 JSON Specification."""
+    # Duration, Timestamp and FieldMask have a ToJsonString method to do the
+    # conversion. Users can also call the method directly.
+    return message.ToJsonString()
+
+  def _ValueMessageToJsonObject(self, message):
+    """Converts Value message according to Proto3 JSON Specification."""
+    which = message.WhichOneof('kind')
+    # If the Value message is not set, treat it as null_value when serializing
+    # to JSON. Note that the parsed-back result will then differ from the
+    # original message.
+    if which is None or which == 'null_value':
+      return None
+    if which == 'list_value':
+      return self._ListValueMessageToJsonObject(message.list_value)
+    if which == 'struct_value':
+      value = message.struct_value
+    else:
+      value = getattr(message, which)
+    oneof_descriptor = message.DESCRIPTOR.fields_by_name[which]
+    return self._FieldToJsonObject(oneof_descriptor, value)
+
+  def _ListValueMessageToJsonObject(self, message):
+    """Converts ListValue message according to Proto3 JSON Specification."""
+    return [self._ValueMessageToJsonObject(value)
+            for value in message.values]
+
+  def _StructMessageToJsonObject(self, message):
+    """Converts Struct message according to Proto3 JSON Specification."""
+    fields = message.fields
+    ret = {}
+    for key in fields:
+      ret[key] = self._ValueMessageToJsonObject(fields[key])
+    return ret
+
+  def _WrapperMessageToJsonObject(self, message):
+    return self._FieldToJsonObject(
+        message.DESCRIPTOR.fields_by_name['value'], message.value)
 
 
-def _AnyMessageToJsonObject(message, including_default):
-  """Converts Any message according to Proto3 JSON Specification."""
-  if not message.ListFields():
-    return {}
-  js = {}
-  type_url = message.type_url
-  js['@type'] = type_url
-  sub_message = _CreateMessageFromTypeUrl(type_url)
-  sub_message.ParseFromString(message.value)
-  message_descriptor = sub_message.DESCRIPTOR
-  full_name = message_descriptor.full_name
-  if _IsWrapperMessage(message_descriptor):
-    js['value'] = _WrapperMessageToJsonObject(sub_message)
-    return js
-  if full_name in _WKTJSONMETHODS:
-    js['value'] = _WKTJSONMETHODS[full_name][0](sub_message, including_default)
-    return js
-  return _RegularMessageToJsonObject(sub_message, js, including_default)
+def _IsWrapperMessage(message_descriptor):
+  return message_descriptor.file.name == 'google/protobuf/wrappers.proto'
+
+
+def _DuplicateChecker(js):
+  result = {}
+  for name, value in js:
+    if name in result:
+      raise ParseError('Failed to load JSON: duplicate key {0}.'.format(name))
+    result[name] = value
+  return result
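`_DuplicateChecker` is meant to be passed as `json.loads`' `object_pairs_hook`, which receives the raw key/value pairs before they collapse into a dict. A self-contained sketch of the same shape, using `ValueError` in place of the module's `ParseError`:

```python
import json

def duplicate_checker(pairs):
    # json.loads hands the hook every (key, value) pair in order, so a
    # repeated key can be rejected before dict construction hides it.
    result = {}
    for name, value in pairs:
        if name in result:
            raise ValueError('Failed to load JSON: duplicate key {0}.'.format(name))
        result[name] = value
    return result

ok = json.loads('{"a": 1, "b": 2}', object_pairs_hook=duplicate_checker)
```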
 
 
 def _CreateMessageFromTypeUrl(type_url):
@@ -238,69 +380,13 @@
   return message_class()
 
 
-def _GenericMessageToJsonObject(message, unused_including_default):
-  """Converts message by ToJsonString according to Proto3 JSON Specification."""
-  # Duration, Timestamp and FieldMask have ToJsonString method to do the
-  # convert. Users can also call the method directly.
-  return message.ToJsonString()
-
-
-def _ValueMessageToJsonObject(message, unused_including_default=False):
-  """Converts Value message according to Proto3 JSON Specification."""
-  which = message.WhichOneof('kind')
-  # If the Value message is not set treat as null_value when serialize
-  # to JSON. The parse back result will be different from original message.
-  if which is None or which == 'null_value':
-    return None
-  if which == 'list_value':
-    return _ListValueMessageToJsonObject(message.list_value)
-  if which == 'struct_value':
-    value = message.struct_value
-  else:
-    value = getattr(message, which)
-  oneof_descriptor = message.DESCRIPTOR.fields_by_name[which]
-  return _FieldToJsonObject(oneof_descriptor, value)
-
-
-def _ListValueMessageToJsonObject(message, unused_including_default=False):
-  """Converts ListValue message according to Proto3 JSON Specification."""
-  return [_ValueMessageToJsonObject(value)
-          for value in message.values]
-
-
-def _StructMessageToJsonObject(message, unused_including_default=False):
-  """Converts Struct message according to Proto3 JSON Specification."""
-  fields = message.fields
-  js = {}
-  for key in fields.keys():
-    js[key] = _ValueMessageToJsonObject(fields[key])
-  return js
-
-
-def _IsWrapperMessage(message_descriptor):
-  return message_descriptor.file.name == 'google/protobuf/wrappers.proto'
-
-
-def _WrapperMessageToJsonObject(message):
-  return _FieldToJsonObject(
-      message.DESCRIPTOR.fields_by_name['value'], message.value)
-
-
-def _DuplicateChecker(js):
-  result = {}
-  for name, value in js:
-    if name in result:
-      raise ParseError('Failed to load JSON: duplicate key {0}.'.format(name))
-    result[name] = value
-  return result
-
-
-def Parse(text, message):
+def Parse(text, message, ignore_unknown_fields=False):
   """Parses a JSON representation of a protocol message into a message.
 
   Args:
     text: Message JSON representation.
-    message: A protocol beffer message to merge into.
+    message: A protocol buffer message to merge into.
+    ignore_unknown_fields: If True, do not raise errors for unknown fields.
 
   Returns:
     The same message passed as argument.
@@ -317,213 +403,255 @@
       js = json.loads(text, object_pairs_hook=_DuplicateChecker)
   except ValueError as e:
     raise ParseError('Failed to load JSON: {0}.'.format(str(e)))
-  _ConvertMessage(js, message)
+  return ParseDict(js, message, ignore_unknown_fields)
+
+
+def ParseDict(js_dict, message, ignore_unknown_fields=False):
+  """Parses a JSON dictionary representation into a message.
+
+  Args:
+    js_dict: Dict representation of a JSON message.
+    message: A protocol buffer message to merge into.
+    ignore_unknown_fields: If True, do not raise errors for unknown fields.
+
+  Returns:
+    The same message passed as argument.
+  """
+  parser = _Parser(ignore_unknown_fields)
+  parser.ConvertMessage(js_dict, message)
   return message
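The new `ignore_unknown_fields` flag changes only the lookup step in `_ConvertFieldValuePair`: an unrecognized name is skipped instead of raising. That logic can be sketched without protobuf; `convert_pairs` and `known` are illustrative names only:

```python
def convert_pairs(js, known, ignore_unknown_fields=False):
    # Mirrors the field lookup in _ConvertFieldValuePair: unknown names
    # either raise or are silently dropped, depending on the flag.
    out = {}
    for name, value in js.items():
        if name not in known:
            if ignore_unknown_fields:
                continue
            raise ValueError('Message has no field named "{0}".'.format(name))
        out[name] = value
    return out
```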
 
 
-def _ConvertFieldValuePair(js, message):
-  """Convert field value pairs into regular message.
-
-  Args:
-    js: A JSON object to convert the field value pairs.
-    message: A regular protocol message to record the data.
-
-  Raises:
-    ParseError: In case of problems converting.
-  """
-  names = []
-  message_descriptor = message.DESCRIPTOR
-  for name in js:
-    try:
-      field = message_descriptor.fields_by_camelcase_name.get(name, None)
-      if not field:
-        raise ParseError(
-            'Message type "{0}" has no field named "{1}".'.format(
-                message_descriptor.full_name, name))
-      if name in names:
-        raise ParseError(
-            'Message type "{0}" should not have multiple "{1}" fields.'.format(
-                message.DESCRIPTOR.full_name, name))
-      names.append(name)
-      # Check no other oneof field is parsed.
-      if field.containing_oneof is not None:
-        oneof_name = field.containing_oneof.name
-        if oneof_name in names:
-          raise ParseError('Message type "{0}" should not have multiple "{1}" '
-                           'oneof fields.'.format(
-                               message.DESCRIPTOR.full_name, oneof_name))
-        names.append(oneof_name)
-
-      value = js[name]
-      if value is None:
-        message.ClearField(field.name)
-        continue
-
-      # Parse field value.
-      if _IsMapEntry(field):
-        message.ClearField(field.name)
-        _ConvertMapFieldValue(value, message, field)
-      elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
-        message.ClearField(field.name)
-        if not isinstance(value, list):
-          raise ParseError('repeated field {0} must be in [] which is '
-                           '{1}.'.format(name, value))
-        if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
-          # Repeated message field.
-          for item in value:
-            sub_message = getattr(message, field.name).add()
-            # None is a null_value in Value.
-            if (item is None and
-                sub_message.DESCRIPTOR.full_name != 'google.protobuf.Value'):
-              raise ParseError('null is not allowed to be used as an element'
-                               ' in a repeated field.')
-            _ConvertMessage(item, sub_message)
-        else:
-          # Repeated scalar field.
-          for item in value:
-            if item is None:
-              raise ParseError('null is not allowed to be used as an element'
-                               ' in a repeated field.')
-            getattr(message, field.name).append(
-                _ConvertScalarFieldValue(item, field))
-      elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
-        sub_message = getattr(message, field.name)
-        _ConvertMessage(value, sub_message)
-      else:
-        setattr(message, field.name, _ConvertScalarFieldValue(value, field))
-    except ParseError as e:
-      if field and field.containing_oneof is None:
-        raise ParseError('Failed to parse {0} field: {1}'.format(name, e))
-      else:
-        raise ParseError(str(e))
-    except ValueError as e:
-      raise ParseError('Failed to parse {0} field: {1}.'.format(name, e))
-    except TypeError as e:
-      raise ParseError('Failed to parse {0} field: {1}.'.format(name, e))
-
-
-def _ConvertMessage(value, message):
-  """Convert a JSON object into a message.
-
-  Args:
-    value: A JSON object.
-    message: A WKT or regular protocol message to record the data.
-
-  Raises:
-    ParseError: In case of convert problems.
-  """
-  message_descriptor = message.DESCRIPTOR
-  full_name = message_descriptor.full_name
-  if _IsWrapperMessage(message_descriptor):
-    _ConvertWrapperMessage(value, message)
-  elif full_name in _WKTJSONMETHODS:
-    _WKTJSONMETHODS[full_name][1](value, message)
-  else:
-    _ConvertFieldValuePair(value, message)
-
-
-def _ConvertAnyMessage(value, message):
-  """Convert a JSON representation into Any message."""
-  if isinstance(value, dict) and not value:
-    return
-  try:
-    type_url = value['@type']
-  except KeyError:
-    raise ParseError('@type is missing when parsing any message.')
-
-  sub_message = _CreateMessageFromTypeUrl(type_url)
-  message_descriptor = sub_message.DESCRIPTOR
-  full_name = message_descriptor.full_name
-  if _IsWrapperMessage(message_descriptor):
-    _ConvertWrapperMessage(value['value'], sub_message)
-  elif full_name in _WKTJSONMETHODS:
-    _WKTJSONMETHODS[full_name][1](value['value'], sub_message)
-  else:
-    del value['@type']
-    _ConvertFieldValuePair(value, sub_message)
-  # Sets Any message
-  message.value = sub_message.SerializeToString()
-  message.type_url = type_url
-
-
-def _ConvertGenericMessage(value, message):
-  """Convert a JSON representation into message with FromJsonString."""
-  # Durantion, Timestamp, FieldMask have FromJsonString method to do the
-  # convert. Users can also call the method directly.
-  message.FromJsonString(value)
-
-
 _INT_OR_FLOAT = six.integer_types + (float,)
 
 
-def _ConvertValueMessage(value, message):
-  """Convert a JSON representation into Value message."""
-  if isinstance(value, dict):
-    _ConvertStructMessage(value, message.struct_value)
-  elif isinstance(value, list):
-    _ConvertListValueMessage(value, message.list_value)
-  elif value is None:
-    message.null_value = 0
-  elif isinstance(value, bool):
-    message.bool_value = value
-  elif isinstance(value, six.string_types):
-    message.string_value = value
-  elif isinstance(value, _INT_OR_FLOAT):
-    message.number_value = value
-  else:
-    raise ParseError('Unexpected type for Value message.')
+class _Parser(object):
+  """JSON format parser for protocol message."""
 
+  def __init__(self,
+               ignore_unknown_fields):
+    self.ignore_unknown_fields = ignore_unknown_fields
 
-def _ConvertListValueMessage(value, message):
-  """Convert a JSON representation into ListValue message."""
-  if not isinstance(value, list):
-    raise ParseError(
-        'ListValue must be in [] which is {0}.'.format(value))
-  message.ClearField('values')
-  for item in value:
-    _ConvertValueMessage(item, message.values.add())
+  def ConvertMessage(self, value, message):
+    """Convert a JSON object into a message.
 
+    Args:
+      value: A JSON object.
+      message: A WKT or regular protocol message to record the data.
 
-def _ConvertStructMessage(value, message):
-  """Convert a JSON representation into Struct message."""
-  if not isinstance(value, dict):
-    raise ParseError(
-        'Struct must be in a dict which is {0}.'.format(value))
-  for key in value:
-    _ConvertValueMessage(value[key], message.fields[key])
-  return
-
-
-def _ConvertWrapperMessage(value, message):
-  """Convert a JSON representation into Wrapper message."""
-  field = message.DESCRIPTOR.fields_by_name['value']
-  setattr(message, 'value', _ConvertScalarFieldValue(value, field))
-
-
-def _ConvertMapFieldValue(value, message, field):
-  """Convert map field value for a message map field.
-
-  Args:
-    value: A JSON object to convert the map field value.
-    message: A protocol message to record the converted data.
-    field: The descriptor of the map field to be converted.
-
-  Raises:
-    ParseError: In case of convert problems.
-  """
-  if not isinstance(value, dict):
-    raise ParseError(
-        'Map field {0} must be in a dict which is {1}.'.format(
-            field.name, value))
-  key_field = field.message_type.fields_by_name['key']
-  value_field = field.message_type.fields_by_name['value']
-  for key in value:
-    key_value = _ConvertScalarFieldValue(key, key_field, True)
-    if value_field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
-      _ConvertMessage(value[key], getattr(message, field.name)[key_value])
+    Raises:
+      ParseError: In case of convert problems.
+    """
+    message_descriptor = message.DESCRIPTOR
+    full_name = message_descriptor.full_name
+    if _IsWrapperMessage(message_descriptor):
+      self._ConvertWrapperMessage(value, message)
+    elif full_name in _WKTJSONMETHODS:
+      methodcaller(_WKTJSONMETHODS[full_name][1], value, message)(self)
     else:
-      getattr(message, field.name)[key_value] = _ConvertScalarFieldValue(
-          value[key], value_field)
+      self._ConvertFieldValuePair(value, message)
+
+  def _ConvertFieldValuePair(self, js, message):
+    """Convert field value pairs into regular message.
+
+    Args:
+      js: A JSON object to convert the field value pairs.
+      message: A regular protocol message to record the data.
+
+    Raises:
+      ParseError: In case of problems converting.
+    """
+    names = []
+    message_descriptor = message.DESCRIPTOR
+    fields_by_json_name = dict((f.json_name, f)
+                               for f in message_descriptor.fields)
+    for name in js:
+      try:
+        field = fields_by_json_name.get(name, None)
+        if not field:
+          field = message_descriptor.fields_by_name.get(name, None)
+        if not field and _VALID_EXTENSION_NAME.match(name):
+          if not message_descriptor.is_extendable:
+            raise ParseError('Message type {0} does not have extensions'.format(
+                message_descriptor.full_name))
+          identifier = name[1:-1]  # strip [] brackets
+          identifier = '.'.join(identifier.split('.')[:-1])
+          # pylint: disable=protected-access
+          field = message.Extensions._FindExtensionByName(identifier)
+          # pylint: enable=protected-access
+        if not field:
+          if self.ignore_unknown_fields:
+            continue
+          raise ParseError(
+              ('Message type "{0}" has no field named "{1}".\n'
+               ' Available fields (except extensions): {2}').format(
+                   message_descriptor.full_name, name,
+                   message_descriptor.fields))
+        if name in names:
+          raise ParseError('Message type "{0}" should not have multiple '
+                           '"{1}" fields.'.format(
+                               message.DESCRIPTOR.full_name, name))
+        names.append(name)
+        # Check no other oneof field is parsed.
+        if field.containing_oneof is not None:
+          oneof_name = field.containing_oneof.name
+          if oneof_name in names:
+            raise ParseError('Message type "{0}" should not have multiple '
+                             '"{1}" oneof fields.'.format(
+                                 message.DESCRIPTOR.full_name, oneof_name))
+          names.append(oneof_name)
+
+        value = js[name]
+        if value is None:
+          if (field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE
+              and field.message_type.full_name == 'google.protobuf.Value'):
+            sub_message = getattr(message, field.name)
+            sub_message.null_value = 0
+          else:
+            message.ClearField(field.name)
+          continue
+
+        # Parse field value.
+        if _IsMapEntry(field):
+          message.ClearField(field.name)
+          self._ConvertMapFieldValue(value, message, field)
+        elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
+          message.ClearField(field.name)
+          if not isinstance(value, list):
+            raise ParseError('repeated field {0} must be in [] which is '
+                             '{1}.'.format(name, value))
+          if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+            # Repeated message field.
+            for item in value:
+              sub_message = getattr(message, field.name).add()
+              # None is a null_value in Value.
+              if (item is None and
+                  sub_message.DESCRIPTOR.full_name != 'google.protobuf.Value'):
+                raise ParseError('null is not allowed to be used as an element'
+                                 ' in a repeated field.')
+              self.ConvertMessage(item, sub_message)
+          else:
+            # Repeated scalar field.
+            for item in value:
+              if item is None:
+                raise ParseError('null is not allowed to be used as an element'
+                                 ' in a repeated field.')
+              getattr(message, field.name).append(
+                  _ConvertScalarFieldValue(item, field))
+        elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+          if field.is_extension:
+            sub_message = message.Extensions[field]
+          else:
+            sub_message = getattr(message, field.name)
+          sub_message.SetInParent()
+          self.ConvertMessage(value, sub_message)
+        else:
+          setattr(message, field.name, _ConvertScalarFieldValue(value, field))
+      except ParseError as e:
+        if field and field.containing_oneof is None:
+          raise ParseError('Failed to parse {0} field: {1}'.format(name, e))
+        else:
+          raise ParseError(str(e))
+      except ValueError as e:
+        raise ParseError('Failed to parse {0} field: {1}.'.format(name, e))
+      except TypeError as e:
+        raise ParseError('Failed to parse {0} field: {1}.'.format(name, e))
+
+  def _ConvertAnyMessage(self, value, message):
+    """Convert a JSON representation into Any message."""
+    if isinstance(value, dict) and not value:
+      return
+    try:
+      type_url = value['@type']
+    except KeyError:
+      raise ParseError('@type is missing when parsing any message.')
+
+    sub_message = _CreateMessageFromTypeUrl(type_url)
+    message_descriptor = sub_message.DESCRIPTOR
+    full_name = message_descriptor.full_name
+    if _IsWrapperMessage(message_descriptor):
+      self._ConvertWrapperMessage(value['value'], sub_message)
+    elif full_name in _WKTJSONMETHODS:
+      methodcaller(
+          _WKTJSONMETHODS[full_name][1], value['value'], sub_message)(self)
+    else:
+      del value['@type']
+      self._ConvertFieldValuePair(value, sub_message)
+    # Sets Any message
+    message.value = sub_message.SerializeToString()
+    message.type_url = type_url
+
+  def _ConvertGenericMessage(self, value, message):
+    """Convert a JSON representation into message with FromJsonString."""
+    # Duration, Timestamp, FieldMask have a FromJsonString method to do the
+    # conversion. Users can also call the method directly.
+    message.FromJsonString(value)
+
+  def _ConvertValueMessage(self, value, message):
+    """Convert a JSON representation into Value message."""
+    if isinstance(value, dict):
+      self._ConvertStructMessage(value, message.struct_value)
+    elif isinstance(value, list):
+      self._ConvertListValueMessage(value, message.list_value)
+    elif value is None:
+      message.null_value = 0
+    elif isinstance(value, bool):
+      message.bool_value = value
+    elif isinstance(value, six.string_types):
+      message.string_value = value
+    elif isinstance(value, _INT_OR_FLOAT):
+      message.number_value = value
+    else:
+      raise ParseError('Unexpected type for Value message.')
+
+  def _ConvertListValueMessage(self, value, message):
+    """Convert a JSON representation into ListValue message."""
+    if not isinstance(value, list):
+      raise ParseError(
+          'ListValue must be in [] which is {0}.'.format(value))
+    message.ClearField('values')
+    for item in value:
+      self._ConvertValueMessage(item, message.values.add())
+
+  def _ConvertStructMessage(self, value, message):
+    """Convert a JSON representation into Struct message."""
+    if not isinstance(value, dict):
+      raise ParseError(
+          'Struct must be in a dict which is {0}.'.format(value))
+    for key in value:
+      self._ConvertValueMessage(value[key], message.fields[key])
+    return
+
+  def _ConvertWrapperMessage(self, value, message):
+    """Convert a JSON representation into Wrapper message."""
+    field = message.DESCRIPTOR.fields_by_name['value']
+    setattr(message, 'value', _ConvertScalarFieldValue(value, field))
+
+  def _ConvertMapFieldValue(self, value, message, field):
+    """Convert map field value for a message map field.
+
+    Args:
+      value: A JSON object to convert the map field value.
+      message: A protocol message to record the converted data.
+      field: The descriptor of the map field to be converted.
+
+    Raises:
+      ParseError: In case of convert problems.
+    """
+    if not isinstance(value, dict):
+      raise ParseError(
+          'Map field {0} must be in a dict which is {1}.'.format(
+              field.name, value))
+    key_field = field.message_type.fields_by_name['key']
+    value_field = field.message_type.fields_by_name['value']
+    for key in value:
+      key_value = _ConvertScalarFieldValue(key, key_field, True)
+      if value_field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+        self.ConvertMessage(value[key], getattr(
+            message, field.name)[key_value])
+      else:
+        getattr(message, field.name)[key_value] = _ConvertScalarFieldValue(
+            value[key], value_field)
 
 
 def _ConvertScalarFieldValue(value, field, require_str=False):
@@ -550,15 +678,27 @@
     if field.type == descriptor.FieldDescriptor.TYPE_BYTES:
       return base64.b64decode(value)
     else:
+      # Checking for unpaired surrogates appears to be unreliable,
+      # depending on the specific Python version, so we check manually.
+      if _UNPAIRED_SURROGATE_PATTERN.search(value):
+        raise ParseError('Unpaired surrogate')
       return value
   elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_ENUM:
     # Convert an enum value.
     enum_value = field.enum_type.values_by_name.get(value, None)
     if enum_value is None:
-      raise ParseError(
-          'Enum value must be a string literal with double quotes. '
-          'Type "{0}" has no value named {1}.'.format(
-              field.enum_type.full_name, value))
+      try:
+        number = int(value)
+        enum_value = field.enum_type.values_by_number.get(number, None)
+      except ValueError:
+        raise ParseError('Invalid enum value {0} for enum type {1}.'.format(
+            value, field.enum_type.full_name))
+      if enum_value is None:
+        if field.file.syntax == 'proto3':
+          # Proto3 accepts unknown enums.
+          return number
+        raise ParseError('Invalid enum value {0} for enum type {1}.'.format(
+            value, field.enum_type.full_name))
     return enum_value.number
 
 
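The manual unpaired-surrogate check added above can be sketched in plain Python. The regex here is an assumed equivalent of json_format's `_UNPAIRED_SURROGATE_PATTERN` (defined elsewhere in the module), matching a high surrogate not followed by a low one, or a low surrogate not preceded by a high one:

```python
import re

# Assumed equivalent of _UNPAIRED_SURROGATE_PATTERN: a high surrogate not
# followed by a low surrogate, or a low surrogate not preceded by a high one.
UNPAIRED_SURROGATE = re.compile(
    '[\ud800-\udbff](?![\udc00-\udfff])|(?<![\ud800-\udbff])[\udc00-\udfff]')

def has_unpaired_surrogate(text):
    """Return True if text contains a lone UTF-16 surrogate code point."""
    return UNPAIRED_SURROGATE.search(text) is not None
```

A well-formed surrogate pair (high immediately followed by low) passes; either half alone is rejected, which is what the new `ParseError('Unpaired surrogate')` path guards against.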
@@ -574,7 +714,7 @@
   Raises:
     ParseError: If an integer couldn't be consumed.
   """
-  if isinstance(value, float):
+  if isinstance(value, float) and not value.is_integer():
     raise ParseError('Couldn\'t parse integer: {0}.'.format(value))
 
   if isinstance(value, six.text_type) and value.find(' ') != -1:
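The relaxed guard above now lets whole-number floats through while still rejecting fractional ones. A minimal sketch of that behavior (hypothetical helper name, not the module's actual converter):

```python
def parse_json_integer(value):
    # Whole-number floats such as 4.0 are accepted after the change;
    # fractional floats such as 4.5 still fail to parse.
    if isinstance(value, float) and not value.is_integer():
        raise ValueError("Couldn't parse integer: {0}.".format(value))
    # Strings with embedded spaces are rejected, as in the original check.
    if isinstance(value, str) and ' ' in value:
        raise ValueError("Couldn't parse integer: \"{0}\".".format(value))
    return int(value)
```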
@@ -628,18 +768,18 @@
   return value
 
 _WKTJSONMETHODS = {
-    'google.protobuf.Any': [_AnyMessageToJsonObject,
-                            _ConvertAnyMessage],
-    'google.protobuf.Duration': [_GenericMessageToJsonObject,
-                                 _ConvertGenericMessage],
-    'google.protobuf.FieldMask': [_GenericMessageToJsonObject,
-                                  _ConvertGenericMessage],
-    'google.protobuf.ListValue': [_ListValueMessageToJsonObject,
-                                  _ConvertListValueMessage],
-    'google.protobuf.Struct': [_StructMessageToJsonObject,
-                               _ConvertStructMessage],
-    'google.protobuf.Timestamp': [_GenericMessageToJsonObject,
-                                  _ConvertGenericMessage],
-    'google.protobuf.Value': [_ValueMessageToJsonObject,
-                              _ConvertValueMessage]
+    'google.protobuf.Any': ['_AnyMessageToJsonObject',
+                            '_ConvertAnyMessage'],
+    'google.protobuf.Duration': ['_GenericMessageToJsonObject',
+                                 '_ConvertGenericMessage'],
+    'google.protobuf.FieldMask': ['_GenericMessageToJsonObject',
+                                  '_ConvertGenericMessage'],
+    'google.protobuf.ListValue': ['_ListValueMessageToJsonObject',
+                                  '_ConvertListValueMessage'],
+    'google.protobuf.Struct': ['_StructMessageToJsonObject',
+                               '_ConvertStructMessage'],
+    'google.protobuf.Timestamp': ['_GenericMessageToJsonObject',
+                                  '_ConvertGenericMessage'],
+    'google.protobuf.Value': ['_ValueMessageToJsonObject',
+                              '_ConvertValueMessage']
 }
diff --git a/python/google/protobuf/message.py b/python/google/protobuf/message.py
index de2f569..eeb0d57 100755
--- a/python/google/protobuf/message.py
+++ b/python/google/protobuf/message.py
@@ -184,9 +184,15 @@
     self.Clear()
     self.MergeFromString(serialized)
 
-  def SerializeToString(self):
+  def SerializeToString(self, **kwargs):
     """Serializes the protocol message to a binary string.
 
+    Arguments:
+      **kwargs: Keyword arguments to the serialize method, accepts
+        the following keyword args:
+        deterministic: If true, requests deterministic serialization of the
+          protobuf, with predictable ordering of map keys.
+
     Returns:
       A binary string representation of the message if all of the required
       fields in the message are set (i.e. the message is initialized).
@@ -196,12 +202,18 @@
     """
     raise NotImplementedError
 
-  def SerializePartialToString(self):
+  def SerializePartialToString(self, **kwargs):
     """Serializes the protocol message to a binary string.
 
     This method is similar to SerializeToString but doesn't check if the
     message is initialized.
 
+    Arguments:
+      **kwargs: Keyword arguments to the serialize method, accepts
+        the following keyword args:
+        deterministic: If true, requests deterministic serialization of the
+          protobuf, with predictable ordering of map keys.
+
     Returns:
       A string representation of the partial message.
     """
@@ -225,10 +237,11 @@
   # """
   def ListFields(self):
     """Returns a list of (FieldDescriptor, value) tuples for all
-    fields in the message which are not empty.  A singular field is non-empty
-    if HasField() would return true, and a repeated field is non-empty if
-    it contains at least one element.  The fields are ordered by field
-    number"""
+    fields in the message which are not empty.  A message field is
+    non-empty if HasField() would return true. A singular primitive field
+    is non-empty if HasField() would return true in proto2 or it is non-zero
+    in proto3. A repeated field is non-empty if it contains at least one
+    element.  The fields are ordered by field number"""
     raise NotImplementedError
 
   def HasField(self, field_name):
@@ -255,6 +268,9 @@
   def ClearExtension(self, extension_handle):
     raise NotImplementedError
 
+  def DiscardUnknownFields(self):
+    raise NotImplementedError
+
   def ByteSize(self):
     """Returns the serialized size of this message.
     Recursively calls ByteSize() on all contained messages.
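The new `deterministic` keyword requests stable output: conceptually, map entries are emitted in sorted key order so repeated serializations of equal messages are byte-identical. A toy sketch of the idea (illustrative encoding only, not the protobuf wire format):

```python
def encode_map(entries, deterministic=False):
    # With deterministic=True, keys are sorted before encoding so the
    # output bytes do not depend on dict insertion order.
    items = sorted(entries.items()) if deterministic else entries.items()
    return b''.join(
        '{0}={1};'.format(key, value).encode('utf-8') for key, value in items)
```

Without the flag, two equal maps built in different insertion orders may serialize differently; with it, the bytes always agree.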
diff --git a/python/google/protobuf/message_factory.py b/python/google/protobuf/message_factory.py
index 1b059d1..e4fb065 100644
--- a/python/google/protobuf/message_factory.py
+++ b/python/google/protobuf/message_factory.py
@@ -66,7 +66,7 @@
     Returns:
       A class describing the passed in descriptor.
     """
-    if descriptor.full_name not in self._classes:
+    if descriptor not in self._classes:
       descriptor_name = descriptor.name
       if str is bytes:  # PY2
         descriptor_name = descriptor.name.encode('ascii', 'ignore')
@@ -75,16 +75,16 @@
           (message.Message,),
           {'DESCRIPTOR': descriptor, '__module__': None})
           # If module not set, it wrongly points to the reflection.py module.
-      self._classes[descriptor.full_name] = result_class
+      self._classes[descriptor] = result_class
       for field in descriptor.fields:
         if field.message_type:
           self.GetPrototype(field.message_type)
       for extension in result_class.DESCRIPTOR.extensions:
-        if extension.containing_type.full_name not in self._classes:
+        if extension.containing_type not in self._classes:
           self.GetPrototype(extension.containing_type)
-        extended_class = self._classes[extension.containing_type.full_name]
+        extended_class = self._classes[extension.containing_type]
         extended_class.RegisterExtension(extension)
-    return self._classes[descriptor.full_name]
+    return self._classes[descriptor]
 
   def GetMessages(self, files):
     """Gets all the messages from a specified file.
@@ -103,13 +103,8 @@
     result = {}
     for file_name in files:
       file_desc = self.pool.FindFileByName(file_name)
-      for name, msg in file_desc.message_types_by_name.items():
-        if file_desc.package:
-          full_name = '.'.join([file_desc.package, name])
-        else:
-          full_name = msg.name
-        result[full_name] = self.GetPrototype(
-            self.pool.FindMessageTypeByName(full_name))
+      for desc in file_desc.message_types_by_name.values():
+        result[desc.full_name] = self.GetPrototype(desc)
 
       # While the extension FieldDescriptors are created by the descriptor pool,
       # the python classes created in the factory need them to be registered
@@ -120,10 +115,10 @@
       # ignore the registration if the original was the same, or raise
       # an error if they were different.
 
-      for name, extension in file_desc.extensions_by_name.items():
-        if extension.containing_type.full_name not in self._classes:
+      for extension in file_desc.extensions_by_name.values():
+        if extension.containing_type not in self._classes:
           self.GetPrototype(extension.containing_type)
-        extended_class = self._classes[extension.containing_type.full_name]
+        extended_class = self._classes[extension.containing_type]
         extended_class.RegisterExtension(extension)
     return result
 
@@ -135,13 +130,22 @@
   """Builds a dictionary of all the messages available in a set of files.
 
   Args:
-    file_protos: A sequence of file protos to build messages out of.
+    file_protos: Iterable of FileDescriptorProto to build messages out of.
 
   Returns:
     A dictionary mapping proto names to the message classes. This will include
     any dependent messages as well as any messages defined in the same file as
     a specified message.
   """
-  for file_proto in file_protos:
+  # The cpp implementation of the protocol buffer library requires adding the
+  # messages in topological order of the dependency graph.
+  file_by_name = {file_proto.name: file_proto for file_proto in file_protos}
+  def _AddFile(file_proto):
+    for dependency in file_proto.dependency:
+      if dependency in file_by_name:
+        # Remove from elements to be visited, in order to cut cycles.
+        _AddFile(file_by_name.pop(dependency))
     _FACTORY.pool.Add(file_proto)
+  while file_by_name:
+    _AddFile(file_by_name.popitem()[1])
   return _FACTORY.GetMessages([file_proto.name for file_proto in file_protos])
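The dependency-ordered insertion above can be sketched standalone; plain dicts stand in for FileDescriptorProto objects here, and `add` for `_FACTORY.pool.Add`:

```python
def add_in_dependency_order(file_protos, add):
    # Visit each file only after its dependencies, popping visited entries
    # from the worklist so dependency cycles cannot recurse forever.
    file_by_name = {fp['name']: fp for fp in file_protos}

    def _add_file(fp):
        for dependency in fp['dependency']:
            if dependency in file_by_name:
                _add_file(file_by_name.pop(dependency))
        add(fp)

    while file_by_name:
        _add_file(file_by_name.popitem()[1])
```

Each file is added exactly once, dependencies first; popping a file before recursing is what cuts cycles, mirroring the comment in the patch.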
diff --git a/python/google/protobuf/pyext/__init__.py b/python/google/protobuf/pyext/__init__.py
index e69de29..5585614 100644
--- a/python/google/protobuf/pyext/__init__.py
+++ b/python/google/protobuf/pyext/__init__.py
@@ -0,0 +1,4 @@
+try:
+  __import__('pkg_resources').declare_namespace(__name__)
+except ImportError:
+  __path__ = __import__('pkgutil').extend_path(__path__, __name__)
diff --git a/python/google/protobuf/pyext/cpp_message.py b/python/google/protobuf/pyext/cpp_message.py
index b215211..fc8eb32 100644
--- a/python/google/protobuf/pyext/cpp_message.py
+++ b/python/google/protobuf/pyext/cpp_message.py
@@ -48,9 +48,9 @@
   classes at runtime, as in this example:
 
   mydescriptor = Descriptor(.....)
-  class MyProtoClass(Message):
-    __metaclass__ = GeneratedProtocolMessageType
-    DESCRIPTOR = mydescriptor
+  factory = symbol_database.Default()
+  factory.pool.AddDescriptor(mydescriptor)
+  MyProtoClass = factory.GetPrototype(mydescriptor)
   myproto_instance = MyProtoClass()
   myproto_instance.foo_field = 23
   ...
diff --git a/python/google/protobuf/pyext/descriptor.cc b/python/google/protobuf/pyext/descriptor.cc
index a875a7b..8af0cb1 100644
--- a/python/google/protobuf/pyext/descriptor.cc
+++ b/python/google/protobuf/pyext/descriptor.cc
@@ -32,6 +32,7 @@
 
 #include <Python.h>
 #include <frameobject.h>
+#include <google/protobuf/stubs/hash.h>
 #include <string>
 
 #include <google/protobuf/io/coded_stream.h>
@@ -41,6 +42,7 @@
 #include <google/protobuf/pyext/descriptor_containers.h>
 #include <google/protobuf/pyext/descriptor_pool.h>
 #include <google/protobuf/pyext/message.h>
+#include <google/protobuf/pyext/message_factory.h>
 #include <google/protobuf/pyext/scoped_pyobject_ptr.h>
 
 #if PY_MAJOR_VERSION >= 3
@@ -92,11 +94,10 @@
 // TODO(amauryfa): Change the proto2 compiler to remove the assignments, and
 // remove this hack.
 bool _CalledFromGeneratedFile(int stacklevel) {
-  PyThreadState *state = PyThreadState_GET();
-  if (state == NULL) {
-    return false;
-  }
-  PyFrameObject* frame = state->frame;
+#ifndef PYPY_VERSION
+  // This check is not critical and is somewhat difficult to implement correctly
+  // in PyPy.
+  PyFrameObject* frame = PyEval_GetFrame();
   if (frame == NULL) {
     return false;
   }
@@ -106,10 +107,6 @@
       return false;
     }
   }
-  if (frame->f_globals != frame->f_locals) {
-    // Not at global module scope
-    return false;
-  }
 
   if (frame->f_code->co_filename == NULL) {
     return false;
@@ -122,6 +119,10 @@
     PyErr_Clear();
     return false;
   }
+  if ((filename_size < 3) || (strcmp(&filename[filename_size - 3], ".py") != 0)) {
+    // Cython frames do not have a .py file name and are not at global module scope.
+    return true;
+  }
   if (filename_size < 7) {
     // filename is too short.
     return false;
@@ -130,6 +131,12 @@
     // Filename is not ending with _pb2.
     return false;
   }
+
+  if (frame->f_globals != frame->f_locals) {
+    // Not at global module scope
+    return false;
+  }
+#endif
   return true;
 }
 
@@ -172,50 +179,56 @@
 const FileDescriptor* GetFileDescriptor(const OneofDescriptor* descriptor) {
   return descriptor->containing_type()->file();
 }
+template<>
+const FileDescriptor* GetFileDescriptor(const MethodDescriptor* descriptor) {
+  return descriptor->service()->file();
+}
 
 // Converts options into a Python protobuf, and cache the result.
 //
 // This is a bit tricky because options can contain extension fields defined in
 // the same proto file. In this case the options parsed from the serialized_pb
-// have unkown fields, and we need to parse them again.
+// have unknown fields, and we need to parse them again.
 //
 // Always returns a new reference.
 template<class DescriptorClass>
 static PyObject* GetOrBuildOptions(const DescriptorClass *descriptor) {
-  // Options (and their extensions) are completely resolved in the proto file
-  // containing the descriptor.
-  PyDescriptorPool* pool = GetDescriptorPool_FromPool(
-      GetFileDescriptor(descriptor)->pool());
-
-  hash_map<const void*, PyObject*>* descriptor_options =
-      pool->descriptor_options;
+  // Options are cached in the pool that owns the descriptor.
   // First search in the cache.
+  PyDescriptorPool* caching_pool = GetDescriptorPool_FromPool(
+      GetFileDescriptor(descriptor)->pool());
+  hash_map<const void*, PyObject*>* descriptor_options =
+      caching_pool->descriptor_options;
   if (descriptor_options->find(descriptor) != descriptor_options->end()) {
     PyObject *value = (*descriptor_options)[descriptor];
     Py_INCREF(value);
     return value;
   }
 
+  // Similar to the C++ implementation, we return an Options object from the
+  // default (generated) factory, so that client code knows that it can use
+  // extensions from generated files:
+  //    d.GetOptions().Extensions[some_pb2.extension]
+  //
+  // The consequence is that extensions not defined in the default pool won't
+  // be available.  If needed, we could add an optional 'message_factory'
+  // parameter to the GetOptions() function.
+  PyMessageFactory* message_factory =
+      GetDefaultDescriptorPool()->py_message_factory;
+
   // Build the Options object: get its Python class, and make a copy of the C++
   // read-only instance.
   const Message& options(descriptor->options());
   const Descriptor *message_type = options.GetDescriptor();
-  PyObject* message_class(cdescriptor_pool::GetMessageClass(
-      pool, message_type));
-  if (message_class == NULL) {
-    // The Options message was not found in the current DescriptorPool.
-    // In this case, there cannot be extensions to these options, and we can
-    // try to use the basic pool instead.
-    PyErr_Clear();
-    message_class = cdescriptor_pool::GetMessageClass(
-      GetDefaultDescriptorPool(), message_type);
-  }
+  CMessageClass* message_class = message_factory::GetOrCreateMessageClass(
+      message_factory, message_type);
   if (message_class == NULL) {
     PyErr_Format(PyExc_TypeError, "Could not retrieve class for Options: %s",
                  message_type->full_name().c_str());
     return NULL;
   }
-  ScopedPyObjectPtr value(PyEval_CallObject(message_class, NULL));
+  ScopedPyObjectPtr value(
+      PyEval_CallObject(message_class->AsPyObject(), NULL));
   if (value == NULL) {
     return NULL;
   }
@@ -237,7 +250,8 @@
     options.SerializeToString(&serialized);
     io::CodedInputStream input(
         reinterpret_cast<const uint8*>(serialized.c_str()), serialized.size());
-    input.SetExtensionRegistry(pool->pool, pool->message_factory);
+    input.SetExtensionRegistry(message_factory->pool->pool,
+                               message_factory->message_factory);
     bool success = cmsg->message->MergePartialFromCodedStream(&input);
     if (!success) {
       PyErr_Format(PyExc_ValueError, "Error parsing Options message");
@@ -247,7 +261,7 @@
 
   // Cache the result.
   Py_INCREF(value.get());
-  (*pool->descriptor_options)[descriptor] = value.get();
+  (*descriptor_options)[descriptor] = value.get();
 
   return value.release();
 }
@@ -433,11 +447,12 @@
   // which contains this descriptor.
   // This might not be the one you expect! For example the returned object does
   // not know about extensions defined in a custom pool.
-  PyObject* concrete_class(cdescriptor_pool::GetMessageClass(
-      GetDescriptorPool_FromPool(_GetDescriptor(self)->file()->pool()),
+  CMessageClass* concrete_class(message_factory::GetMessageClass(
+      GetDescriptorPool_FromPool(
+          _GetDescriptor(self)->file()->pool())->py_message_factory,
       _GetDescriptor(self)));
   Py_XINCREF(concrete_class);
-  return concrete_class;
+  return concrete_class->AsPyObject();
 }
 
 static PyObject* GetFieldsByName(PyBaseDescriptor* self, void *closure) {
@@ -552,6 +567,11 @@
   return CheckCalledFromGeneratedFile("_options");
 }
 
+static int SetSerializedOptions(PyBaseDescriptor *self, PyObject *value,
+                                void *closure) {
+  return CheckCalledFromGeneratedFile("_serialized_options");
+}
+
 static PyObject* CopyToProto(PyBaseDescriptor *self, PyObject *target) {
   return CopyToPythonProto<DescriptorProto>(_GetDescriptor(self), target);
 }
@@ -611,6 +631,8 @@
   { "is_extendable", (getter)IsExtendable, (setter)NULL},
   { "has_options", (getter)GetHasOptions, (setter)SetHasOptions, "Has Options"},
   { "_options", (getter)NULL, (setter)SetOptions, "Options"},
+  { "_serialized_options", (getter)NULL, (setter)SetSerializedOptions,
+    "Serialized Options"},
   { "syntax", (getter)GetSyntax, (setter)NULL, "Syntax"},
   {NULL}
 };
@@ -693,6 +715,14 @@
   return PyString_FromCppString(_GetDescriptor(self)->camelcase_name());
 }
 
+static PyObject* GetJsonName(PyBaseDescriptor* self, void *closure) {
+  return PyString_FromCppString(_GetDescriptor(self)->json_name());
+}
+
+static PyObject* GetFile(PyBaseDescriptor *self, void *closure) {
+  return PyFileDescriptor_FromDescriptor(_GetDescriptor(self)->file());
+}
+
 static PyObject* GetType(PyBaseDescriptor *self, void *closure) {
   return PyInt_FromLong(_GetDescriptor(self)->type());
 }
@@ -765,7 +795,7 @@
       break;
     }
     case FieldDescriptor::CPPTYPE_STRING: {
-      string value = _GetDescriptor(self)->default_value_string();
+      const string& value = _GetDescriptor(self)->default_value_string();
       result = ToStringObject(_GetDescriptor(self), value);
       break;
     }
@@ -877,11 +907,17 @@
   return CheckCalledFromGeneratedFile("_options");
 }
 
+static int SetSerializedOptions(PyBaseDescriptor *self, PyObject *value,
+                                void *closure) {
+  return CheckCalledFromGeneratedFile("_serialized_options");
+}
 
 static PyGetSetDef Getters[] = {
   { "full_name", (getter)GetFullName, NULL, "Full name"},
   { "name", (getter)GetName, NULL, "Unqualified name"},
   { "camelcase_name", (getter)GetCamelcaseName, NULL, "Camelcase name"},
+  { "json_name", (getter)GetJsonName, NULL, "Json name"},
+  { "file", (getter)GetFile, NULL, "File Descriptor"},
   { "type", (getter)GetType, NULL, "C++ Type"},
   { "cpp_type", (getter)GetCppType, NULL, "C++ Type"},
   { "label", (getter)GetLabel, NULL, "Label"},
@@ -904,6 +940,8 @@
     "Containing oneof"},
   { "has_options", (getter)GetHasOptions, (setter)SetHasOptions, "Has Options"},
   { "_options", (getter)NULL, (setter)SetOptions, "Options"},
+  { "_serialized_options", (getter)NULL, (setter)SetSerializedOptions,
+    "Serialized Options"},
   {NULL}
 };
 
@@ -1033,6 +1071,11 @@
   return CheckCalledFromGeneratedFile("_options");
 }
 
+static int SetSerializedOptions(PyBaseDescriptor *self, PyObject *value,
+                                void *closure) {
+  return CheckCalledFromGeneratedFile("_serialized_options");
+}
+
 static PyObject* CopyToProto(PyBaseDescriptor *self, PyObject *target) {
   return CopyToPythonProto<EnumDescriptorProto>(_GetDescriptor(self), target);
 }
@@ -1057,6 +1100,8 @@
     "Containing type"},
   { "has_options", (getter)GetHasOptions, (setter)SetHasOptions, "Has Options"},
   { "_options", (getter)NULL, (setter)SetOptions, "Options"},
+  { "_serialized_options", (getter)NULL, (setter)SetSerializedOptions,
+    "Serialized Options"},
   {NULL}
 };
 
@@ -1090,7 +1135,7 @@
   0,                                    // tp_weaklistoffset
   0,                                    // tp_iter
   0,                                    // tp_iternext
-  enum_descriptor::Methods,             // tp_getset
+  enum_descriptor::Methods,             // tp_methods
   0,                                    // tp_members
   enum_descriptor::Getters,             // tp_getset
   &descriptor::PyBaseDescriptor_Type,   // tp_base
@@ -1157,6 +1202,10 @@
   return CheckCalledFromGeneratedFile("_options");
 }
 
+static int SetSerializedOptions(PyBaseDescriptor *self, PyObject *value,
+                                void *closure) {
+  return CheckCalledFromGeneratedFile("_serialized_options");
+}
 
 static PyGetSetDef Getters[] = {
   { "name", (getter)GetName, NULL, "name"},
@@ -1166,6 +1215,8 @@
 
   { "has_options", (getter)GetHasOptions, (setter)SetHasOptions, "Has Options"},
   { "_options", (getter)NULL, (setter)SetOptions, "Options"},
+  { "_serialized_options", (getter)NULL, (setter)SetSerializedOptions,
+    "Serialized Options"},
   {NULL}
 };
 
@@ -1274,6 +1325,10 @@
   return NewFileExtensionsByName(_GetDescriptor(self));
 }
 
+static PyObject* GetServicesByName(PyFileDescriptor* self, void *closure) {
+  return NewFileServicesByName(_GetDescriptor(self));
+}
+
 static PyObject* GetDependencies(PyFileDescriptor* self, void *closure) {
   return NewFileDependencies(_GetDescriptor(self));
 }
@@ -1304,6 +1359,11 @@
   return CheckCalledFromGeneratedFile("_options");
 }
 
+static int SetSerializedOptions(PyFileDescriptor *self, PyObject *value,
+                                void *closure) {
+  return CheckCalledFromGeneratedFile("_serialized_options");
+}
+
 static PyObject* GetSyntax(PyFileDescriptor *self, void *closure) {
   return PyString_InternFromString(
       FileDescriptor::SyntaxName(_GetDescriptor(self)->syntax()));
@@ -1323,11 +1383,14 @@
   { "enum_types_by_name", (getter)GetEnumTypesByName, NULL, "Enums by name"},
   { "extensions_by_name", (getter)GetExtensionsByName, NULL,
     "Extensions by name"},
+  { "services_by_name", (getter)GetServicesByName, NULL, "Services by name"},
   { "dependencies", (getter)GetDependencies, NULL, "Dependencies"},
   { "public_dependencies", (getter)GetPublicDependencies, NULL, "Dependencies"},
 
   { "has_options", (getter)GetHasOptions, (setter)SetHasOptions, "Has Options"},
   { "_options", (getter)NULL, (setter)SetOptions, "Options"},
+  { "_serialized_options", (getter)NULL, (setter)SetSerializedOptions,
+    "Serialized Options"},
   { "syntax", (getter)GetSyntax, (setter)NULL, "Syntax"},
   {NULL}
 };
@@ -1451,16 +1514,52 @@
   }
 }
 
+static PyObject* GetHasOptions(PyBaseDescriptor *self, void *closure) {
+  const OneofOptions& options(_GetDescriptor(self)->options());
+  if (&options != &OneofOptions::default_instance()) {
+    Py_RETURN_TRUE;
+  } else {
+    Py_RETURN_FALSE;
+  }
+}
+static int SetHasOptions(PyBaseDescriptor *self, PyObject *value,
+                         void *closure) {
+  return CheckCalledFromGeneratedFile("has_options");
+}
+
+static PyObject* GetOptions(PyBaseDescriptor *self) {
+  return GetOrBuildOptions(_GetDescriptor(self));
+}
+
+static int SetOptions(PyBaseDescriptor *self, PyObject *value,
+                      void *closure) {
+  return CheckCalledFromGeneratedFile("_options");
+}
+
+static int SetSerializedOptions(PyBaseDescriptor *self, PyObject *value,
+                                void *closure) {
+  return CheckCalledFromGeneratedFile("_serialized_options");
+}
+
 static PyGetSetDef Getters[] = {
   { "name", (getter)GetName, NULL, "Name"},
   { "full_name", (getter)GetFullName, NULL, "Full name"},
   { "index", (getter)GetIndex, NULL, "Index"},
 
   { "containing_type", (getter)GetContainingType, NULL, "Containing type"},
+  { "has_options", (getter)GetHasOptions, (setter)SetHasOptions, "Has Options"},
+  { "_options", (getter)NULL, (setter)SetOptions, "Options"},
+  { "_serialized_options", (getter)NULL, (setter)SetSerializedOptions,
+    "Serialized Options"},
   { "fields", (getter)GetFields, NULL, "Fields"},
   {NULL}
 };
 
+static PyMethodDef Methods[] = {
+  { "GetOptions", (PyCFunction)GetOptions, METH_NOARGS },
+  {NULL}
+};
+
 }  // namespace oneof_descriptor
 
 PyTypeObject PyOneofDescriptor_Type = {
@@ -1491,7 +1590,7 @@
   0,                                    // tp_weaklistoffset
   0,                                    // tp_iter
   0,                                    // tp_iternext
-  0,                                    // tp_methods
+  oneof_descriptor::Methods,            // tp_methods
   0,                                    // tp_members
   oneof_descriptor::Getters,            // tp_getset
   &descriptor::PyBaseDescriptor_Type,   // tp_base
@@ -1503,6 +1602,245 @@
       &PyOneofDescriptor_Type, oneof_descriptor, NULL);
 }
 
+namespace service_descriptor {
+
+// Unchecked accessor to the C++ pointer.
+static const ServiceDescriptor* _GetDescriptor(
+    PyBaseDescriptor *self) {
+  return reinterpret_cast<const ServiceDescriptor*>(self->descriptor);
+}
+
+static PyObject* GetName(PyBaseDescriptor* self, void *closure) {
+  return PyString_FromCppString(_GetDescriptor(self)->name());
+}
+
+static PyObject* GetFullName(PyBaseDescriptor* self, void *closure) {
+  return PyString_FromCppString(_GetDescriptor(self)->full_name());
+}
+
+static PyObject* GetFile(PyBaseDescriptor *self, void *closure) {
+  return PyFileDescriptor_FromDescriptor(_GetDescriptor(self)->file());
+}
+
+static PyObject* GetIndex(PyBaseDescriptor *self, void *closure) {
+  return PyInt_FromLong(_GetDescriptor(self)->index());
+}
+
+static PyObject* GetMethods(PyBaseDescriptor* self, void *closure) {
+  return NewServiceMethodsSeq(_GetDescriptor(self));
+}
+
+static PyObject* GetMethodsByName(PyBaseDescriptor* self, void *closure) {
+  return NewServiceMethodsByName(_GetDescriptor(self));
+}
+
+static PyObject* FindMethodByName(PyBaseDescriptor *self, PyObject* arg) {
+  Py_ssize_t name_size;
+  char* name;
+  if (PyString_AsStringAndSize(arg, &name, &name_size) < 0) {
+    return NULL;
+  }
+
+  const MethodDescriptor* method_descriptor =
+      _GetDescriptor(self)->FindMethodByName(string(name, name_size));
+  if (method_descriptor == NULL) {
+    PyErr_Format(PyExc_KeyError, "Couldn't find method %.200s", name);
+    return NULL;
+  }
+
+  return PyMethodDescriptor_FromDescriptor(method_descriptor);
+}
+
+static PyObject* GetOptions(PyBaseDescriptor *self) {
+  return GetOrBuildOptions(_GetDescriptor(self));
+}
+
+static PyObject* CopyToProto(PyBaseDescriptor *self, PyObject *target) {
+  return CopyToPythonProto<ServiceDescriptorProto>(_GetDescriptor(self),
+                                                   target);
+}
+
+static PyGetSetDef Getters[] = {
+  { "name", (getter)GetName, NULL, "Name", NULL},
+  { "full_name", (getter)GetFullName, NULL, "Full name", NULL},
+  { "file", (getter)GetFile, NULL, "File descriptor"},
+  { "index", (getter)GetIndex, NULL, "Index", NULL},
+
+  { "methods", (getter)GetMethods, NULL, "Methods", NULL},
+  { "methods_by_name", (getter)GetMethodsByName, NULL, "Methods by name", NULL},
+  {NULL}
+};
+
+static PyMethodDef Methods[] = {
+  { "GetOptions", (PyCFunction)GetOptions, METH_NOARGS },
+  { "CopyToProto", (PyCFunction)CopyToProto, METH_O, },
+  { "FindMethodByName", (PyCFunction)FindMethodByName, METH_O },
+  {NULL}
+};
+
+}  // namespace service_descriptor
+
+PyTypeObject PyServiceDescriptor_Type = {
+  PyVarObject_HEAD_INIT(&PyType_Type, 0)
+  FULL_MODULE_NAME ".ServiceDescriptor",  // tp_name
+  sizeof(PyBaseDescriptor),             // tp_basicsize
+  0,                                    // tp_itemsize
+  0,                                    // tp_dealloc
+  0,                                    // tp_print
+  0,                                    // tp_getattr
+  0,                                    // tp_setattr
+  0,                                    // tp_compare
+  0,                                    // tp_repr
+  0,                                    // tp_as_number
+  0,                                    // tp_as_sequence
+  0,                                    // tp_as_mapping
+  0,                                    // tp_hash
+  0,                                    // tp_call
+  0,                                    // tp_str
+  0,                                    // tp_getattro
+  0,                                    // tp_setattro
+  0,                                    // tp_as_buffer
+  Py_TPFLAGS_DEFAULT,                   // tp_flags
+  "A Service Descriptor",               // tp_doc
+  0,                                    // tp_traverse
+  0,                                    // tp_clear
+  0,                                    // tp_richcompare
+  0,                                    // tp_weaklistoffset
+  0,                                    // tp_iter
+  0,                                    // tp_iternext
+  service_descriptor::Methods,          // tp_methods
+  0,                                    // tp_members
+  service_descriptor::Getters,          // tp_getset
+  &descriptor::PyBaseDescriptor_Type,   // tp_base
+};
+
+PyObject* PyServiceDescriptor_FromDescriptor(
+    const ServiceDescriptor* service_descriptor) {
+  return descriptor::NewInternedDescriptor(
+      &PyServiceDescriptor_Type, service_descriptor, NULL);
+}
+
+const ServiceDescriptor* PyServiceDescriptor_AsDescriptor(PyObject* obj) {
+  if (!PyObject_TypeCheck(obj, &PyServiceDescriptor_Type)) {
+    PyErr_SetString(PyExc_TypeError, "Not a ServiceDescriptor");
+    return NULL;
+  }
+  return reinterpret_cast<const ServiceDescriptor*>(
+      reinterpret_cast<PyBaseDescriptor*>(obj)->descriptor);
+}
+
+namespace method_descriptor {
+
+// Unchecked accessor to the C++ pointer.
+static const MethodDescriptor* _GetDescriptor(
+    PyBaseDescriptor *self) {
+  return reinterpret_cast<const MethodDescriptor*>(self->descriptor);
+}
+
+static PyObject* GetName(PyBaseDescriptor* self, void *closure) {
+  return PyString_FromCppString(_GetDescriptor(self)->name());
+}
+
+static PyObject* GetFullName(PyBaseDescriptor* self, void *closure) {
+  return PyString_FromCppString(_GetDescriptor(self)->full_name());
+}
+
+static PyObject* GetIndex(PyBaseDescriptor *self, void *closure) {
+  return PyInt_FromLong(_GetDescriptor(self)->index());
+}
+
+static PyObject* GetContainingService(PyBaseDescriptor *self, void *closure) {
+  const ServiceDescriptor* containing_service =
+      _GetDescriptor(self)->service();
+  return PyServiceDescriptor_FromDescriptor(containing_service);
+}
+
+static PyObject* GetInputType(PyBaseDescriptor *self, void *closure) {
+  const Descriptor* input_type = _GetDescriptor(self)->input_type();
+  return PyMessageDescriptor_FromDescriptor(input_type);
+}
+
+static PyObject* GetOutputType(PyBaseDescriptor *self, void *closure) {
+  const Descriptor* output_type = _GetDescriptor(self)->output_type();
+  return PyMessageDescriptor_FromDescriptor(output_type);
+}
+
+static PyObject* GetOptions(PyBaseDescriptor *self) {
+  return GetOrBuildOptions(_GetDescriptor(self));
+}
+
+static PyObject* CopyToProto(PyBaseDescriptor *self, PyObject *target) {
+  return CopyToPythonProto<MethodDescriptorProto>(_GetDescriptor(self), target);
+}
+
+static PyGetSetDef Getters[] = {
+  { "name", (getter)GetName, NULL, "Name", NULL},
+  { "full_name", (getter)GetFullName, NULL, "Full name", NULL},
+  { "index", (getter)GetIndex, NULL, "Index", NULL},
+  { "containing_service", (getter)GetContainingService, NULL,
+    "Containing service", NULL},
+  { "input_type", (getter)GetInputType, NULL, "Input type", NULL},
+  { "output_type", (getter)GetOutputType, NULL, "Output type", NULL},
+  {NULL}
+};
+
+static PyMethodDef Methods[] = {
+  { "GetOptions", (PyCFunction)GetOptions, METH_NOARGS, },
+  { "CopyToProto", (PyCFunction)CopyToProto, METH_O, },
+  {NULL}
+};
+
+}  // namespace method_descriptor
+
+PyTypeObject PyMethodDescriptor_Type = {
+  PyVarObject_HEAD_INIT(&PyType_Type, 0)
+  FULL_MODULE_NAME ".MethodDescriptor",  // tp_name
+  sizeof(PyBaseDescriptor),             // tp_basicsize
+  0,                                    // tp_itemsize
+  0,                                    // tp_dealloc
+  0,                                    // tp_print
+  0,                                    // tp_getattr
+  0,                                    // tp_setattr
+  0,                                    // tp_compare
+  0,                                    // tp_repr
+  0,                                    // tp_as_number
+  0,                                    // tp_as_sequence
+  0,                                    // tp_as_mapping
+  0,                                    // tp_hash
+  0,                                    // tp_call
+  0,                                    // tp_str
+  0,                                    // tp_getattro
+  0,                                    // tp_setattro
+  0,                                    // tp_as_buffer
+  Py_TPFLAGS_DEFAULT,                   // tp_flags
+  "A Method Descriptor",                // tp_doc
+  0,                                    // tp_traverse
+  0,                                    // tp_clear
+  0,                                    // tp_richcompare
+  0,                                    // tp_weaklistoffset
+  0,                                    // tp_iter
+  0,                                    // tp_iternext
+  method_descriptor::Methods,           // tp_methods
+  0,                                    // tp_members
+  method_descriptor::Getters,           // tp_getset
+  &descriptor::PyBaseDescriptor_Type,   // tp_base
+};
+
+PyObject* PyMethodDescriptor_FromDescriptor(
+    const MethodDescriptor* method_descriptor) {
+  return descriptor::NewInternedDescriptor(
+      &PyMethodDescriptor_Type, method_descriptor, NULL);
+}
+
+const MethodDescriptor* PyMethodDescriptor_AsDescriptor(PyObject* obj) {
+  if (!PyObject_TypeCheck(obj, &PyMethodDescriptor_Type)) {
+    PyErr_SetString(PyExc_TypeError, "Not a MethodDescriptor");
+    return NULL;
+  }
+  return reinterpret_cast<const MethodDescriptor*>(
+      reinterpret_cast<PyBaseDescriptor*>(obj)->descriptor);
+}
+
 // Adds the enum's values to a type dictionary.
 static bool AddEnumValues(PyTypeObject *type,
                           const EnumDescriptor* enum_descriptor) {
@@ -1572,6 +1910,12 @@
   if (PyType_Ready(&PyOneofDescriptor_Type) < 0)
     return false;
 
+  if (PyType_Ready(&PyServiceDescriptor_Type) < 0)
+    return false;
+
+  if (PyType_Ready(&PyMethodDescriptor_Type) < 0)
+    return false;
+
   if (!InitDescriptorMappingTypes())
     return false;
 
diff --git a/python/google/protobuf/pyext/descriptor.h b/python/google/protobuf/pyext/descriptor.h
index eb99df1..f081df8 100644
--- a/python/google/protobuf/pyext/descriptor.h
+++ b/python/google/protobuf/pyext/descriptor.h
@@ -47,6 +47,8 @@
 extern PyTypeObject PyEnumValueDescriptor_Type;
 extern PyTypeObject PyFileDescriptor_Type;
 extern PyTypeObject PyOneofDescriptor_Type;
+extern PyTypeObject PyServiceDescriptor_Type;
+extern PyTypeObject PyMethodDescriptor_Type;
 
 // Wraps a Descriptor in a Python object.
 // The C++ pointer is usually borrowed from the global DescriptorPool.
@@ -60,6 +62,10 @@
 PyObject* PyOneofDescriptor_FromDescriptor(const OneofDescriptor* descriptor);
 PyObject* PyFileDescriptor_FromDescriptor(
     const FileDescriptor* file_descriptor);
+PyObject* PyServiceDescriptor_FromDescriptor(
+    const ServiceDescriptor* descriptor);
+PyObject* PyMethodDescriptor_FromDescriptor(
+    const MethodDescriptor* descriptor);
 
 // Alternate constructor of PyFileDescriptor, used when we already have a
 // serialized FileDescriptorProto that can be cached.
@@ -74,6 +80,8 @@
 const FieldDescriptor* PyFieldDescriptor_AsDescriptor(PyObject* obj);
 const EnumDescriptor* PyEnumDescriptor_AsDescriptor(PyObject* obj);
 const FileDescriptor* PyFileDescriptor_AsDescriptor(PyObject* obj);
+const ServiceDescriptor* PyServiceDescriptor_AsDescriptor(PyObject* obj);
+const MethodDescriptor* PyMethodDescriptor_AsDescriptor(PyObject* obj);
 
 // Returns the raw C++ pointer.
 const void* PyDescriptor_AsVoidPtr(PyObject* obj);
diff --git a/python/google/protobuf/pyext/descriptor_containers.cc b/python/google/protobuf/pyext/descriptor_containers.cc
index e505d81..bc007f7 100644
--- a/python/google/protobuf/pyext/descriptor_containers.cc
+++ b/python/google/protobuf/pyext/descriptor_containers.cc
@@ -608,6 +608,24 @@
   return _NewObj_ByIndex(self, index);
 }
 
+static PyObject *
+SeqSubscript(PyContainer* self, PyObject* item) {
+  if (PyIndex_Check(item)) {
+    Py_ssize_t index = PyNumber_AsSsize_t(item, PyExc_IndexError);
+    if (index == -1 && PyErr_Occurred())
+      return NULL;
+    return GetItem(self, index);
+  }
+  // Materialize the list and delegate the operation to it.
+  ScopedPyObjectPtr list(PyObject_CallFunctionObjArgs(
+      reinterpret_cast<PyObject*>(&PyList_Type), self, NULL));
+  if (list == NULL) {
+    return NULL;
+  }
+  return Py_TYPE(list.get())->tp_as_mapping->mp_subscript(list.get(), item);
+}
+
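`SeqSubscript` handles plain integer indices directly; for anything else (notably slices) it materializes the container into a real `list` and forwards the subscript to it, so slicing comes for free. The same delegation trick in pure Python (a hypothetical stand-in class, not the real extension type):

```python
class DescriptorSeq:
    """Read-only sequence that supports slicing by delegating to list()."""

    def __init__(self, items):
        self._items = tuple(items)

    def __len__(self):
        return len(self._items)

    def __getitem__(self, key):
        if isinstance(key, int):  # fast path: direct item lookup
            return self._items[key]
        # Materialize the sequence and let list handle slices etc.
        return list(self)[key]

seq = DescriptorSeq("abcde")
assert seq[1] == "b"
assert seq[1:3] == ["b", "c"]  # slice handled by the materialized list
```

The cost is an O(n) copy per slice, which is acceptable here because descriptor containers are small and slicing them is rare.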
 // Returns the position of the item in the sequence, or -1 if not found.
 // This function never fails.
 int Find(PyContainer* self, PyObject* item) {
@@ -703,14 +721,20 @@
 };
 
 static PySequenceMethods SeqSequenceMethods = {
-    (lenfunc)Length,          // sq_length
-    0,                        // sq_concat
-    0,                        // sq_repeat
-    (ssizeargfunc)GetItem,    // sq_item
-    0,                        // sq_slice
-    0,                        // sq_ass_item
-    0,                        // sq_ass_slice
-    (objobjproc)SeqContains,  // sq_contains
+  (lenfunc)Length,          // sq_length
+  0,                        // sq_concat
+  0,                        // sq_repeat
+  (ssizeargfunc)GetItem,    // sq_item
+  0,                        // sq_slice
+  0,                        // sq_ass_item
+  0,                        // sq_ass_slice
+  (objobjproc)SeqContains,  // sq_contains
+};
+
+static PyMappingMethods SeqMappingMethods = {
+  (lenfunc)Length,           // mp_length
+  (binaryfunc)SeqSubscript,  // mp_subscript
+  0,                         // mp_ass_subscript
 };
 
 PyTypeObject DescriptorSequence_Type = {
@@ -726,7 +750,7 @@
   (reprfunc)ContainerRepr,              // tp_repr
   0,                                    // tp_as_number
   &SeqSequenceMethods,                  // tp_as_sequence
-  0,                                    // tp_as_mapping
+  &SeqMappingMethods,                   // tp_as_mapping
   0,                                    // tp_hash
   0,                                    // tp_call
   0,                                    // tp_str
@@ -933,55 +957,55 @@
   return GetDescriptor(self)->field_count();
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindFieldByName(name);
 }
 
-static ItemDescriptor GetByCamelcaseName(PyContainer* self,
+static const void* GetByCamelcaseName(PyContainer* self,
                                          const string& name) {
   return GetDescriptor(self)->FindFieldByCamelcaseName(name);
 }
 
-static ItemDescriptor GetByNumber(PyContainer* self, int number) {
+static const void* GetByNumber(PyContainer* self, int number) {
   return GetDescriptor(self)->FindFieldByNumber(number);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->field(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyFieldDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyFieldDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static const string& GetItemCamelcaseName(ItemDescriptor item) {
-  return item->camelcase_name();
+static const string& GetItemCamelcaseName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->camelcase_name();
 }
 
-static int GetItemNumber(ItemDescriptor item) {
-  return item->number();
+static int GetItemNumber(const void* item) {
+  return static_cast<ItemDescriptor>(item)->number();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "MessageFields",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)GetByCamelcaseName,
-  (GetByNumberMethod)GetByNumber,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)GetItemCamelcaseName,
-  (GetItemNumberMethod)GetItemNumber,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  GetByCamelcaseName,
+  GetByNumber,
+  NewObjectFromItem,
+  GetItemName,
+  GetItemCamelcaseName,
+  GetItemNumber,
+  GetItemIndex,
 };
 
 }  // namespace fields
@@ -1011,38 +1035,38 @@
   return GetDescriptor(self)->nested_type_count();
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindNestedTypeByName(name);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->nested_type(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyMessageDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyMessageDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "MessageNestedTypes",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
 };
 
 }  // namespace nested_types
@@ -1063,38 +1087,38 @@
   return GetDescriptor(self)->enum_type_count();
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindEnumTypeByName(name);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->enum_type(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyEnumDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyEnumDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "MessageNestedEnums",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
 };
 
 }  // namespace enums
@@ -1126,11 +1150,11 @@
   return count;
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindEnumValueByName(name);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   // This is not optimal, but the number of enum *types* in a given message
   // is small.  This function is only used when iterating over the mapping.
   const EnumDescriptor* enum_type = NULL;
@@ -1149,26 +1173,27 @@
   return enum_type->value(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyEnumValueDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyEnumValueDescriptor_FromDescriptor(
+      static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "MessageEnumValues",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)NULL,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  NULL,
 };
 
 }  // namespace enumvalues
@@ -1185,38 +1210,38 @@
   return GetDescriptor(self)->extension_count();
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindExtensionByName(name);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->extension(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyFieldDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyFieldDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "MessageExtensions",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
 };
 
 }  // namespace extensions
@@ -1237,38 +1262,38 @@
   return GetDescriptor(self)->oneof_decl_count();
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindOneofByName(name);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->oneof_decl(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyOneofDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyOneofDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "MessageOneofs",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
 };
 
 }  // namespace oneofs
@@ -1299,46 +1324,47 @@
   return GetDescriptor(self)->value_count();
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->value(index);
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindValueByName(name);
 }
 
-static ItemDescriptor GetByNumber(PyContainer* self, int number) {
+static const void* GetByNumber(PyContainer* self, int number) {
   return GetDescriptor(self)->FindValueByNumber(number);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyEnumValueDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyEnumValueDescriptor_FromDescriptor(
+      static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static int GetItemNumber(ItemDescriptor item) {
-  return item->number();
+static int GetItemNumber(const void* item) {
+  return static_cast<ItemDescriptor>(item)->number();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "EnumValues",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)GetByNumber,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)GetItemNumber,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  GetByNumber,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  GetItemNumber,
+  GetItemIndex,
 };
 
 }  // namespace enumvalues
@@ -1373,30 +1399,30 @@
   return GetDescriptor(self)->field_count();
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->field(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyFieldDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyFieldDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index_in_oneof();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index_in_oneof();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "OneofFields",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)NULL,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)NULL,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  NULL,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  NULL,
+  NULL,
+  NULL,
+  GetItemIndex,
 };
 
 }  // namespace fields
@@ -1407,6 +1433,68 @@
 
 }  // namespace oneof_descriptor
 
+namespace service_descriptor {
+
+typedef const ServiceDescriptor* ParentDescriptor;
+
+static ParentDescriptor GetDescriptor(PyContainer* self) {
+  return reinterpret_cast<ParentDescriptor>(self->descriptor);
+}
+
+namespace methods {
+
+typedef const MethodDescriptor* ItemDescriptor;
+
+static int Count(PyContainer* self) {
+  return GetDescriptor(self)->method_count();
+}
+
+static const void* GetByName(PyContainer* self, const string& name) {
+  return GetDescriptor(self)->FindMethodByName(name);
+}
+
+static const void* GetByIndex(PyContainer* self, int index) {
+  return GetDescriptor(self)->method(index);
+}
+
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyMethodDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
+}
+
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
+}
+
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
+}
+
+static DescriptorContainerDef ContainerDef = {
+  "ServiceMethods",
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
+};
+
+}  // namespace methods
+
+PyObject* NewServiceMethodsSeq(ParentDescriptor descriptor) {
+  return descriptor::NewSequence(&methods::ContainerDef, descriptor);
+}
+
+PyObject* NewServiceMethodsByName(ParentDescriptor descriptor) {
+  return descriptor::NewMappingByName(&methods::ContainerDef, descriptor);
+}
+
+}  // namespace service_descriptor
+
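`NewServiceMethodsSeq` and `NewServiceMethodsByName` expose one underlying method list through two views: an index-ordered sequence and a name-keyed mapping, mirroring the `methods` and `methods_by_name` attributes the Python descriptor API exposes. A pure-Python sketch of the dual-view idea (hypothetical names, not the real API):

```python
class Method:
    def __init__(self, name, index):
        self.name, self.index = name, index

class ServiceMethods:
    """One backing list, viewed as a sequence and as a by-name mapping."""

    def __init__(self, methods):
        self._methods = list(methods)

    @property
    def methods(self):  # sequence view, ordered by declaration index
        return tuple(self._methods)

    @property
    def methods_by_name(self):  # mapping view, keyed by method name
        return {m.name: m for m in self._methods}

service = ServiceMethods([Method("Ping", 0), Method("Query", 1)])
assert service.methods[1].name == "Query"
assert service.methods_by_name["Ping"].index == 0
```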
 namespace file_descriptor {
 
 typedef const FileDescriptor* ParentDescriptor;
@@ -1423,43 +1511,43 @@
   return GetDescriptor(self)->message_type_count();
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindMessageTypeByName(name);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->message_type(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyMessageDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyMessageDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "FileMessages",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
 };
 
 }  // namespace messages
 
-PyObject* NewFileMessageTypesByName(const FileDescriptor* descriptor) {
+PyObject* NewFileMessageTypesByName(ParentDescriptor descriptor) {
   return descriptor::NewMappingByName(&messages::ContainerDef, descriptor);
 }
 
@@ -1471,43 +1559,43 @@
   return GetDescriptor(self)->enum_type_count();
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindEnumTypeByName(name);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->enum_type(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyEnumDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyEnumDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "FileEnums",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
 };
 
 }  // namespace enums
 
-PyObject* NewFileEnumTypesByName(const FileDescriptor* descriptor) {
+PyObject* NewFileEnumTypesByName(ParentDescriptor descriptor) {
   return descriptor::NewMappingByName(&enums::ContainerDef, descriptor);
 }
 
@@ -1519,46 +1607,94 @@
   return GetDescriptor(self)->extension_count();
 }
 
-static ItemDescriptor GetByName(PyContainer* self, const string& name) {
+static const void* GetByName(PyContainer* self, const string& name) {
   return GetDescriptor(self)->FindExtensionByName(name);
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->extension(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyFieldDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyFieldDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
-static const string& GetItemName(ItemDescriptor item) {
-  return item->name();
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
 }
 
-static int GetItemIndex(ItemDescriptor item) {
-  return item->index();
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
 }
 
 static DescriptorContainerDef ContainerDef = {
   "FileExtensions",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)GetByName,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)GetItemName,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)GetItemIndex,
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
 };
 
 }  // namespace extensions
 
-PyObject* NewFileExtensionsByName(const FileDescriptor* descriptor) {
+PyObject* NewFileExtensionsByName(ParentDescriptor descriptor) {
   return descriptor::NewMappingByName(&extensions::ContainerDef, descriptor);
 }
 
+namespace services {
+
+typedef const ServiceDescriptor* ItemDescriptor;
+
+static int Count(PyContainer* self) {
+  return GetDescriptor(self)->service_count();
+}
+
+static const void* GetByName(PyContainer* self, const string& name) {
+  return GetDescriptor(self)->FindServiceByName(name);
+}
+
+static const void* GetByIndex(PyContainer* self, int index) {
+  return GetDescriptor(self)->service(index);
+}
+
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyServiceDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
+}
+
+static const string& GetItemName(const void* item) {
+  return static_cast<ItemDescriptor>(item)->name();
+}
+
+static int GetItemIndex(const void* item) {
+  return static_cast<ItemDescriptor>(item)->index();
+}
+
+static DescriptorContainerDef ContainerDef = {
+  "FileServices",
+  Count,
+  GetByIndex,
+  GetByName,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  GetItemName,
+  NULL,
+  NULL,
+  GetItemIndex,
+};
+
+}  // namespace services
+
+PyObject* NewFileServicesByName(const FileDescriptor* descriptor) {
+  return descriptor::NewMappingByName(&services::ContainerDef, descriptor);
+}
+
 namespace dependencies {
 
 typedef const FileDescriptor* ItemDescriptor;
@@ -1567,26 +1703,26 @@
   return GetDescriptor(self)->dependency_count();
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->dependency(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyFileDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyFileDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
 static DescriptorContainerDef ContainerDef = {
   "FileDependencies",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)NULL,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)NULL,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)NULL,
+  Count,
+  GetByIndex,
+  NULL,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  NULL,
+  NULL,
+  NULL,
+  NULL,
 };
 
 }  // namespace dependencies
@@ -1603,26 +1739,26 @@
   return GetDescriptor(self)->public_dependency_count();
 }
 
-static ItemDescriptor GetByIndex(PyContainer* self, int index) {
+static const void* GetByIndex(PyContainer* self, int index) {
   return GetDescriptor(self)->public_dependency(index);
 }
 
-static PyObject* NewObjectFromItem(ItemDescriptor item) {
-  return PyFileDescriptor_FromDescriptor(item);
+static PyObject* NewObjectFromItem(const void* item) {
+  return PyFileDescriptor_FromDescriptor(static_cast<ItemDescriptor>(item));
 }
 
 static DescriptorContainerDef ContainerDef = {
   "FilePublicDependencies",
-  (CountMethod)Count,
-  (GetByIndexMethod)GetByIndex,
-  (GetByNameMethod)NULL,
-  (GetByCamelcaseNameMethod)NULL,
-  (GetByNumberMethod)NULL,
-  (NewObjectFromItemMethod)NewObjectFromItem,
-  (GetItemNameMethod)NULL,
-  (GetItemCamelcaseNameMethod)NULL,
-  (GetItemNumberMethod)NULL,
-  (GetItemIndexMethod)NULL,
+  Count,
+  GetByIndex,
+  NULL,
+  NULL,
+  NULL,
+  NewObjectFromItem,
+  NULL,
+  NULL,
+  NULL,
+  NULL,
 };
 
 }  // namespace public_dependencies
diff --git a/python/google/protobuf/pyext/descriptor_containers.h b/python/google/protobuf/pyext/descriptor_containers.h
index ce40747..83de07b 100644
--- a/python/google/protobuf/pyext/descriptor_containers.h
+++ b/python/google/protobuf/pyext/descriptor_containers.h
@@ -43,6 +43,7 @@
 class FileDescriptor;
 class EnumDescriptor;
 class OneofDescriptor;
+class ServiceDescriptor;
 
 namespace python {
 
@@ -89,10 +90,17 @@
 
 PyObject* NewFileExtensionsByName(const FileDescriptor* descriptor);
 
+PyObject* NewFileServicesByName(const FileDescriptor* descriptor);
+
 PyObject* NewFileDependencies(const FileDescriptor* descriptor);
 PyObject* NewFilePublicDependencies(const FileDescriptor* descriptor);
 }  // namespace file_descriptor
 
+namespace service_descriptor {
+PyObject* NewServiceMethodsSeq(const ServiceDescriptor* descriptor);
+PyObject* NewServiceMethodsByName(const ServiceDescriptor* descriptor);
+}  // namespace service_descriptor
+
 
 }  // namespace python
 }  // namespace protobuf
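For orientation, the `DescriptorContainerDef` tables added and rewritten above (the new `services::ContainerDef`, and the de-casted `dependencies`/`extensions` tables) are fixed slot tables: each slot is either a callback or NULL when that lookup mode is unsupported, and the generic container code checks for NULL before dispatching. A minimal Python analogue of that shape, with all names and data illustrative rather than taken from the real API:

```python
# Illustrative stand-in for a file's services: (name, index) pairs.
services = [("Greeter", 0), ("Admin", 1)]

# Analogue of DescriptorContainerDef: a slot table where None plays the
# role of the NULL slots in the C++ struct (e.g. services have no
# camelcase-name or number lookup).
container_def = {
    "name": "FileServices",
    "count": lambda: len(services),
    "get_by_index": lambda i: services[i],
    "get_by_name": lambda n: next((s for s in services if s[0] == n), None),
    "get_by_camelcase_name": None,  # NULL slot: unsupported lookup mode
    "get_by_number": None,          # NULL slot: services have no numbers
    "item_name": lambda item: item[0],
    "item_index": lambda item: item[1],
}

def lookup(cdef, key):
    """Generic by-name dispatch, mirroring how the shared mapping code
    consults a slot and fails if it is NULL."""
    getter = cdef["get_by_name"]
    if getter is None:
        raise TypeError("%s does not support lookup by name" % cdef["name"])
    return getter(key)
```

The diff's shift from per-slot function-pointer casts (`(GetByNameMethod)GetByName`, etc.) to uniform `const void*` signatures removes the undefined-behavior casts while keeping this same table layout.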
diff --git a/python/google/protobuf/pyext/descriptor_database.cc b/python/google/protobuf/pyext/descriptor_database.cc
index 514722b..daa40cc 100644
--- a/python/google/protobuf/pyext/descriptor_database.cc
+++ b/python/google/protobuf/pyext/descriptor_database.cc
@@ -64,6 +64,9 @@
     }
     return false;
   }
+  if (py_descriptor == Py_None) {
+    return false;
+  }
   const Descriptor* filedescriptor_descriptor =
       FileDescriptorProto::default_instance().GetDescriptor();
   CMessage* message = reinterpret_cast<CMessage*>(py_descriptor);
diff --git a/python/google/protobuf/pyext/descriptor_pool.cc b/python/google/protobuf/pyext/descriptor_pool.cc
index 0bc76bc..95882ae 100644
--- a/python/google/protobuf/pyext/descriptor_pool.cc
+++ b/python/google/protobuf/pyext/descriptor_pool.cc
@@ -33,12 +33,13 @@
 #include <Python.h>
 
 #include <google/protobuf/descriptor.pb.h>
-#include <google/protobuf/dynamic_message.h>
 #include <google/protobuf/pyext/descriptor.h>
 #include <google/protobuf/pyext/descriptor_database.h>
 #include <google/protobuf/pyext/descriptor_pool.h>
 #include <google/protobuf/pyext/message.h>
+#include <google/protobuf/pyext/message_factory.h>
 #include <google/protobuf/pyext/scoped_pyobject_ptr.h>
+#include <google/protobuf/stubs/hash.h>
 
 #if PY_MAJOR_VERSION >= 3
   #define PyString_FromStringAndSize PyUnicode_FromStringAndSize
@@ -73,18 +74,16 @@
   cpool->underlay = NULL;
   cpool->database = NULL;
 
-  DynamicMessageFactory* message_factory = new DynamicMessageFactory();
-  // This option might be the default some day.
-  message_factory->SetDelegateToGeneratedFactory(true);
-  cpool->message_factory = message_factory;
-
-  // TODO(amauryfa): Rewrite the SymbolDatabase in C so that it uses the same
-  // storage.
-  cpool->classes_by_descriptor =
-      new PyDescriptorPool::ClassesByMessageMap();
   cpool->descriptor_options =
       new hash_map<const void*, PyObject *>();
 
+  cpool->py_message_factory = message_factory::NewMessageFactory(
+      &PyMessageFactory_Type, cpool);
+  if (cpool->py_message_factory == NULL) {
+    Py_DECREF(cpool);
+    return NULL;
+  }
+
   return cpool;
 }
 
@@ -150,27 +149,22 @@
       PyDescriptorPool_NewWithDatabase(database));
 }
 
-static void Dealloc(PyDescriptorPool* self) {
-  typedef PyDescriptorPool::ClassesByMessageMap::iterator iterator;
+static void Dealloc(PyObject* pself) {
+  PyDescriptorPool* self = reinterpret_cast<PyDescriptorPool*>(pself);
   descriptor_pool_map.erase(self->pool);
-  for (iterator it = self->classes_by_descriptor->begin();
-       it != self->classes_by_descriptor->end(); ++it) {
-    Py_DECREF(it->second);
-  }
-  delete self->classes_by_descriptor;
+  Py_CLEAR(self->py_message_factory);
   for (hash_map<const void*, PyObject*>::iterator it =
            self->descriptor_options->begin();
        it != self->descriptor_options->end(); ++it) {
     Py_DECREF(it->second);
   }
   delete self->descriptor_options;
-  delete self->message_factory;
   delete self->database;
   delete self->pool;
   Py_TYPE(self)->tp_free(reinterpret_cast<PyObject*>(self));
 }
 
-PyObject* FindMessageByName(PyDescriptorPool* self, PyObject* arg) {
+static PyObject* FindMessageByName(PyObject* self, PyObject* arg) {
   Py_ssize_t name_size;
   char* name;
   if (PyString_AsStringAndSize(arg, &name, &name_size) < 0) {
@@ -178,7 +172,8 @@
   }
 
   const Descriptor* message_descriptor =
-      self->pool->FindMessageTypeByName(string(name, name_size));
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindMessageTypeByName(
+          string(name, name_size));
 
   if (message_descriptor == NULL) {
     PyErr_Format(PyExc_KeyError, "Couldn't find message %.200s", name);
@@ -188,37 +183,10 @@
   return PyMessageDescriptor_FromDescriptor(message_descriptor);
 }
 
-// Add a message class to our database.
-int RegisterMessageClass(PyDescriptorPool* self,
-                         const Descriptor *message_descriptor,
-                         PyObject *message_class) {
-  Py_INCREF(message_class);
-  typedef PyDescriptorPool::ClassesByMessageMap::iterator iterator;
-  std::pair<iterator, bool> ret = self->classes_by_descriptor->insert(
-      std::make_pair(message_descriptor, message_class));
-  if (!ret.second) {
-    // Update case: DECREF the previous value.
-    Py_DECREF(ret.first->second);
-    ret.first->second = message_class;
-  }
-  return 0;
-}
 
-// Retrieve the message class added to our database.
-PyObject *GetMessageClass(PyDescriptorPool* self,
-                          const Descriptor *message_descriptor) {
-  typedef PyDescriptorPool::ClassesByMessageMap::iterator iterator;
-  iterator ret = self->classes_by_descriptor->find(message_descriptor);
-  if (ret == self->classes_by_descriptor->end()) {
-    PyErr_Format(PyExc_TypeError, "No message class registered for '%s'",
-                 message_descriptor->full_name().c_str());
-    return NULL;
-  } else {
-    return ret->second;
-  }
-}
 
-PyObject* FindFileByName(PyDescriptorPool* self, PyObject* arg) {
+
+static PyObject* FindFileByName(PyObject* self, PyObject* arg) {
   Py_ssize_t name_size;
   char* name;
   if (PyString_AsStringAndSize(arg, &name, &name_size) < 0) {
@@ -226,13 +194,12 @@
   }
 
   const FileDescriptor* file_descriptor =
-      self->pool->FindFileByName(string(name, name_size));
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindFileByName(
+          string(name, name_size));
   if (file_descriptor == NULL) {
-    PyErr_Format(PyExc_KeyError, "Couldn't find file %.200s",
-                 name);
+    PyErr_Format(PyExc_KeyError, "Couldn't find file %.200s", name);
     return NULL;
   }
-
   return PyFileDescriptor_FromDescriptor(file_descriptor);
 }
 
@@ -254,6 +221,10 @@
   return PyFieldDescriptor_FromDescriptor(field_descriptor);
 }
 
+static PyObject* FindFieldByNameMethod(PyObject* self, PyObject* arg) {
+  return FindFieldByName(reinterpret_cast<PyDescriptorPool*>(self), arg);
+}
+
 PyObject* FindExtensionByName(PyDescriptorPool* self, PyObject* arg) {
   Py_ssize_t name_size;
   char* name;
@@ -271,6 +242,10 @@
   return PyFieldDescriptor_FromDescriptor(field_descriptor);
 }
 
+static PyObject* FindExtensionByNameMethod(PyObject* self, PyObject* arg) {
+  return FindExtensionByName(reinterpret_cast<PyDescriptorPool*>(self), arg);
+}
+
 PyObject* FindEnumTypeByName(PyDescriptorPool* self, PyObject* arg) {
   Py_ssize_t name_size;
   char* name;
@@ -288,6 +263,10 @@
   return PyEnumDescriptor_FromDescriptor(enum_descriptor);
 }
 
+static PyObject* FindEnumTypeByNameMethod(PyObject* self, PyObject* arg) {
+  return FindEnumTypeByName(reinterpret_cast<PyDescriptorPool*>(self), arg);
+}
+
 PyObject* FindOneofByName(PyDescriptorPool* self, PyObject* arg) {
   Py_ssize_t name_size;
   char* name;
@@ -305,7 +284,47 @@
   return PyOneofDescriptor_FromDescriptor(oneof_descriptor);
 }
 
-PyObject* FindFileContainingSymbol(PyDescriptorPool* self, PyObject* arg) {
+static PyObject* FindOneofByNameMethod(PyObject* self, PyObject* arg) {
+  return FindOneofByName(reinterpret_cast<PyDescriptorPool*>(self), arg);
+}
+
+static PyObject* FindServiceByName(PyObject* self, PyObject* arg) {
+  Py_ssize_t name_size;
+  char* name;
+  if (PyString_AsStringAndSize(arg, &name, &name_size) < 0) {
+    return NULL;
+  }
+
+  const ServiceDescriptor* service_descriptor =
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindServiceByName(
+          string(name, name_size));
+  if (service_descriptor == NULL) {
+    PyErr_Format(PyExc_KeyError, "Couldn't find service %.200s", name);
+    return NULL;
+  }
+
+  return PyServiceDescriptor_FromDescriptor(service_descriptor);
+}
+
+static PyObject* FindMethodByName(PyObject* self, PyObject* arg) {
+  Py_ssize_t name_size;
+  char* name;
+  if (PyString_AsStringAndSize(arg, &name, &name_size) < 0) {
+    return NULL;
+  }
+
+  const MethodDescriptor* method_descriptor =
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindMethodByName(
+          string(name, name_size));
+  if (method_descriptor == NULL) {
+    PyErr_Format(PyExc_KeyError, "Couldn't find method %.200s", name);
+    return NULL;
+  }
+
+  return PyMethodDescriptor_FromDescriptor(method_descriptor);
+}
+
+static PyObject* FindFileContainingSymbol(PyObject* self, PyObject* arg) {
   Py_ssize_t name_size;
   char* name;
   if (PyString_AsStringAndSize(arg, &name, &name_size) < 0) {
@@ -313,7 +332,8 @@
   }
 
   const FileDescriptor* file_descriptor =
-      self->pool->FindFileContainingSymbol(string(name, name_size));
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindFileContainingSymbol(
+          string(name, name_size));
   if (file_descriptor == NULL) {
     PyErr_Format(PyExc_KeyError, "Couldn't find symbol %.200s", name);
     return NULL;
@@ -322,6 +342,53 @@
   return PyFileDescriptor_FromDescriptor(file_descriptor);
 }
 
+static PyObject* FindExtensionByNumber(PyObject* self, PyObject* args) {
+  PyObject* message_descriptor;
+  int number;
+  if (!PyArg_ParseTuple(args, "Oi", &message_descriptor, &number)) {
+    return NULL;
+  }
+  const Descriptor* descriptor = PyMessageDescriptor_AsDescriptor(
+      message_descriptor);
+  if (descriptor == NULL) {
+    return NULL;
+  }
+
+  const FieldDescriptor* extension_descriptor =
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindExtensionByNumber(
+          descriptor, number);
+  if (extension_descriptor == NULL) {
+    PyErr_Format(PyExc_KeyError, "Couldn't find extension %d", number);
+    return NULL;
+  }
+
+  return PyFieldDescriptor_FromDescriptor(extension_descriptor);
+}
+
+static PyObject* FindAllExtensions(PyObject* self, PyObject* arg) {
+  const Descriptor* descriptor = PyMessageDescriptor_AsDescriptor(arg);
+  if (descriptor == NULL) {
+    return NULL;
+  }
+
+  std::vector<const FieldDescriptor*> extensions;
+  reinterpret_cast<PyDescriptorPool*>(self)->pool->FindAllExtensions(
+      descriptor, &extensions);
+
+  ScopedPyObjectPtr result(PyList_New(extensions.size()));
+  if (result == NULL) {
+    return NULL;
+  }
+  for (int i = 0; i < extensions.size(); i++) {
+    PyObject* extension = PyFieldDescriptor_FromDescriptor(extensions[i]);
+    if (extension == NULL) {
+      return NULL;
+    }
+    PyList_SET_ITEM(result.get(), i, extension);  // Steals the reference.
+  }
+  return result.release();
+}
+
 // These functions should not exist -- the only valid way to create
 // descriptors is to call Add() or AddSerializedFile().
 // But these AddDescriptor() functions were created in Python and some people
@@ -331,14 +398,15 @@
 // call a function that will just be a no-op?
 // TODO(amauryfa): Need to investigate further.
 
-PyObject* AddFileDescriptor(PyDescriptorPool* self, PyObject* descriptor) {
+static PyObject* AddFileDescriptor(PyObject* self, PyObject* descriptor) {
   const FileDescriptor* file_descriptor =
       PyFileDescriptor_AsDescriptor(descriptor);
   if (!file_descriptor) {
     return NULL;
   }
   if (file_descriptor !=
-      self->pool->FindFileByName(file_descriptor->name())) {
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindFileByName(
+          file_descriptor->name())) {
     PyErr_Format(PyExc_ValueError,
                  "The file descriptor %s does not belong to this pool",
                  file_descriptor->name().c_str());
@@ -347,14 +415,15 @@
   Py_RETURN_NONE;
 }
 
-PyObject* AddDescriptor(PyDescriptorPool* self, PyObject* descriptor) {
+static PyObject* AddDescriptor(PyObject* self, PyObject* descriptor) {
   const Descriptor* message_descriptor =
       PyMessageDescriptor_AsDescriptor(descriptor);
   if (!message_descriptor) {
     return NULL;
   }
   if (message_descriptor !=
-      self->pool->FindMessageTypeByName(message_descriptor->full_name())) {
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindMessageTypeByName(
+          message_descriptor->full_name())) {
     PyErr_Format(PyExc_ValueError,
                  "The message descriptor %s does not belong to this pool",
                  message_descriptor->full_name().c_str());
@@ -363,14 +432,15 @@
   Py_RETURN_NONE;
 }
 
-PyObject* AddEnumDescriptor(PyDescriptorPool* self, PyObject* descriptor) {
+static PyObject* AddEnumDescriptor(PyObject* self, PyObject* descriptor) {
   const EnumDescriptor* enum_descriptor =
       PyEnumDescriptor_AsDescriptor(descriptor);
   if (!enum_descriptor) {
     return NULL;
   }
   if (enum_descriptor !=
-      self->pool->FindEnumTypeByName(enum_descriptor->full_name())) {
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindEnumTypeByName(
+          enum_descriptor->full_name())) {
     PyErr_Format(PyExc_ValueError,
                  "The enum descriptor %s does not belong to this pool",
                  enum_descriptor->full_name().c_str());
@@ -379,8 +449,41 @@
   Py_RETURN_NONE;
 }
 
-// The code below loads new Descriptors from a serialized FileDescriptorProto.
+static PyObject* AddExtensionDescriptor(PyObject* self, PyObject* descriptor) {
+  const FieldDescriptor* extension_descriptor =
+      PyFieldDescriptor_AsDescriptor(descriptor);
+  if (!extension_descriptor) {
+    return NULL;
+  }
+  if (extension_descriptor !=
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindExtensionByName(
+          extension_descriptor->full_name())) {
+    PyErr_Format(PyExc_ValueError,
+                 "The extension descriptor %s does not belong to this pool",
+                 extension_descriptor->full_name().c_str());
+    return NULL;
+  }
+  Py_RETURN_NONE;
+}
 
+static PyObject* AddServiceDescriptor(PyObject* self, PyObject* descriptor) {
+  const ServiceDescriptor* service_descriptor =
+      PyServiceDescriptor_AsDescriptor(descriptor);
+  if (!service_descriptor) {
+    return NULL;
+  }
+  if (service_descriptor !=
+      reinterpret_cast<PyDescriptorPool*>(self)->pool->FindServiceByName(
+          service_descriptor->full_name())) {
+    PyErr_Format(PyExc_ValueError,
+                 "The service descriptor %s does not belong to this pool",
+                 service_descriptor->full_name().c_str());
+    return NULL;
+  }
+  Py_RETURN_NONE;
+}
+
+// The code below loads new Descriptors from a serialized FileDescriptorProto.
 
 // Collects errors that occur during proto file building to allow them to be
 // propagated in the python exception instead of only living in ERROR logs.
@@ -407,7 +510,8 @@
   bool had_errors;
 };
 
-PyObject* AddSerializedFile(PyDescriptorPool* self, PyObject* serialized_pb) {
+static PyObject* AddSerializedFile(PyObject* pself, PyObject* serialized_pb) {
+  PyDescriptorPool* self = reinterpret_cast<PyDescriptorPool*>(pself);
   char* message_type;
   Py_ssize_t message_len;
 
@@ -455,7 +559,7 @@
       descriptor, serialized_pb);
 }
 
-PyObject* Add(PyDescriptorPool* self, PyObject* file_descriptor_proto) {
+static PyObject* Add(PyObject* self, PyObject* file_descriptor_proto) {
   ScopedPyObjectPtr serialized_pb(
       PyObject_CallMethod(file_descriptor_proto, "SerializeToString", NULL));
   if (serialized_pb == NULL) {
@@ -465,35 +569,47 @@
 }
 
 static PyMethodDef Methods[] = {
-  { "Add", (PyCFunction)Add, METH_O,
+  { "Add", Add, METH_O,
     "Adds the FileDescriptorProto and its types to this pool." },
-  { "AddSerializedFile", (PyCFunction)AddSerializedFile, METH_O,
+  { "AddSerializedFile", AddSerializedFile, METH_O,
     "Adds a serialized FileDescriptorProto to this pool." },
 
   // TODO(amauryfa): Understand why the Python implementation differs from
   // this one, ask users to use another API and deprecate these functions.
-  { "AddFileDescriptor", (PyCFunction)AddFileDescriptor, METH_O,
+  { "AddFileDescriptor", AddFileDescriptor, METH_O,
     "No-op. Add() must have been called before." },
-  { "AddDescriptor", (PyCFunction)AddDescriptor, METH_O,
+  { "AddDescriptor", AddDescriptor, METH_O,
     "No-op. Add() must have been called before." },
-  { "AddEnumDescriptor", (PyCFunction)AddEnumDescriptor, METH_O,
+  { "AddEnumDescriptor", AddEnumDescriptor, METH_O,
+    "No-op. Add() must have been called before." },
+  { "AddExtensionDescriptor", AddExtensionDescriptor, METH_O,
+    "No-op. Add() must have been called before." },
+  { "AddServiceDescriptor", AddServiceDescriptor, METH_O,
     "No-op. Add() must have been called before." },
 
-  { "FindFileByName", (PyCFunction)FindFileByName, METH_O,
+  { "FindFileByName", FindFileByName, METH_O,
     "Searches for a file descriptor by its .proto name." },
-  { "FindMessageTypeByName", (PyCFunction)FindMessageByName, METH_O,
+  { "FindMessageTypeByName", FindMessageByName, METH_O,
     "Searches for a message descriptor by full name." },
-  { "FindFieldByName", (PyCFunction)FindFieldByName, METH_O,
+  { "FindFieldByName", FindFieldByNameMethod, METH_O,
     "Searches for a field descriptor by full name." },
-  { "FindExtensionByName", (PyCFunction)FindExtensionByName, METH_O,
+  { "FindExtensionByName", FindExtensionByNameMethod, METH_O,
     "Searches for extension descriptor by full name." },
-  { "FindEnumTypeByName", (PyCFunction)FindEnumTypeByName, METH_O,
+  { "FindEnumTypeByName", FindEnumTypeByNameMethod, METH_O,
     "Searches for enum type descriptor by full name." },
-  { "FindOneofByName", (PyCFunction)FindOneofByName, METH_O,
+  { "FindOneofByName", FindOneofByNameMethod, METH_O,
     "Searches for oneof descriptor by full name." },
+  { "FindServiceByName", FindServiceByName, METH_O,
+    "Searches for service descriptor by full name." },
+  { "FindMethodByName", FindMethodByName, METH_O,
+    "Searches for method descriptor by full name." },
 
-  { "FindFileContainingSymbol", (PyCFunction)FindFileContainingSymbol, METH_O,
+  { "FindFileContainingSymbol", FindFileContainingSymbol, METH_O,
     "Gets the FileDescriptor containing the specified symbol." },
+  { "FindExtensionByNumber", FindExtensionByNumber, METH_VARARGS,
+    "Gets the extension descriptor for the given number." },
+  { "FindAllExtensions", FindAllExtensions, METH_O,
+    "Gets all known extensions of the given message descriptor." },
   {NULL}
 };
 
@@ -504,7 +620,7 @@
   FULL_MODULE_NAME ".DescriptorPool",  // tp_name
   sizeof(PyDescriptorPool),            // tp_basicsize
   0,                                   // tp_itemsize
-  (destructor)cdescriptor_pool::Dealloc,  // tp_dealloc
+  cdescriptor_pool::Dealloc,           // tp_dealloc
   0,                                   // tp_print
   0,                                   // tp_getattr
   0,                                   // tp_setattr
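From Python, the methods this file adds to `DescriptorPool` (`FindServiceByName`, `FindMethodByName`, and friends) become ordinary pool lookups. A small usage sketch, assuming the `google.protobuf` runtime is importable; the `demo.Greeter` service and its shape are invented for illustration:

```python
from google.protobuf import descriptor_pool, descriptor_pb2

# Build a FileDescriptorProto declaring a hypothetical service in-process,
# so the example does not depend on any compiled .proto file.
proto = descriptor_pb2.FileDescriptorProto()
proto.name = 'greeter_demo.proto'
proto.package = 'demo'
msg = proto.message_type.add()
msg.name = 'HelloRequest'
svc = proto.service.add()
svc.name = 'Greeter'
rpc = svc.method.add()
rpc.name = 'SayHello'
rpc.input_type = '.demo.HelloRequest'
rpc.output_type = '.demo.HelloRequest'

pool = descriptor_pool.DescriptorPool()
pool.Add(proto)  # the Add() method documented in the Methods[] table above

# The lookups added by this diff:
service = pool.FindServiceByName('demo.Greeter')
method = pool.FindMethodByName('demo.Greeter.SayHello')
```

Both lookups raise `KeyError` for unknown names, matching the `PyErr_Format(PyExc_KeyError, ...)` paths in the C++ above.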
diff --git a/python/google/protobuf/pyext/descriptor_pool.h b/python/google/protobuf/pyext/descriptor_pool.h
index 16bc910..53ee53d 100644
--- a/python/google/protobuf/pyext/descriptor_pool.h
+++ b/python/google/protobuf/pyext/descriptor_pool.h
@@ -38,10 +38,13 @@
 
 namespace google {
 namespace protobuf {
-class MessageFactory;
-
 namespace python {
 
+struct PyMessageFactory;
+
+// The (meta) type of all Messages classes.
+struct CMessageClass;
+
 // Wraps operations to the global DescriptorPool which contains information
 // about all messages and fields.
 //
@@ -66,20 +69,10 @@
   // This pointer is owned.
   const DescriptorDatabase* database;
 
-  // DynamicMessageFactory used to create C++ instances of messages.
-  // This object cache the descriptors that were used, so the DescriptorPool
-  // needs to get rid of it before it can delete itself.
-  //
-  // Note: A C++ MessageFactory is different from the Python MessageFactory.
-  // The C++ one creates messages, when the Python one creates classes.
-  MessageFactory* message_factory;
-
-  // Make our own mapping to retrieve Python classes from C++ descriptors.
-  //
-  // Descriptor pointers stored here are owned by the DescriptorPool above.
-  // Python references to classes are owned by this PyDescriptorPool.
-  typedef hash_map<const Descriptor*, PyObject*> ClassesByMessageMap;
-  ClassesByMessageMap* classes_by_descriptor;
+  // The preferred MessageFactory to be used by descriptors.
+  // TODO(amauryfa): Don't create the Factory from the DescriptorPool, but
+  // use the one passed while creating message classes. And remove this member.
+  PyMessageFactory* py_message_factory;
 
   // Cache the options for any kind of descriptor.
   // Descriptor pointers are owned by the DescriptorPool above.
@@ -92,24 +85,12 @@
 
 namespace cdescriptor_pool {
 
+
 // Looks up a message by name.
 // Returns a message Descriptor, or NULL if not found.
 const Descriptor* FindMessageTypeByName(PyDescriptorPool* self,
                                         const string& name);
 
-// Registers a new Python class for the given message descriptor.
-// On error, returns -1 with a Python exception set.
-int RegisterMessageClass(PyDescriptorPool* self,
-                         const Descriptor* message_descriptor,
-                         PyObject* message_class);
-
-// Retrieves the Python class registered with the given message descriptor.
-//
-// Returns a *borrowed* reference if found, otherwise returns NULL with an
-// exception set.
-PyObject* GetMessageClass(PyDescriptorPool* self,
-                          const Descriptor* message_descriptor);
-
 // The functions below are also exposed as methods of the DescriptorPool type.
 
 // Looks up a message by name. Returns a PyMessageDescriptor corresponding to
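The header change above replaces the explicit `RegisterMessageClass`/`GetMessageClass` pair (and the `classes_by_descriptor` map) with a `py_message_factory` that can build classes on demand. A minimal sketch of that get-or-create caching idea, with purely illustrative names and a string standing in for a real descriptor:

```python
class SketchMessageFactory:
    """Illustrative analogue of the factory-owned class cache that
    replaces the old register/get pair."""

    def __init__(self):
        self._classes = {}  # descriptor -> message class

    def get_or_create_class(self, descriptor):
        cls = self._classes.get(descriptor)
        if cls is None:
            # Build the class on the fly instead of failing when no class
            # was registered, which is the behavior the extension_dict.cc
            # change below relies on.
            cls = type(descriptor, (object,), {"DESCRIPTOR": descriptor})
            self._classes[descriptor] = cls
        return cls
```

Repeated calls with the same descriptor return the identical cached class, so wrapper objects created at different times still share one Python type.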
diff --git a/python/google/protobuf/pyext/extension_dict.cc b/python/google/protobuf/pyext/extension_dict.cc
index 555bd29..018b5c2 100644
--- a/python/google/protobuf/pyext/extension_dict.cc
+++ b/python/google/protobuf/pyext/extension_dict.cc
@@ -32,19 +32,30 @@
 // Author: tibell@google.com (Johan Tibell)
 
 #include <google/protobuf/pyext/extension_dict.h>
+#include <memory>
 
 #include <google/protobuf/stubs/logging.h>
 #include <google/protobuf/stubs/common.h>
 #include <google/protobuf/descriptor.h>
 #include <google/protobuf/dynamic_message.h>
 #include <google/protobuf/message.h>
+#include <google/protobuf/descriptor.pb.h>
 #include <google/protobuf/pyext/descriptor.h>
-#include <google/protobuf/pyext/descriptor_pool.h>
 #include <google/protobuf/pyext/message.h>
+#include <google/protobuf/pyext/message_factory.h>
 #include <google/protobuf/pyext/repeated_composite_container.h>
 #include <google/protobuf/pyext/repeated_scalar_container.h>
 #include <google/protobuf/pyext/scoped_pyobject_ptr.h>
-#include <google/protobuf/stubs/shared_ptr.h>
+
+#if PY_MAJOR_VERSION >= 3
+  #if PY_VERSION_HEX < 0x03030000
+    #error "Python 3.0 - 3.2 are not supported."
+  #endif
+  #define PyString_AsStringAndSize(ob, charpp, sizep) \
+    (PyUnicode_Check(ob)? \
+       ((*(charpp) = PyUnicode_AsUTF8AndSize(ob, (sizep))) == NULL? -1: 0): \
+       PyBytes_AsStringAndSize(ob, (charpp), (sizep)))
+#endif
 
 namespace google {
 namespace protobuf {
@@ -60,35 +71,6 @@
 #endif
 }
 
-// TODO(tibell): Use VisitCompositeField.
-int ReleaseExtension(ExtensionDict* self,
-                     PyObject* extension,
-                     const FieldDescriptor* descriptor) {
-  if (descriptor->label() == FieldDescriptor::LABEL_REPEATED) {
-    if (descriptor->cpp_type() == FieldDescriptor::CPPTYPE_MESSAGE) {
-      if (repeated_composite_container::Release(
-              reinterpret_cast<RepeatedCompositeContainer*>(
-                  extension)) < 0) {
-        return -1;
-      }
-    } else {
-      if (repeated_scalar_container::Release(
-              reinterpret_cast<RepeatedScalarContainer*>(
-                  extension)) < 0) {
-        return -1;
-      }
-    }
-  } else if (descriptor->cpp_type() == FieldDescriptor::CPPTYPE_MESSAGE) {
-    if (cmessage::ReleaseSubMessage(
-            self->parent, descriptor,
-            reinterpret_cast<CMessage*>(extension)) < 0) {
-      return -1;
-    }
-  }
-
-  return 0;
-}
-
 PyObject* subscript(ExtensionDict* self, PyObject* key) {
   const FieldDescriptor* descriptor = cmessage::GetExtensionDescriptor(key);
   if (descriptor == NULL) {
@@ -119,6 +101,7 @@
 
   if (descriptor->label() != FieldDescriptor::LABEL_REPEATED &&
       descriptor->cpp_type() == FieldDescriptor::CPPTYPE_MESSAGE) {
+    // TODO(plabatut): consider building the class on the fly!
     PyObject* sub_message = cmessage::InternalGetSubMessage(
         self->parent, descriptor);
     if (sub_message == NULL) {
@@ -130,9 +113,21 @@
 
   if (descriptor->label() == FieldDescriptor::LABEL_REPEATED) {
     if (descriptor->cpp_type() == FieldDescriptor::CPPTYPE_MESSAGE) {
-      PyObject *message_class = cdescriptor_pool::GetMessageClass(
-          cmessage::GetDescriptorPoolForMessage(self->parent),
+      // On-the-fly message class creation is needed to support the following
+      // situation:
+      // 1- add FileDescriptor to the pool that contains extensions of a message
+      //    defined by another proto file. Do not create any message classes.
+      // 2- instantiate an extended message, and access the extension using
+      //    the field descriptor.
+      // 3- the extension submessage fails to be returned, because no class has
+      //    been created.
+      // It happens when deserializing text proto format, or when enumerating
+      // fields of a deserialized message.
+      CMessageClass* message_class = message_factory::GetOrCreateMessageClass(
+          cmessage::GetFactoryForMessage(self->parent),
           descriptor->message_type());
+      ScopedPyObjectPtr message_class_handler(
+        reinterpret_cast<PyObject*>(message_class));
       if (message_class == NULL) {
         return NULL;
       }
@@ -183,60 +178,51 @@
   return 0;
 }
 
-PyObject* ClearExtension(ExtensionDict* self, PyObject* extension) {
-  const FieldDescriptor* descriptor =
-      cmessage::GetExtensionDescriptor(extension);
-  if (descriptor == NULL) {
+PyObject* _FindExtensionByName(ExtensionDict* self, PyObject* arg) {
+  char* name;
+  Py_ssize_t name_size;
+  if (PyString_AsStringAndSize(arg, &name, &name_size) < 0) {
     return NULL;
   }
-  PyObject* value = PyDict_GetItem(self->values, extension);
-  if (self->parent) {
-    if (value != NULL) {
-      if (ReleaseExtension(self, value, descriptor) < 0) {
-        return NULL;
+
+  PyDescriptorPool* pool = cmessage::GetFactoryForMessage(self->parent)->pool;
+  const FieldDescriptor* message_extension =
+      pool->pool->FindExtensionByName(string(name, name_size));
+  if (message_extension == NULL) {
+    // Is it the name of a message set extension?
+    const Descriptor* message_descriptor = pool->pool->FindMessageTypeByName(
+        string(name, name_size));
+    if (message_descriptor && message_descriptor->extension_count() > 0) {
+      const FieldDescriptor* extension = message_descriptor->extension(0);
+      if (extension->is_extension() &&
+          extension->containing_type()->options().message_set_wire_format() &&
+          extension->type() == FieldDescriptor::TYPE_MESSAGE &&
+          extension->label() == FieldDescriptor::LABEL_OPTIONAL) {
+        message_extension = extension;
       }
     }
-    if (ScopedPyObjectPtr(cmessage::ClearFieldByDescriptor(
-            self->parent, descriptor)) == NULL) {
-      return NULL;
-    }
   }
-  if (PyDict_DelItem(self->values, extension) < 0) {
-    PyErr_Clear();
-  }
-  Py_RETURN_NONE;
-}
-
-PyObject* HasExtension(ExtensionDict* self, PyObject* extension) {
-  const FieldDescriptor* descriptor =
-      cmessage::GetExtensionDescriptor(extension);
-  if (descriptor == NULL) {
-    return NULL;
-  }
-  if (self->parent) {
-    return cmessage::HasFieldByDescriptor(self->parent, descriptor);
-  } else {
-    int exists = PyDict_Contains(self->values, extension);
-    if (exists < 0) {
-      return NULL;
-    }
-    return PyBool_FromLong(exists);
-  }
-}
-
-PyObject* _FindExtensionByName(ExtensionDict* self, PyObject* name) {
-  ScopedPyObjectPtr extensions_by_name(PyObject_GetAttrString(
-      reinterpret_cast<PyObject*>(self->parent), "_extensions_by_name"));
-  if (extensions_by_name == NULL) {
-    return NULL;
-  }
-  PyObject* result = PyDict_GetItem(extensions_by_name.get(), name);
-  if (result == NULL) {
+  if (message_extension == NULL) {
     Py_RETURN_NONE;
-  } else {
-    Py_INCREF(result);
-    return result;
   }
+
+  return PyFieldDescriptor_FromDescriptor(message_extension);
+}
+
+PyObject* _FindExtensionByNumber(ExtensionDict* self, PyObject* arg) {
+  int64 number = PyLong_AsLong(arg);
+  if (number == -1 && PyErr_Occurred()) {
+    return NULL;
+  }
+
+  PyDescriptorPool* pool = cmessage::GetFactoryForMessage(self->parent)->pool;
+  const FieldDescriptor* message_extension = pool->pool->FindExtensionByNumber(
+      self->parent->message->GetDescriptor(), number);
+  if (message_extension == NULL) {
+    Py_RETURN_NONE;
+  }
+
+  return PyFieldDescriptor_FromDescriptor(message_extension);
 }
 
 ExtensionDict* NewExtensionDict(CMessage *parent) {
@@ -267,10 +253,10 @@
 
 #define EDMETHOD(name, args, doc) { #name, (PyCFunction)name, args, doc }
 static PyMethodDef Methods[] = {
-  EDMETHOD(ClearExtension, METH_O, "Clears an extension from the object."),
-  EDMETHOD(HasExtension, METH_O, "Checks if the object has an extension."),
   EDMETHOD(_FindExtensionByName, METH_O,
            "Finds an extension by name."),
+  EDMETHOD(_FindExtensionByNumber, METH_O,
+           "Finds an extension by field number."),
   { NULL, NULL }
 };
 
diff --git a/python/google/protobuf/pyext/extension_dict.h b/python/google/protobuf/pyext/extension_dict.h
index d92cf95..0de2c4e 100644
--- a/python/google/protobuf/pyext/extension_dict.h
+++ b/python/google/protobuf/pyext/extension_dict.h
@@ -37,9 +37,8 @@
 #include <Python.h>
 
 #include <memory>
-#ifndef _SHARED_PTR_H
-#include <google/protobuf/stubs/shared_ptr.h>
-#endif
+
+#include <google/protobuf/pyext/message.h>
 
 namespace google {
 namespace protobuf {
@@ -47,16 +46,8 @@
 class Message;
 class FieldDescriptor;
 
-#ifdef _SHARED_PTR_H
-using std::shared_ptr;
-#else
-using internal::shared_ptr;
-#endif
-
 namespace python {
 
-struct CMessage;
-
 typedef struct ExtensionDict {
   PyObject_HEAD;
 
@@ -64,7 +55,7 @@
   // proto tree.  Every Python container class holds a
   // reference to it in order to keep it alive as long as there's a
   // Python object that references any part of the tree.
-  shared_ptr<Message> owner;
+  CMessage::OwnerRef owner;
 
   // Weak reference to parent message. Used to make sure
   // the parent is writable when an extension field is modified.
@@ -86,43 +77,6 @@
 // Builds an Extensions dict for a specific message.
 ExtensionDict* NewExtensionDict(CMessage *parent);
 
-// Gets the number of extension values in this ExtensionDict as a python object.
-//
-// Returns a new reference.
-PyObject* len(ExtensionDict* self);
-
-// Releases extensions referenced outside this dictionary to keep outside
-// references alive.
-//
-// Returns 0 on success, -1 on failure.
-int ReleaseExtension(ExtensionDict* self,
-                     PyObject* extension,
-                     const FieldDescriptor* descriptor);
-
-// Gets an extension from the dict for the given extension descriptor.
-//
-// Returns a new reference.
-PyObject* subscript(ExtensionDict* self, PyObject* key);
-
-// Assigns a value to an extension in the dict. Can only be used for singular
-// simple types.
-//
-// Returns 0 on success, -1 on failure.
-int ass_subscript(ExtensionDict* self, PyObject* key, PyObject* value);
-
-// Clears an extension from the dict. Will release the extension if there
-// is still an external reference left to it.
-//
-// Returns None on success.
-PyObject* ClearExtension(ExtensionDict* self,
-                                       PyObject* extension);
-
-// Gets an extension from the dict given the extension name as opposed to
-// descriptor.
-//
-// Returns a new reference.
-PyObject* _FindExtensionByName(ExtensionDict* self, PyObject* name);
-
 }  // namespace extension_dict
 }  // namespace python
 }  // namespace protobuf
diff --git a/python/google/protobuf/pyext/map_container.cc b/python/google/protobuf/pyext/map_container.cc
index df9138a..6d7ee28 100644
--- a/python/google/protobuf/pyext/map_container.cc
+++ b/python/google/protobuf/pyext/map_container.cc
@@ -32,13 +32,16 @@
 
 #include <google/protobuf/pyext/map_container.h>
 
+#include <memory>
+
 #include <google/protobuf/stubs/logging.h>
 #include <google/protobuf/stubs/common.h>
-#include <google/protobuf/stubs/scoped_ptr.h>
 #include <google/protobuf/map_field.h>
 #include <google/protobuf/map.h>
 #include <google/protobuf/message.h>
+#include <google/protobuf/pyext/message_factory.h>
 #include <google/protobuf/pyext/message.h>
+#include <google/protobuf/pyext/repeated_composite_container.h>
 #include <google/protobuf/pyext/scoped_pyobject_ptr.h>
 
 #if PY_MAJOR_VERSION >= 3
@@ -70,7 +73,7 @@
 struct MapIterator {
   PyObject_HEAD;
 
-  scoped_ptr< ::google::protobuf::MapIterator> iter;
+  std::unique_ptr<::google::protobuf::MapIterator> iter;
 
   // A pointer back to the container, so we can notice changes to the version.
   // We own a ref on this.
@@ -88,7 +91,7 @@
   // as this iterator does.  This is solely for the benefit of the MapIterator
   // destructor -- we should never actually access the iterator in this state
   // except to delete it.
-  shared_ptr<Message> owner;
+  CMessage::OwnerRef owner;
 
   // The version of the map when we took the iterator to it.
   //
@@ -324,6 +327,33 @@
   Py_RETURN_NONE;
 }
 
+PyObject* GetEntryClass(PyObject* _self) {
+  MapContainer* self = GetMap(_self);
+  CMessageClass* message_class = message_factory::GetMessageClass(
+      cmessage::GetFactoryForMessage(self->parent),
+      self->parent_field_descriptor->message_type());
+  Py_XINCREF(message_class);
+  return reinterpret_cast<PyObject*>(message_class);
+}
+
+PyObject* MergeFrom(PyObject* _self, PyObject* arg) {
+  MapContainer* self = GetMap(_self);
+  MapContainer* other_map = GetMap(arg);
+  Message* message = self->GetMutableMessage();
+  const Message* other_message = other_map->message;
+  const Reflection* reflection = message->GetReflection();
+  const Reflection* other_reflection = other_message->GetReflection();
+  int count = other_reflection->FieldSize(
+      *other_message, other_map->parent_field_descriptor);
+  for (int i = 0; i < count; i++) {
+    reflection->AddMessage(message, self->parent_field_descriptor)->MergeFrom(
+        other_reflection->GetRepeatedMessage(
+            *other_message, other_map->parent_field_descriptor, i));
+  }
+  self->version++;
+  Py_RETURN_NONE;
+}

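The `MergeFrom` above works at the reflection level: each entry of the other map field (map fields are repeated entry messages under the hood) is appended and merged, so for duplicate keys the incoming value wins. A minimal sketch of that observable behavior, assuming a plain list of `(key, value)` pairs stands in for the repeated entry messages:

```python
# Hypothetical sketch of map MergeFrom semantics: the source map's
# repeated (key, value) entry messages are replayed into the
# destination, so a later entry for the same key overwrites.
def merge_map_entries(dst, entries):
    for key, value in entries:
        dst[key] = value
    return dst
```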
+
 PyObject* MapReflectionFriend::Contains(PyObject* _self, PyObject* key) {
   MapContainer* self = GetMap(_self);
 
@@ -344,9 +374,10 @@
 }
 
 // Initializes the underlying Message object of "to" so it becomes a new parent
-// repeated scalar, and copies all the values from "from" to it. A child scalar
+// map container, and copies all the values from "from" to it. A child map
 // container can be released by passing it as both from and to (e.g. making it
 // the recipient of the new parent message and copying the values from itself).
+// In fact, this is the only supported use at the moment.
 static int InitializeAndCopyToParentContainer(MapContainer* from,
                                               MapContainer* to) {
   // For now we require from == to, re-evaluate if we want to support deep copy
@@ -358,7 +389,7 @@
     // A somewhat roundabout way of copying just one field from old_message to
     // new_message.  This is the best we can do with what Reflection gives us.
     Message* mutable_old = from->GetMutableMessage();
-    vector<const FieldDescriptor*> fields;
+    std::vector<const FieldDescriptor*> fields;
     fields.push_back(from->parent_field_descriptor);
 
     // Move the map field into the new message.
@@ -395,12 +426,7 @@
     return NULL;
   }
 
-#if PY_MAJOR_VERSION >= 3
-  ScopedPyObjectPtr obj(PyType_GenericAlloc(
-        reinterpret_cast<PyTypeObject *>(ScalarMapContainer_Type), 0));
-#else
-  ScopedPyObjectPtr obj(PyType_GenericAlloc(&ScalarMapContainer_Type, 0));
-#endif
+  ScopedPyObjectPtr obj(PyType_GenericAlloc(ScalarMapContainer_Type, 0));
   if (obj.get() == NULL) {
     return PyErr_Format(PyExc_RuntimeError,
                         "Could not allocate new container.");
@@ -522,6 +548,10 @@
     "Removes all elements from the map." },
   { "get", ScalarMapGet, METH_VARARGS,
     "Gets the value for the given key if present, or otherwise a default" },
+  { "GetEntryClass", (PyCFunction)GetEntryClass, METH_NOARGS,
+    "Return the class used to build Entries of (key, value) pairs." },
+  { "MergeFrom", (PyCFunction)MergeFrom, METH_O,
+    "Merges a map into the current map." },
   /*
   { "__deepcopy__", (PyCFunction)DeepCopy, METH_VARARGS,
     "Makes a deep copy of the class." },
@@ -531,6 +561,7 @@
   {NULL, NULL},
 };
 
+PyTypeObject *ScalarMapContainer_Type;
 #if PY_MAJOR_VERSION >= 3
   static PyType_Slot ScalarMapContainer_Type_slots[] = {
       {Py_tp_dealloc, (void *)ScalarMapDealloc},
@@ -549,7 +580,6 @@
       Py_TPFLAGS_DEFAULT,
       ScalarMapContainer_Type_slots
   };
-  PyObject *ScalarMapContainer_Type;
 #else
   static PyMappingMethods ScalarMapMappingMethods = {
     MapReflectionFriend::Length,             // mp_length
@@ -557,7 +587,7 @@
     MapReflectionFriend::ScalarMapSetItem,   // mp_ass_subscript
   };
 
-  PyTypeObject ScalarMapContainer_Type = {
+  PyTypeObject _ScalarMapContainer_Type = {
     PyVarObject_HEAD_INIT(&PyType_Type, 0)
     FULL_MODULE_NAME ".ScalarMapContainer",  //  tp_name
     sizeof(MapContainer),                //  tp_basicsize
@@ -610,8 +640,7 @@
   PyObject* ret = PyDict_GetItem(self->message_dict, key.get());
 
   if (ret == NULL) {
-    CMessage* cmsg = cmessage::NewEmptyMessage(self->subclass_init,
-                                               message->GetDescriptor());
+    CMessage* cmsg = cmessage::NewEmptyMessage(self->message_class);
     ret = reinterpret_cast<PyObject*>(cmsg);
 
     if (cmsg == NULL) {
@@ -634,17 +663,12 @@
 
 PyObject* NewMessageMapContainer(
     CMessage* parent, const google::protobuf::FieldDescriptor* parent_field_descriptor,
-    PyObject* concrete_class) {
+    CMessageClass* message_class) {
   if (!CheckFieldBelongsToMessage(parent_field_descriptor, parent->message)) {
     return NULL;
   }
 
-#if PY_MAJOR_VERSION >= 3
-  PyObject* obj = PyType_GenericAlloc(
-        reinterpret_cast<PyTypeObject *>(MessageMapContainer_Type), 0);
-#else
-  PyObject* obj = PyType_GenericAlloc(&MessageMapContainer_Type, 0);
-#endif
+  PyObject* obj = PyType_GenericAlloc(MessageMapContainer_Type, 0);
   if (obj == NULL) {
     return PyErr_Format(PyExc_RuntimeError,
                         "Could not allocate new container.");
@@ -669,8 +693,8 @@
                         "Could not allocate message dict.");
   }
 
-  Py_INCREF(concrete_class);
-  self->subclass_init = concrete_class;
+  Py_INCREF(message_class);
+  self->message_class = message_class;
 
   if (self->key_field_descriptor == NULL ||
       self->value_field_descriptor == NULL) {
@@ -705,8 +729,33 @@
   }
 
   // Delete key from map.
-  if (reflection->DeleteMapValue(message, self->parent_field_descriptor,
+  if (reflection->ContainsMapKey(*message, self->parent_field_descriptor,
                                  map_key)) {
+    // Delete key from CMessage dict.
+    MapValueRef value;
+    reflection->InsertOrLookupMapValue(message, self->parent_field_descriptor,
+                                       map_key, &value);
+    ScopedPyObjectPtr key(PyLong_FromVoidPtr(value.MutableMessageValue()));
+
+    PyObject* cmsg_value = PyDict_GetItem(self->message_dict, key.get());
+    if (cmsg_value) {
+      // Need to keep the CMessage alive if it is still referenced after
+      // deletion: make a new message and swap the values into the CMessage
+      // instead of just removing it.
+      CMessage* cmsg = reinterpret_cast<CMessage*>(cmsg_value);
+      Message* msg = cmsg->message;
+      cmsg->owner.reset(msg->New());
+      cmsg->message = cmsg->owner.get();
+      cmsg->parent = NULL;
+      msg->GetReflection()->Swap(msg, cmsg->message);
+      if (PyDict_DelItem(self->message_dict, key.get()) < 0) {
+        return -1;
+      }
+    }
+
+    // Delete key from map.
+    reflection->DeleteMapValue(message, self->parent_field_descriptor,
+                               map_key);
     return 0;
   } else {
     PyErr_Format(PyExc_KeyError, "Key not present in map");
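The hunk above implements a "detach before delete" pattern: before the value message is removed from the C++ map, any live Python wrapper is given its own copy of the data so outstanding references stay usable. A hypothetical Python analogue (names are illustrative, not protobuf API):

```python
# Hypothetical sketch of the "detach before delete" pattern in
# MessageMapSetItem: a wrapper still referenced by user code gets its
# own storage before its entry is deleted from the parent map.
class WrapperSketch:
    def __init__(self, data, parent):
        self.data = data      # shared with the parent while attached
        self.parent = parent

def delete_entry(entries, wrappers, key):
    wrapper = wrappers.pop(key, None)
    if wrapper is not None:
        wrapper.data = dict(wrapper.data)  # now owns its own storage
        wrapper.parent = None
    del entries[key]
```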
@@ -763,6 +812,7 @@
   MessageMapContainer* self = GetMessageMap(_self);
   self->owner.reset();
   Py_DECREF(self->message_dict);
+  Py_DECREF(self->message_class);
   Py_TYPE(_self)->tp_free(_self);
 }
 
@@ -775,6 +825,10 @@
     "Gets the value for the given key if present, or otherwise a default" },
   { "get_or_create", MapReflectionFriend::MessageMapGetItem, METH_O,
     "Alias for getitem, useful to make explicit that the map is mutated." },
+  { "GetEntryClass", (PyCFunction)GetEntryClass, METH_NOARGS,
+    "Return the class used to build Entries of (key, value) pairs." },
+  { "MergeFrom", (PyCFunction)MergeFrom, METH_O,
+    "Merges a map into the current map." },
   /*
   { "__deepcopy__", (PyCFunction)DeepCopy, METH_VARARGS,
     "Makes a deep copy of the class." },
@@ -784,6 +838,7 @@
   {NULL, NULL},
 };
 
+PyTypeObject *MessageMapContainer_Type;
 #if PY_MAJOR_VERSION >= 3
   static PyType_Slot MessageMapContainer_Type_slots[] = {
       {Py_tp_dealloc, (void *)MessageMapDealloc},
@@ -802,8 +857,6 @@
       Py_TPFLAGS_DEFAULT,
       MessageMapContainer_Type_slots
   };
-
-  PyObject *MessageMapContainer_Type;
 #else
   static PyMappingMethods MessageMapMappingMethods = {
     MapReflectionFriend::Length,              // mp_length
@@ -811,7 +864,7 @@
     MapReflectionFriend::MessageMapSetItem,   // mp_ass_subscript
   };
 
-  PyTypeObject MessageMapContainer_Type = {
+  PyTypeObject _MessageMapContainer_Type = {
     PyVarObject_HEAD_INIT(&PyType_Type, 0)
     FULL_MODULE_NAME ".MessageMapContainer",  //  tp_name
     sizeof(MessageMapContainer),         //  tp_basicsize
@@ -960,6 +1013,63 @@
   0,                                   //  tp_init
 };
 
+bool InitMapContainers() {
+  // ScalarMapContainer_Type derives from our MutableMapping type.
+  ScopedPyObjectPtr containers(PyImport_ImportModule(
+      "google.protobuf.internal.containers"));
+  if (containers == NULL) {
+    return false;
+  }
+
+  ScopedPyObjectPtr mutable_mapping(
+      PyObject_GetAttrString(containers.get(), "MutableMapping"));
+  if (mutable_mapping == NULL) {
+    return false;
+  }
+
+  if (!PyObject_TypeCheck(mutable_mapping.get(), &PyType_Type)) {
+    return false;
+  }
+
+  Py_INCREF(mutable_mapping.get());
+#if PY_MAJOR_VERSION >= 3
+  PyObject* bases = PyTuple_New(1);
+  PyTuple_SET_ITEM(bases, 0, mutable_mapping.get());
+
+  ScalarMapContainer_Type = reinterpret_cast<PyTypeObject*>(
+      PyType_FromSpecWithBases(&ScalarMapContainer_Type_spec, bases));
+#else
+  _ScalarMapContainer_Type.tp_base =
+      reinterpret_cast<PyTypeObject*>(mutable_mapping.get());
+
+  if (PyType_Ready(&_ScalarMapContainer_Type) < 0) {
+    return false;
+  }
+
+  ScalarMapContainer_Type = &_ScalarMapContainer_Type;
+#endif
+
+  if (PyType_Ready(&MapIterator_Type) < 0) {
+    return false;
+  }
+
+#if PY_MAJOR_VERSION >= 3
+  MessageMapContainer_Type = reinterpret_cast<PyTypeObject*>(
+      PyType_FromSpecWithBases(&MessageMapContainer_Type_spec, bases));
+#else
+  Py_INCREF(mutable_mapping.get());
+  _MessageMapContainer_Type.tp_base =
+      reinterpret_cast<PyTypeObject*>(mutable_mapping.get());
+
+  if (PyType_Ready(&_MessageMapContainer_Type) < 0) {
+    return false;
+  }
+
+  MessageMapContainer_Type = &_MessageMapContainer_Type;
+#endif
+  return true;
+}
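`InitMapContainers` makes the C map types derive from the `MutableMapping` class in `google.protobuf.internal.containers`, which is what gives the containers `get()`, `keys()`, `items()`, `update()` and friends for free once the primitive mapping slots exist. A minimal Python analogue of the same design, using the standard-library `collections.abc.MutableMapping` rather than the protobuf-internal one:

```python
from collections.abc import MutableMapping

class ScalarMapSketch(MutableMapping):
    """Implements only the five abstract methods; MutableMapping
    supplies get(), keys(), items(), update(), pop(), etc."""
    def __init__(self):
        self._data = {}
    def __getitem__(self, key):
        return self._data[key]
    def __setitem__(self, key, value):
        self._data[key] = value
    def __delitem__(self, key):
        del self._data[key]
    def __iter__(self):
        return iter(self._data)
    def __len__(self):
        return len(self._data)
```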
+
 }  // namespace python
 }  // namespace protobuf
 }  // namespace google
diff --git a/python/google/protobuf/pyext/map_container.h b/python/google/protobuf/pyext/map_container.h
index ddf94be..111fafb 100644
--- a/python/google/protobuf/pyext/map_container.h
+++ b/python/google/protobuf/pyext/map_container.h
@@ -34,27 +34,19 @@
 #include <Python.h>
 
 #include <memory>
-#ifndef _SHARED_PTR_H
-#include <google/protobuf/stubs/shared_ptr.h>
-#endif
 
 #include <google/protobuf/descriptor.h>
 #include <google/protobuf/message.h>
+#include <google/protobuf/pyext/message.h>
 
 namespace google {
 namespace protobuf {
 
 class Message;
 
-#ifdef _SHARED_PTR_H
-using std::shared_ptr;
-#else
-using internal::shared_ptr;
-#endif
-
 namespace python {
 
-struct CMessage;
+struct CMessageClass;
 
 // This struct is used directly for ScalarMap, and is the base class of
 // MessageMapContainer, which is used for MessageMap.
@@ -65,7 +57,7 @@
   // proto tree.  Every Python MapContainer holds a
   // reference to it in order to keep it alive as long as there's a
   // Python object that references any part of the tree.
-  shared_ptr<Message> owner;
+  CMessage::OwnerRef owner;
 
   // Pointer to the C++ Message that contains this container.  The
   // MapContainer does not own this pointer.
@@ -98,29 +90,21 @@
   int Release();
 
   // Set the owner field of self and any children of self.
-  void SetOwner(const shared_ptr<Message>& new_owner) {
-    owner = new_owner;
-  }
+  void SetOwner(const CMessage::OwnerRef& new_owner) { owner = new_owner; }
 };
 
 struct MessageMapContainer : public MapContainer {
-  // A callable that is used to create new child messages.
-  PyObject* subclass_init;
+  // The type used to create new child messages.
+  CMessageClass* message_class;
 
   // A dict mapping Message* -> CMessage.
   PyObject* message_dict;
 };
 
-#if PY_MAJOR_VERSION >= 3
-  extern PyObject *MessageMapContainer_Type;
-  extern PyType_Spec MessageMapContainer_Type_spec;
-  extern PyObject *ScalarMapContainer_Type;
-  extern PyType_Spec ScalarMapContainer_Type_spec;
-#else
-  extern PyTypeObject MessageMapContainer_Type;
-  extern PyTypeObject ScalarMapContainer_Type;
-#endif
+bool InitMapContainers();
 
+extern PyTypeObject* MessageMapContainer_Type;
+extern PyTypeObject* ScalarMapContainer_Type;
 extern PyTypeObject MapIterator_Type;  // Both map types use the same iterator.
 
 // Builds a MapContainer object, from a parent message and a
@@ -132,7 +116,7 @@
 // field descriptor.
 extern PyObject* NewMessageMapContainer(
     CMessage* parent, const FieldDescriptor* parent_field_descriptor,
-    PyObject* concrete_class);
+    CMessageClass* message_class);
 
 }  // namespace python
 }  // namespace protobuf
diff --git a/python/google/protobuf/pyext/message.cc b/python/google/protobuf/pyext/message.cc
index 863cde0..5893533 100644
--- a/python/google/protobuf/pyext/message.cc
+++ b/python/google/protobuf/pyext/message.cc
@@ -35,9 +35,6 @@
 
 #include <map>
 #include <memory>
-#ifndef _SHARED_PTR_H
-#include <google/protobuf/stubs/shared_ptr.h>
-#endif
 #include <string>
 #include <vector>
 #include <structmember.h>  // A Python header file.
@@ -52,6 +49,7 @@
 #include <google/protobuf/stubs/common.h>
 #include <google/protobuf/stubs/logging.h>
 #include <google/protobuf/io/coded_stream.h>
+#include <google/protobuf/io/zero_copy_stream_impl_lite.h>
 #include <google/protobuf/util/message_differencer.h>
 #include <google/protobuf/descriptor.h>
 #include <google/protobuf/message.h>
@@ -63,11 +61,11 @@
 #include <google/protobuf/pyext/repeated_composite_container.h>
 #include <google/protobuf/pyext/repeated_scalar_container.h>
 #include <google/protobuf/pyext/map_container.h>
+#include <google/protobuf/pyext/message_factory.h>
+#include <google/protobuf/pyext/safe_numerics.h>
 #include <google/protobuf/pyext/scoped_pyobject_ptr.h>
-#include <google/protobuf/stubs/strutil.h>
 
 #if PY_MAJOR_VERSION >= 3
-  #define PyInt_Check PyLong_Check
   #define PyInt_AsLong PyLong_AsLong
   #define PyInt_FromLong PyLong_FromLong
   #define PyInt_FromSize_t PyLong_FromSize_t
@@ -91,42 +89,26 @@
 namespace python {
 
 static PyObject* kDESCRIPTOR;
-static PyObject* k_extensions_by_name;
-static PyObject* k_extensions_by_number;
 PyObject* EnumTypeWrapper_class;
 static PyObject* PythonMessage_class;
 static PyObject* kEmptyWeakref;
 static PyObject* WKT_classes = NULL;
 
-// Defines the Metaclass of all Message classes.
-// It allows us to cache some C++ pointers in the class object itself, they are
-// faster to extract than from the type's dictionary.
-
-struct PyMessageMeta {
-  // This is how CPython subclasses C structures: the base structure must be
-  // the first member of the object.
-  PyHeapTypeObject super;
-
-  // C++ descriptor of this message.
-  const Descriptor* message_descriptor;
-
-  // Owned reference, used to keep the pointer above alive.
-  PyObject* py_message_descriptor;
-
-  // The Python DescriptorPool used to create the class. It is needed to resolve
-  // fields descriptors, including extensions fields; its C++ MessageFactory is
-  // used to instantiate submessages.
-  // This can be different from DESCRIPTOR.file.pool, in the case of a custom
-  // DescriptorPool which defines new extensions.
-  // We own the reference, because it's important to keep the descriptors and
-  // factory alive.
-  PyDescriptorPool* py_descriptor_pool;
-};
-
 namespace message_meta {
 
 static int InsertEmptyWeakref(PyTypeObject* base);
 
+namespace {
+// Copied over from the internal 'google/protobuf/stubs/strutil.h'.
+inline void UpperString(string* s) {
+  string::iterator end = s->end();
+  for (string::iterator i = s->begin(); i != end; ++i) {
+    // toupper() changes based on locale.  We don't want this!
+    if ('a' <= *i && *i <= 'z') *i += 'A' - 'a';
+  }
+}
+}
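The `UpperString` helper above deliberately avoids `toupper()` because its result depends on the active locale; only ASCII `a`-`z` is shifted. The same locale-independent behavior can be sketched in Python (note that `str.upper()` would NOT match, since it uppercases non-ASCII letters too):

```python
def ascii_upper(s):
    # toupper() changes with locale; restrict uppercasing to ASCII a-z,
    # mirroring the UpperString helper: shift by 'a' - 'A', leave
    # everything else (including non-ASCII letters) untouched.
    return "".join(
        chr(ord(c) - (ord("a") - ord("A"))) if "a" <= c <= "z" else c
        for c in s
    )
```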
+
 // Add the number of a field descriptor to the containing message class.
 // Equivalent to:
 //   _cls.<field>_FIELD_NUMBER = <number>
@@ -152,19 +134,6 @@
 
 // Finalize the creation of the Message class.
 static int AddDescriptors(PyObject* cls, const Descriptor* descriptor) {
-  // If there are extension_ranges, the message is "extendable", and extension
-  // classes will register themselves in this class.
-  if (descriptor->extension_range_count() > 0) {
-    ScopedPyObjectPtr by_name(PyDict_New());
-    if (PyObject_SetAttr(cls, k_extensions_by_name, by_name.get()) < 0) {
-      return -1;
-    }
-    ScopedPyObjectPtr by_number(PyDict_New());
-    if (PyObject_SetAttr(cls, k_extensions_by_number, by_number.get()) < 0) {
-      return -1;
-    }
-  }
-
   // For each field set: cls.<field>_FIELD_NUMBER = <number>
   for (int i = 0; i < descriptor->field_count(); ++i) {
     if (!AddFieldNumberToClass(cls, descriptor->field(i))) {
@@ -173,10 +142,6 @@
   }
 
   // For each enum set cls.<enum name> = EnumTypeWrapper(<enum descriptor>).
-  //
-  // The enum descriptor we get from
-  // <messagedescriptor>.enum_types_by_name[name]
-  // which was built previously.
   for (int i = 0; i < descriptor->enum_type_count(); ++i) {
     const EnumDescriptor* enum_descriptor = descriptor->enum_type(i);
     ScopedPyObjectPtr enum_type(
@@ -273,6 +238,12 @@
     return NULL;
   }
 
+  // Messages have no __dict__
+  ScopedPyObjectPtr slots(PyTuple_New(0));
+  if (PyDict_SetItemString(dict, "__slots__", slots.get()) < 0) {
+    return NULL;
+  }
+
   // Build the arguments to the base metaclass.
   // We change the __bases__ classes.
   ScopedPyObjectPtr new_args;
@@ -309,7 +280,7 @@
   if (result == NULL) {
     return NULL;
   }
-  PyMessageMeta* newtype = reinterpret_cast<PyMessageMeta*>(result.get());
+  CMessageClass* newtype = reinterpret_cast<CMessageClass*>(result.get());
 
   // Insert the empty weakref into the base classes.
   if (InsertEmptyWeakref(
@@ -329,16 +300,19 @@
   newtype->message_descriptor = descriptor;
   // TODO(amauryfa): Don't always use the canonical pool of the descriptor,
   // use the MessageFactory optionally passed in the class dict.
-  newtype->py_descriptor_pool = GetDescriptorPool_FromPool(
-      descriptor->file()->pool());
-  if (newtype->py_descriptor_pool == NULL) {
+  PyDescriptorPool* py_descriptor_pool =
+      GetDescriptorPool_FromPool(descriptor->file()->pool());
+  if (py_descriptor_pool == NULL) {
     return NULL;
   }
-  Py_INCREF(newtype->py_descriptor_pool);
+  newtype->py_message_factory = py_descriptor_pool->py_message_factory;
+  Py_INCREF(newtype->py_message_factory);
 
-  // Add the message to the DescriptorPool.
-  if (cdescriptor_pool::RegisterMessageClass(newtype->py_descriptor_pool,
-                                             descriptor, result.get()) < 0) {
+  // Register the message in the MessageFactory.
+  // TODO(amauryfa): Move this call to MessageFactory.GetPrototype() when the
+  // MessageFactory is fully implemented in C++.
+  if (message_factory::RegisterMessageClass(newtype->py_message_factory,
+                                            descriptor, newtype) < 0) {
     return NULL;
   }
 
@@ -349,9 +323,9 @@
   return result.release();
 }
 
-static void Dealloc(PyMessageMeta *self) {
-  Py_DECREF(self->py_message_descriptor);
-  Py_DECREF(self->py_descriptor_pool);
+static void Dealloc(CMessageClass *self) {
+  Py_XDECREF(self->py_message_descriptor);
+  Py_XDECREF(self->py_message_factory);
   Py_TYPE(self)->tp_free(reinterpret_cast<PyObject*>(self));
 }
 
@@ -376,12 +350,67 @@
 #endif  // PY_MAJOR_VERSION >= 3
 }
 
+// The _extensions_by_name dictionary is built on every access.
+// TODO(amauryfa): Migrate all users to pool.FindAllExtensions()
+static PyObject* GetExtensionsByName(CMessageClass *self, void *closure) {
+  const PyDescriptorPool* pool = self->py_message_factory->pool;
+
+  std::vector<const FieldDescriptor*> extensions;
+  pool->pool->FindAllExtensions(self->message_descriptor, &extensions);
+
+  ScopedPyObjectPtr result(PyDict_New());
+  for (int i = 0; i < extensions.size(); i++) {
+    ScopedPyObjectPtr extension(
+        PyFieldDescriptor_FromDescriptor(extensions[i]));
+    if (extension == NULL) {
+      return NULL;
+    }
+    if (PyDict_SetItemString(result.get(), extensions[i]->full_name().c_str(),
+                             extension.get()) < 0) {
+      return NULL;
+    }
+  }
+  return result.release();
+}
+
+// The _extensions_by_number dictionary is built on every access.
+// TODO(amauryfa): Migrate all users to pool.FindExtensionByNumber()
+static PyObject* GetExtensionsByNumber(CMessageClass *self, void *closure) {
+  const PyDescriptorPool* pool = self->py_message_factory->pool;
+
+  std::vector<const FieldDescriptor*> extensions;
+  pool->pool->FindAllExtensions(self->message_descriptor, &extensions);
+
+  ScopedPyObjectPtr result(PyDict_New());
+  for (int i = 0; i < extensions.size(); i++) {
+    ScopedPyObjectPtr extension(
+        PyFieldDescriptor_FromDescriptor(extensions[i]));
+    if (extension == NULL) {
+      return NULL;
+    }
+    ScopedPyObjectPtr number(PyInt_FromLong(extensions[i]->number()));
+    if (number == NULL) {
+      return NULL;
+    }
+    if (PyDict_SetItem(result.get(), number.get(), extension.get()) < 0) {
+      return NULL;
+    }
+  }
+  return result.release();
+}
+
+static PyGetSetDef Getters[] = {
+  {"_extensions_by_name", (getter)GetExtensionsByName, NULL},
+  {"_extensions_by_number", (getter)GetExtensionsByNumber, NULL},
+  {NULL}
+};
+
 }  // namespace message_meta
 
-PyTypeObject PyMessageMeta_Type = {
+PyTypeObject CMessageClass_Type = {
   PyVarObject_HEAD_INIT(&PyType_Type, 0)
   FULL_MODULE_NAME ".MessageMeta",     // tp_name
-  sizeof(PyMessageMeta),               // tp_basicsize
+  sizeof(CMessageClass),               // tp_basicsize
   0,                                   // tp_itemsize
   (destructor)message_meta::Dealloc,   // tp_dealloc
   0,                                   // tp_print
@@ -408,7 +437,7 @@
   0,                                   // tp_iternext
   0,                                   // tp_methods
   0,                                   // tp_members
-  0,                                   // tp_getset
+  message_meta::Getters,               // tp_getset
   0,                                   // tp_base
   0,                                   // tp_dict
   0,                                   // tp_descr_get
@@ -419,16 +448,16 @@
   message_meta::New,                   // tp_new
 };
 
-static PyMessageMeta* CheckMessageClass(PyTypeObject* cls) {
-  if (!PyObject_TypeCheck(cls, &PyMessageMeta_Type)) {
+static CMessageClass* CheckMessageClass(PyTypeObject* cls) {
+  if (!PyObject_TypeCheck(cls, &CMessageClass_Type)) {
     PyErr_Format(PyExc_TypeError, "Class %s is not a Message", cls->tp_name);
     return NULL;
   }
-  return reinterpret_cast<PyMessageMeta*>(cls);
+  return reinterpret_cast<CMessageClass*>(cls);
 }
 
 static const Descriptor* GetMessageDescriptor(PyTypeObject* cls) {
-  PyMessageMeta* type = CheckMessageClass(cls);
+  CMessageClass* type = CheckMessageClass(cls);
   if (type == NULL) {
     return NULL;
   }
@@ -544,23 +573,10 @@
 
 // ---------------------------------------------------------------------
 
-// Constants used for integer type range checking.
-PyObject* kPythonZero;
-PyObject* kint32min_py;
-PyObject* kint32max_py;
-PyObject* kuint32max_py;
-PyObject* kint64min_py;
-PyObject* kint64max_py;
-PyObject* kuint64max_py;
-
 PyObject* EncodeError_class;
 PyObject* DecodeError_class;
 PyObject* PickleError_class;
 
-// Constant PyString values used for GetAttr/GetItem.
-static PyObject* k_cdescriptor;
-static PyObject* kfull_name;
-
 /* Is 64bit */
 void FormatTypeError(PyObject* arg, char* expected_types) {
   PyObject* repr = PyObject_Repr(arg);
@@ -574,68 +590,126 @@
   }
 }
 
+void OutOfRangeError(PyObject* arg) {
+  PyObject *s = PyObject_Str(arg);
+  if (s) {
+    PyErr_Format(PyExc_ValueError,
+                 "Value out of range: %s",
+                 PyString_AsString(s));
+    Py_DECREF(s);
+  }
+}
+
+template<class RangeType, class ValueType>
+bool VerifyIntegerCastAndRange(PyObject* arg, ValueType value) {
+  if (GOOGLE_PREDICT_FALSE(value == -1 && PyErr_Occurred())) {
+    if (PyErr_ExceptionMatches(PyExc_OverflowError)) {
+      // Replace it with the same ValueError as pure python protos instead of
+      // the default one.
+      PyErr_Clear();
+      OutOfRangeError(arg);
+    }  // Otherwise propagate existing error.
+    return false;
+  }
+  if (GOOGLE_PREDICT_FALSE(!IsValidNumericCast<RangeType>(value))) {
+    OutOfRangeError(arg);
+    return false;
+  }
+  return true;
+}
+
 template<class T>
-bool CheckAndGetInteger(
-    PyObject* arg, T* value, PyObject* min, PyObject* max) {
-  bool is_long = PyLong_Check(arg);
+bool CheckAndGetInteger(PyObject* arg, T* value) {
+  // The fast path.
 #if PY_MAJOR_VERSION < 3
-  if (!PyInt_Check(arg) && !is_long) {
-    FormatTypeError(arg, "int, long");
-    return false;
-  }
-  if (PyObject_Compare(min, arg) > 0 || PyObject_Compare(max, arg) < 0) {
-#else
-  if (!is_long) {
-    FormatTypeError(arg, "int");
-    return false;
-  }
-  if (PyObject_RichCompareBool(min, arg, Py_LE) != 1 ||
-      PyObject_RichCompareBool(max, arg, Py_GE) != 1) {
-#endif
-    if (!PyErr_Occurred()) {
-      PyObject *s = PyObject_Str(arg);
-      if (s) {
-        PyErr_Format(PyExc_ValueError,
-                     "Value out of range: %s",
-                     PyString_AsString(s));
-        Py_DECREF(s);
-      }
-    }
-    return false;
-  }
-#if PY_MAJOR_VERSION < 3
-  if (!is_long) {
-    *value = static_cast<T>(PyInt_AsLong(arg));
-  } else  // NOLINT
-#endif
-  {
-    if (min == kPythonZero) {
-      *value = static_cast<T>(PyLong_AsUnsignedLongLong(arg));
+  // For the typical case, offer a fast path.
+  if (GOOGLE_PREDICT_TRUE(PyInt_Check(arg))) {
+    long int_result = PyInt_AsLong(arg);
+    if (GOOGLE_PREDICT_TRUE(IsValidNumericCast<T>(int_result))) {
+      *value = static_cast<T>(int_result);
+      return true;
     } else {
-      *value = static_cast<T>(PyLong_AsLongLong(arg));
+      OutOfRangeError(arg);
+      return false;
+    }
+  }
+#endif
+  // This effectively defines an integer as "an object that can be cast as
+  // an integer and can be used as an ordinal number".
+  // This definition includes everything that implements numbers.Integral
+  // and shouldn't cast the net too wide.
+  if (GOOGLE_PREDICT_FALSE(!PyIndex_Check(arg))) {
+    FormatTypeError(arg, "int, long");
+    return false;
+  }
+
+  // Now we have an integral number so we can safely use PyLong_ functions.
+  // We need to treat the signed and unsigned cases differently in case arg is
+  // holding a value above the maximum for signed longs.
+  if (std::numeric_limits<T>::min() == 0) {
+    // Unsigned case.
+    unsigned PY_LONG_LONG ulong_result;
+    if (PyLong_Check(arg)) {
+      ulong_result = PyLong_AsUnsignedLongLong(arg);
+    } else {
+      // Unlike PyLong_AsLongLong, PyLong_AsUnsignedLongLong is very
+      // picky about the exact type.
+      PyObject* casted = PyNumber_Long(arg);
+      if (GOOGLE_PREDICT_FALSE(casted == nullptr)) {
+        // Propagate existing error.
+        return false;
+      }
+      ulong_result = PyLong_AsUnsignedLongLong(casted);
+      Py_DECREF(casted);
+    }
+    if (VerifyIntegerCastAndRange<T, unsigned PY_LONG_LONG>(arg,
+                                                            ulong_result)) {
+      *value = static_cast<T>(ulong_result);
+    } else {
+      return false;
+    }
+  } else {
+    // Signed case.
+    PY_LONG_LONG long_result;
+    PyNumberMethods *nb;
+    if ((nb = arg->ob_type->tp_as_number) != NULL && nb->nb_int != NULL) {
+      // PyLong_AsLongLong requires it to be a long or to have an __int__()
+      // method.
+      long_result = PyLong_AsLongLong(arg);
+    } else {
+      // Valid subclasses of numbers.Integral should have a __long__() method
+      // so fall back to that.
+      PyObject* casted = PyNumber_Long(arg);
+      if (GOOGLE_PREDICT_FALSE(casted == nullptr)) {
+        // Propagate existing error.
+        return false;
+      }
+      long_result = PyLong_AsLongLong(casted);
+      Py_DECREF(casted);
+    }
+    if (VerifyIntegerCastAndRange<T, PY_LONG_LONG>(arg, long_result)) {
+      *value = static_cast<T>(long_result);
+    } else {
+      return false;
     }
   }
+
   return true;
 }
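The rewritten `CheckAndGetInteger` above boils down to: take the `PyInt` fast path, otherwise accept any object passing `PyIndex_Check` (i.e. anything implementing `numbers.Integral`) and range-check the result. A minimal pure-Python sketch of that acceptance logic (the helper name and bounds arguments are illustrative, not part of the C++ code):

```python
import operator

def check_and_get_integer(arg, lo, hi):
    # PyIndex_Check + conversion: accepts ints, bools, and anything
    # implementing __index__; rejects floats and other non-integrals.
    try:
        value = operator.index(arg)
    except TypeError:
        raise TypeError("expected int, long")
    # Rough equivalent of VerifyIntegerCastAndRange for the target type T.
    if not lo <= value <= hi:
        raise ValueError("Value out of range: %r" % (arg,))
    return value
```

Note that `bool` sails through (`True` becomes `1`), matching the comment's goal of "an object that can be used as an ordinal number" without casting the net too wide.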
 
 // These are referenced by repeated_scalar_container, and must
 // be explicitly instantiated.
-template bool CheckAndGetInteger<int32>(
-    PyObject*, int32*, PyObject*, PyObject*);
-template bool CheckAndGetInteger<int64>(
-    PyObject*, int64*, PyObject*, PyObject*);
-template bool CheckAndGetInteger<uint32>(
-    PyObject*, uint32*, PyObject*, PyObject*);
-template bool CheckAndGetInteger<uint64>(
-    PyObject*, uint64*, PyObject*, PyObject*);
+template bool CheckAndGetInteger<int32>(PyObject*, int32*);
+template bool CheckAndGetInteger<int64>(PyObject*, int64*);
+template bool CheckAndGetInteger<uint32>(PyObject*, uint32*);
+template bool CheckAndGetInteger<uint64>(PyObject*, uint64*);
 
 bool CheckAndGetDouble(PyObject* arg, double* value) {
-  if (!PyInt_Check(arg) && !PyLong_Check(arg) &&
-      !PyFloat_Check(arg)) {
+  *value = PyFloat_AsDouble(arg);
+  if (GOOGLE_PREDICT_FALSE(*value == -1 && PyErr_Occurred())) {
     FormatTypeError(arg, "int, long, float");
     return false;
-  }
-  *value = PyFloat_AsDouble(arg);
+  }
   return true;
 }
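The `CheckAndGetDouble` change swaps an up-front type check for "convert first, inspect the error after" -- the `PyFloat_AsDouble` convention of returning `-1` with the error indicator set. A rough Python-level analog (approximate: Python's `float()` also accepts strings, which the C path does not):

```python
def check_and_get_double(arg):
    # Attempt the conversion and translate failure into the same
    # TypeError the C++ code raises via FormatTypeError.
    try:
        return float(arg)
    except TypeError:
        raise TypeError("expected int, long, float")
```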
 
@@ -649,11 +723,13 @@
 }
 
 bool CheckAndGetBool(PyObject* arg, bool* value) {
-  if (!PyInt_Check(arg) && !PyBool_Check(arg) && !PyLong_Check(arg)) {
+  long long_value = PyInt_AsLong(arg);
+  if (long_value == -1 && PyErr_Occurred()) {
     FormatTypeError(arg, "int, long, bool");
     return false;
   }
-  *value = static_cast<bool>(PyInt_AsLong(arg));
+  *value = static_cast<bool>(long_value);
+
   return true;
 }
 
@@ -711,7 +787,7 @@
       encoded_string = arg;  // Already encoded.
       Py_INCREF(encoded_string);
     } else {
-      encoded_string = PyUnicode_AsEncodedObject(arg, "utf-8", NULL);
+      encoded_string = PyUnicode_AsEncodedString(arg, "utf-8", NULL);
     }
   } else {
     // In this case field type is "bytes".
@@ -751,7 +827,8 @@
   return true;
 }
 
-PyObject* ToStringObject(const FieldDescriptor* descriptor, string value) {
+PyObject* ToStringObject(const FieldDescriptor* descriptor,
+                         const string& value) {
   if (descriptor->type() != FieldDescriptor::TYPE_STRING) {
     return PyBytes_FromStringAndSize(value.c_str(), value.length());
   }
@@ -781,15 +858,9 @@
 
 namespace cmessage {
 
-PyDescriptorPool* GetDescriptorPoolForMessage(CMessage* message) {
-  // No need to check the type: the type of instances of CMessage is always
-  // an instance of PyMessageMeta. Let's prove it with a debug-only check.
+PyMessageFactory* GetFactoryForMessage(CMessage* message) {
   GOOGLE_DCHECK(PyObject_TypeCheck(message, &CMessage_Type));
-  return reinterpret_cast<PyMessageMeta*>(Py_TYPE(message))->py_descriptor_pool;
-}
-
-MessageFactory* GetFactoryForMessage(CMessage* message) {
-  return GetDescriptorPoolForMessage(message)->message_factory;
+  return reinterpret_cast<CMessageClass*>(Py_TYPE(message))->py_message_factory;
 }
 
 static int MaybeReleaseOverlappingOneofField(
@@ -842,7 +913,8 @@
     return NULL;
   }
   return reflection->MutableMessage(
-      parent_message, parent_field, GetFactoryForMessage(parent));
+      parent_message, parent_field,
+      GetFactoryForMessage(parent)->message_factory);
 }
 
 struct FixupMessageReference : public ChildVisitor {
@@ -990,8 +1062,31 @@
   int min, max;
   length = reflection->FieldSize(*message, field_descriptor);
 
-  if (PyInt_Check(slice) || PyLong_Check(slice)) {
+  if (PySlice_Check(slice)) {
+    from = to = step = slice_length = 0;
+#if PY_MAJOR_VERSION < 3
+    PySlice_GetIndicesEx(
+        reinterpret_cast<PySliceObject*>(slice),
+        length, &from, &to, &step, &slice_length);
+#else
+    PySlice_GetIndicesEx(
+        slice,
+        length, &from, &to, &step, &slice_length);
+#endif
+    if (from < to) {
+      min = from;
+      max = to - 1;
+    } else {
+      min = to + 1;
+      max = from;
+    }
+  } else {
     from = to = PyLong_AsLong(slice);
+    if (from == -1 && PyErr_Occurred()) {
+      PyErr_SetString(PyExc_TypeError, "list indices must be integers");
+      return -1;
+    }
+
     if (from < 0) {
       from = to = length + from;
     }
@@ -1003,25 +1098,6 @@
       PyErr_Format(PyExc_IndexError, "list assignment index out of range");
       return -1;
     }
-  } else if (PySlice_Check(slice)) {
-    from = to = step = slice_length = 0;
-    PySlice_GetIndicesEx(
-#if PY_MAJOR_VERSION < 3
-        reinterpret_cast<PySliceObject*>(slice),
-#else
-        slice,
-#endif
-        length, &from, &to, &step, &slice_length);
-    if (from < to) {
-      min = from;
-      max = to - 1;
-    } else {
-      min = to + 1;
-      max = from;
-    }
-  } else {
-    PyErr_SetString(PyExc_TypeError, "list indices must be integers");
-    return -1;
   }
 
   Py_ssize_t i = from;
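The slice branch moved above computes the same normalization as Python's own `slice.indices()`; its min/max derivation can be sketched as (helper name is illustrative):

```python
def slice_min_max(sl, length):
    # What the PySlice_GetIndicesEx call yields, via the equivalent
    # Python-level API, followed by the same min/max derivation as the
    # C++ branch: a negative step flips which endpoint is the minimum.
    start, stop, _step = sl.indices(length)
    if start < stop:
        return start, stop - 1
    return stop + 1, start
```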
@@ -1070,7 +1146,12 @@
 }
 
 // Initializes fields of a message. Used in constructors.
-int InitAttributes(CMessage* self, PyObject* kwargs) {
+int InitAttributes(CMessage* self, PyObject* args, PyObject* kwargs) {
+  if (args != NULL && PyTuple_Size(args) != 0) {
+    PyErr_SetString(PyExc_TypeError, "No positional arguments allowed");
+    return -1;
+  }
+
   if (kwargs == NULL) {
     return 0;
   }
@@ -1090,8 +1171,12 @@
                    PyString_AsString(name));
       return -1;
     }
+    if (value == Py_None) {
+      // field=None is the same as no field at all.
+      continue;
+    }
     if (descriptor->is_map()) {
-      ScopedPyObjectPtr map(GetAttr(self, name));
+      ScopedPyObjectPtr map(GetAttr(reinterpret_cast<PyObject*>(self), name));
       const FieldDescriptor* value_descriptor =
           descriptor->message_type()->FindFieldByName("value");
       if (value_descriptor->cpp_type() == FieldDescriptor::CPPTYPE_MESSAGE) {
@@ -1119,7 +1204,8 @@
         }
       }
     } else if (descriptor->label() == FieldDescriptor::LABEL_REPEATED) {
-      ScopedPyObjectPtr container(GetAttr(self, name));
+      ScopedPyObjectPtr container(
+          GetAttr(reinterpret_cast<PyObject*>(self), name));
       if (container == NULL) {
         return -1;
       }
@@ -1186,13 +1272,16 @@
         }
       }
     } else if (descriptor->cpp_type() == FieldDescriptor::CPPTYPE_MESSAGE) {
-      ScopedPyObjectPtr message(GetAttr(self, name));
+      ScopedPyObjectPtr message(
+          GetAttr(reinterpret_cast<PyObject*>(self), name));
       if (message == NULL) {
         return -1;
       }
       CMessage* cmessage = reinterpret_cast<CMessage*>(message.get());
       if (PyDict_Check(value)) {
-        if (InitAttributes(cmessage, value) < 0) {
+        // Make the message exist even if the dict is empty.
+        AssureWritable(cmessage);
+        if (InitAttributes(cmessage, NULL, value) < 0) {
           return -1;
         }
       } else {
@@ -1209,8 +1298,8 @@
           return -1;
         }
       }
-      if (SetAttr(self, name, (new_val.get() == NULL) ? value : new_val.get()) <
-          0) {
+      if (SetAttr(reinterpret_cast<PyObject*>(self), name,
+                  (new_val.get() == NULL) ? value : new_val.get()) < 0) {
         return -1;
       }
     }
@@ -1220,13 +1309,15 @@
 
 // Allocates an incomplete Python Message: the caller must fill self->message,
 // self->owner and eventually self->parent.
-CMessage* NewEmptyMessage(PyObject* type, const Descriptor *descriptor) {
+CMessage* NewEmptyMessage(CMessageClass* type) {
   CMessage* self = reinterpret_cast<CMessage*>(
-      PyType_GenericAlloc(reinterpret_cast<PyTypeObject*>(type), 0));
+      PyType_GenericAlloc(&type->super.ht_type, 0));
   if (self == NULL) {
     return NULL;
   }
 
+  // Use "placement new" syntax to initialize the C++ object.
+  new (&self->owner) CMessage::OwnerRef(NULL);
   self->message = NULL;
   self->parent = NULL;
   self->parent_field_descriptor = NULL;
@@ -1242,7 +1333,7 @@
 // Creates a new C++ message and takes ownership.
 static PyObject* New(PyTypeObject* cls,
                      PyObject* unused_args, PyObject* unused_kwargs) {
-  PyMessageMeta* type = CheckMessageClass(cls);
+  CMessageClass* type = CheckMessageClass(cls);
   if (type == NULL) {
     return NULL;
   }
@@ -1251,15 +1342,14 @@
   if (message_descriptor == NULL) {
     return NULL;
   }
-  const Message* default_message = type->py_descriptor_pool->message_factory
+  const Message* default_message = type->py_message_factory->message_factory
                                    ->GetPrototype(message_descriptor);
   if (default_message == NULL) {
     PyErr_SetString(PyExc_TypeError, message_descriptor->full_name().c_str());
     return NULL;
   }
 
-  CMessage* self = NewEmptyMessage(reinterpret_cast<PyObject*>(type),
-                                   message_descriptor);
+  CMessage* self = NewEmptyMessage(type);
   if (self == NULL) {
     return NULL;
   }
@@ -1271,12 +1361,7 @@
 // The __init__ method of Message classes.
 // It initializes fields from keywords passed to the constructor.
 static int Init(CMessage* self, PyObject* args, PyObject* kwargs) {
-  if (PyTuple_Size(args) != 0) {
-    PyErr_SetString(PyExc_TypeError, "No positional arguments allowed");
-    return -1;
-  }
-
-  return InitAttributes(self, kwargs);
+  return InitAttributes(self, args, kwargs);
 }
 
 // ---------------------------------------------------------------------
@@ -1318,6 +1403,9 @@
 };
 
 static void Dealloc(CMessage* self) {
+  if (self->weakreflist) {
+    PyObject_ClearWeakRefs(reinterpret_cast<PyObject*>(self));
+  }
   // Null out all weak references from children to this message.
   GOOGLE_CHECK_EQ(0, ForEachCompositeField(self, ClearWeakReferences()));
   if (self->extensions) {
@@ -1326,7 +1414,7 @@
 
   Py_CLEAR(self->extensions);
   Py_CLEAR(self->composite_fields);
-  self->owner.reset();
+  self->owner.~ThreadUnsafeSharedPtr<Message>();
   Py_TYPE(self)->tp_free(reinterpret_cast<PyObject*>(self));
 }
 
@@ -1467,36 +1555,25 @@
   if (message->GetReflection()->HasField(*message, field_descriptor)) {
     Py_RETURN_TRUE;
   }
-  if (!message->GetReflection()->SupportsUnknownEnumValues() &&
-      field_descriptor->cpp_type() == FieldDescriptor::CPPTYPE_ENUM) {
-    // Special case: Python HasField() differs in semantics from C++
-    // slightly: we return HasField('enum_field') == true if there is
-    // an unknown enum value present. To implement this we have to
-    // look in the UnknownFieldSet.
-    const UnknownFieldSet& unknown_field_set =
-        message->GetReflection()->GetUnknownFields(*message);
-    for (int i = 0; i < unknown_field_set.field_count(); ++i) {
-      if (unknown_field_set.field(i).number() == field_descriptor->number()) {
-        Py_RETURN_TRUE;
-      }
-    }
-  }
+
   Py_RETURN_FALSE;
 }
 
 PyObject* ClearExtension(CMessage* self, PyObject* extension) {
+  const FieldDescriptor* descriptor = GetExtensionDescriptor(extension);
+  if (descriptor == NULL) {
+    return NULL;
+  }
   if (self->extensions != NULL) {
-    return extension_dict::ClearExtension(self->extensions, extension);
-  } else {
-    const FieldDescriptor* descriptor = GetExtensionDescriptor(extension);
-    if (descriptor == NULL) {
-      return NULL;
-    }
-    if (ScopedPyObjectPtr(ClearFieldByDescriptor(self, descriptor)) == NULL) {
-      return NULL;
+    PyObject* value = PyDict_GetItem(self->extensions->values, extension);
+    if (value != NULL) {
+      if (InternalReleaseFieldByDescriptor(self, descriptor, value) < 0) {
+        return NULL;
+      }
+      PyDict_DelItem(self->extensions->values, extension);
     }
   }
-  Py_RETURN_NONE;
+  return ClearFieldByDescriptor(self, descriptor);
 }
 
 PyObject* HasExtension(CMessage* self, PyObject* extension) {
@@ -1539,9 +1616,10 @@
 // * Clear the weak references from the released container to the
 //   parent.
 
-struct SetOwnerVisitor : public ChildVisitor {
+class SetOwnerVisitor : public ChildVisitor {
+ public:
   // new_owner must outlive this object.
-  explicit SetOwnerVisitor(const shared_ptr<Message>& new_owner)
+  explicit SetOwnerVisitor(const CMessage::OwnerRef& new_owner)
       : new_owner_(new_owner) {}
 
   int VisitRepeatedCompositeContainer(RepeatedCompositeContainer* container) {
@@ -1565,11 +1643,11 @@
   }
 
  private:
-  const shared_ptr<Message>& new_owner_;
+  const CMessage::OwnerRef& new_owner_;
 };
 
 // Change the owner of this CMessage and all its children, recursively.
-int SetOwner(CMessage* self, const shared_ptr<Message>& new_owner) {
+int SetOwner(CMessage* self, const CMessage::OwnerRef& new_owner) {
   self->owner = new_owner;
   if (ForEachCompositeField(self, SetOwnerVisitor(new_owner)) == -1)
     return -1;
@@ -1582,7 +1660,7 @@
 Message* ReleaseMessage(CMessage* self,
                         const Descriptor* descriptor,
                         const FieldDescriptor* field_descriptor) {
-  MessageFactory* message_factory = GetFactoryForMessage(self);
+  MessageFactory* message_factory = GetFactoryForMessage(self)->message_factory;
   Message* released_message = self->message->GetReflection()->ReleaseMessage(
       self->message, field_descriptor, message_factory);
   // ReleaseMessage will return NULL which differs from
@@ -1602,7 +1680,7 @@
                       const FieldDescriptor* field_descriptor,
                       CMessage* child_cmessage) {
   // Release the Message
-  shared_ptr<Message> released_message(ReleaseMessage(
+  CMessage::OwnerRef released_message(ReleaseMessage(
       self, child_cmessage->message->GetDescriptor(), field_descriptor));
   child_cmessage->message = released_message.get();
   child_cmessage->owner.swap(released_message);
@@ -1619,23 +1697,20 @@
       parent_(parent) {}
 
   int VisitRepeatedCompositeContainer(RepeatedCompositeContainer* container) {
-    return repeated_composite_container::Release(
-        reinterpret_cast<RepeatedCompositeContainer*>(container));
+    return repeated_composite_container::Release(container);
   }
 
   int VisitRepeatedScalarContainer(RepeatedScalarContainer* container) {
-    return repeated_scalar_container::Release(
-        reinterpret_cast<RepeatedScalarContainer*>(container));
+    return repeated_scalar_container::Release(container);
   }
 
   int VisitMapContainer(MapContainer* container) {
-    return reinterpret_cast<MapContainer*>(container)->Release();
+    return container->Release();
   }
 
   int VisitCMessage(CMessage* cmessage,
                     const FieldDescriptor* field_descriptor) {
-    return ReleaseSubMessage(parent_, field_descriptor,
-        reinterpret_cast<CMessage*>(cmessage));
+    return ReleaseSubMessage(parent_, field_descriptor, cmessage);
   }
 
   CMessage* parent_;
@@ -1653,12 +1728,13 @@
 
 PyObject* ClearFieldByDescriptor(
     CMessage* self,
-    const FieldDescriptor* descriptor) {
-  if (!CheckFieldBelongsToMessage(descriptor, self->message)) {
+    const FieldDescriptor* field_descriptor) {
+  if (!CheckFieldBelongsToMessage(field_descriptor, self->message)) {
     return NULL;
   }
   AssureWritable(self);
-  self->message->GetReflection()->ClearField(self->message, descriptor);
+  Message* message = self->message;
+  message->GetReflection()->ClearField(message, field_descriptor);
   Py_RETURN_NONE;
 }
 
@@ -1694,27 +1770,17 @@
     arg = arg_in_oneof.get();
   }
 
-  PyObject* composite_field = self->composite_fields ?
-      PyDict_GetItem(self->composite_fields, arg) : NULL;
-
-  // Only release the field if there's a possibility that there are
-  // references to it.
-  if (composite_field != NULL) {
-    if (InternalReleaseFieldByDescriptor(self, field_descriptor,
-                                         composite_field) < 0) {
-      return NULL;
+  // Release the field if it exists in the dict of composite fields.
+  if (self->composite_fields) {
+    PyObject* value = PyDict_GetItem(self->composite_fields, arg);
+    if (value != NULL) {
+      if (InternalReleaseFieldByDescriptor(self, field_descriptor, value) < 0) {
+        return NULL;
+      }
+      PyDict_DelItem(self->composite_fields, arg);
     }
-    PyDict_DelItem(self->composite_fields, arg);
   }
-  message->GetReflection()->ClearField(message, field_descriptor);
-  if (field_descriptor->cpp_type() == FieldDescriptor::CPPTYPE_ENUM &&
-      !message->GetReflection()->SupportsUnknownEnumValues()) {
-    UnknownFieldSet* unknown_field_set =
-        message->GetReflection()->MutableUnknownFields(message);
-    unknown_field_set->DeleteByNumber(field_descriptor->number());
-  }
-
-  Py_RETURN_NONE;
+  return ClearFieldByDescriptor(self, field_descriptor);
 }
 
 PyObject* Clear(CMessage* self) {
@@ -1739,8 +1805,25 @@
   }
 }
 
-static PyObject* SerializeToString(CMessage* self, PyObject* args) {
-  if (!self->message->IsInitialized()) {
+static PyObject* InternalSerializeToString(
+    CMessage* self, PyObject* args, PyObject* kwargs,
+    bool require_initialized) {
+  // Parse the "deterministic" kwarg; defaults to False.
+  static char* kwlist[] = { "deterministic", 0 };
+  PyObject* deterministic_obj = Py_None;
+  if (!PyArg_ParseTupleAndKeywords(args, kwargs, "|O", kwlist,
+                                   &deterministic_obj)) {
+    return NULL;
+  }
+  // Preemptively convert to a bool first, so we don't need to back out of
+  // allocating memory if this raises an exception.
+  // NOTE: This is unused later if deterministic == Py_None, but that's fine.
+  int deterministic = PyObject_IsTrue(deterministic_obj);
+  if (deterministic < 0) {
+    return NULL;
+  }
+
+  if (require_initialized && !self->message->IsInitialized()) {
     ScopedPyObjectPtr errors(FindInitializationErrors(self));
     if (errors == NULL) {
       return NULL;
@@ -1778,24 +1861,36 @@
                  GetMessageName(self).c_str(), PyString_AsString(joined.get()));
     return NULL;
   }
-  int size = self->message->ByteSize();
-  if (size <= 0) {
+
+  // Ok, arguments parsed and errors checked, now encode to a string
+  const size_t size = self->message->ByteSizeLong();
+  if (size == 0) {
     return PyBytes_FromString("");
   }
   PyObject* result = PyBytes_FromStringAndSize(NULL, size);
   if (result == NULL) {
     return NULL;
   }
-  char* buffer = PyBytes_AS_STRING(result);
-  self->message->SerializeWithCachedSizesToArray(
-      reinterpret_cast<uint8*>(buffer));
+  io::ArrayOutputStream out(PyBytes_AS_STRING(result), size);
+  io::CodedOutputStream coded_out(&out);
+  if (deterministic_obj != Py_None) {
+    coded_out.SetSerializationDeterministic(deterministic);
+  }
+  self->message->SerializeWithCachedSizes(&coded_out);
+  GOOGLE_CHECK(!coded_out.HadError());
   return result;
 }
 
-static PyObject* SerializePartialToString(CMessage* self) {
-  string contents;
-  self->message->SerializePartialToString(&contents);
-  return PyBytes_FromStringAndSize(contents.c_str(), contents.size());
+static PyObject* SerializeToString(
+    CMessage* self, PyObject* args, PyObject* kwargs) {
+  return InternalSerializeToString(self, args, kwargs,
+                                   /*require_initialized=*/true);
+}
+
+static PyObject* SerializePartialToString(
+    CMessage* self, PyObject* args, PyObject* kwargs) {
+  return InternalSerializeToString(self, args, kwargs,
+                                   /*require_initialized=*/false);
 }
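The new `deterministic` kwarg threaded through `InternalSerializeToString` exists mainly so map fields serialize in a stable order. A toy illustration of the property it buys (the wire format below is fake; only the ordering point matters):

```python
def encode_map(entries, deterministic=False):
    # Fake encoder: deterministic mode emits map entries in sorted key
    # order, which is what SetSerializationDeterministic(true) requests
    # from the real CodedOutputStream. Without it, entry order (and so
    # the output bytes) may vary between otherwise-equal messages.
    items = sorted(entries) if deterministic else list(entries)
    return b"".join(b"%d=%s;" % (k, v) for k, v in items)
```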
 
 // Formats proto fields for ascii dumps using python formatting functions where
@@ -1851,8 +1946,12 @@
 
 PyObject* MergeFrom(CMessage* self, PyObject* arg) {
   CMessage* other_message;
-  if (!PyObject_TypeCheck(reinterpret_cast<PyObject *>(arg), &CMessage_Type)) {
-    PyErr_SetString(PyExc_TypeError, "Must be a message");
+  if (!PyObject_TypeCheck(arg, &CMessage_Type)) {
+    PyErr_Format(PyExc_TypeError,
+                 "Parameter to MergeFrom() must be instance of same class: "
+                 "expected %s got %s.",
+                 self->message->GetDescriptor()->full_name().c_str(),
+                 Py_TYPE(arg)->tp_name);
     return NULL;
   }
 
@@ -1860,8 +1959,8 @@
   if (other_message->message->GetDescriptor() !=
       self->message->GetDescriptor()) {
     PyErr_Format(PyExc_TypeError,
-                 "Tried to merge from a message with a different type. "
-                 "to: %s, from: %s",
+                 "Parameter to MergeFrom() must be instance of same class: "
+                 "expected %s got %s.",
                  self->message->GetDescriptor()->full_name().c_str(),
                  other_message->message->GetDescriptor()->full_name().c_str());
     return NULL;
@@ -1879,8 +1978,12 @@
 
 static PyObject* CopyFrom(CMessage* self, PyObject* arg) {
   CMessage* other_message;
-  if (!PyObject_TypeCheck(reinterpret_cast<PyObject *>(arg), &CMessage_Type)) {
-    PyErr_SetString(PyExc_TypeError, "Must be a message");
+  if (!PyObject_TypeCheck(arg, &CMessage_Type)) {
+    PyErr_Format(PyExc_TypeError,
+                 "Parameter to CopyFrom() must be instance of same class: "
+                 "expected %s got %s.",
+                 self->message->GetDescriptor()->full_name().c_str(),
+                 Py_TYPE(arg)->tp_name);
     return NULL;
   }
 
@@ -1893,8 +1996,8 @@
   if (other_message->message->GetDescriptor() !=
       self->message->GetDescriptor()) {
     PyErr_Format(PyExc_TypeError,
-                 "Tried to copy from a message with a different type. "
-                 "to: %s, from: %s",
+                 "Parameter to CopyFrom() must be instance of same class: "
+                 "expected %s got %s.",
                  self->message->GetDescriptor()->full_name().c_str(),
                  other_message->message->GetDescriptor()->full_name().c_str());
     return NULL;
@@ -1911,6 +2014,34 @@
   Py_RETURN_NONE;
 }
 
+// Protobuf has a 64MB limit built in; this variable overrides it. Please
+// do not enable this unless you fully understand the implications: protobufs
+// must all be kept in memory at the same time, so if they grow too big you may
+// get OOM errors. The protobuf APIs do not provide any tools for processing
+// protobufs in chunks.  If you have protos this big you should break them up if
+// it is at all convenient to do so.
+#ifdef PROTOBUF_PYTHON_ALLOW_OVERSIZE_PROTOS
+static bool allow_oversize_protos = true;
+#else
+static bool allow_oversize_protos = false;
+#endif
+
+// Provide a method in the module to set allow_oversize_protos to a boolean
+// value. This method returns the new value of allow_oversize_protos.
+PyObject* SetAllowOversizeProtos(PyObject* m, PyObject* arg) {
+  if (!arg || !PyBool_Check(arg)) {
+    PyErr_SetString(PyExc_TypeError,
+                    "Argument to SetAllowOversizeProtos must be boolean");
+    return NULL;
+  }
+  allow_oversize_protos = PyObject_IsTrue(arg);
+  if (allow_oversize_protos) {
+    Py_RETURN_TRUE;
+  } else {
+    Py_RETURN_FALSE;
+  }
+}
+
 static PyObject* MergeFromString(CMessage* self, PyObject* arg) {
   const void* data;
   Py_ssize_t data_length;
@@ -1921,10 +2052,18 @@
   AssureWritable(self);
   io::CodedInputStream input(
       reinterpret_cast<const uint8*>(data), data_length);
-  PyDescriptorPool* pool = GetDescriptorPoolForMessage(self);
-  input.SetExtensionRegistry(pool->pool, pool->message_factory);
+  if (allow_oversize_protos) {
+    input.SetTotalBytesLimit(INT_MAX, INT_MAX);
+  }
+  PyMessageFactory* factory = GetFactoryForMessage(self);
+  input.SetExtensionRegistry(factory->pool->pool, factory->message_factory);
   bool success = self->message->MergePartialFromCodedStream(&input);
   if (success) {
+    if (!input.ConsumedEntireMessage()) {
+      // TODO(jieluo): Raise error and return NULL instead.
+      // b/27494216
+      PyErr_Warn(NULL, "Unexpected end-group tag: Not all data was converted");
+    }
     return PyInt_FromLong(input.CurrentPosition());
   } else {
     PyErr_Format(DecodeError_class, "Error parsing message");
@@ -1943,75 +2082,29 @@
   return PyLong_FromLong(self->message->ByteSize());
 }
 
-static PyObject* RegisterExtension(PyObject* cls,
-                                   PyObject* extension_handle) {
+PyObject* RegisterExtension(PyObject* cls, PyObject* extension_handle) {
   const FieldDescriptor* descriptor =
       GetExtensionDescriptor(extension_handle);
   if (descriptor == NULL) {
     return NULL;
   }
-
-  ScopedPyObjectPtr extensions_by_name(
-      PyObject_GetAttr(cls, k_extensions_by_name));
-  if (extensions_by_name == NULL) {
-    PyErr_SetString(PyExc_TypeError, "no extensions_by_name on class");
+  if (!PyObject_TypeCheck(cls, &CMessageClass_Type)) {
+    PyErr_Format(PyExc_TypeError, "Expected a message class, got %s",
+                 cls->ob_type->tp_name);
     return NULL;
   }
-  ScopedPyObjectPtr full_name(PyObject_GetAttr(extension_handle, kfull_name));
-  if (full_name == NULL) {
+  CMessageClass *message_class = reinterpret_cast<CMessageClass*>(cls);
+  if (message_class == NULL) {
     return NULL;
   }
-
   // If the extension was already registered, check that it is the same.
-  PyObject* existing_extension =
-      PyDict_GetItem(extensions_by_name.get(), full_name.get());
-  if (existing_extension != NULL) {
-    const FieldDescriptor* existing_extension_descriptor =
-        GetExtensionDescriptor(existing_extension);
-    if (existing_extension_descriptor != descriptor) {
-      PyErr_SetString(PyExc_ValueError, "Double registration of Extensions");
-      return NULL;
-    }
-    // Nothing else to do.
-    Py_RETURN_NONE;
-  }
-
-  if (PyDict_SetItem(extensions_by_name.get(), full_name.get(),
-                     extension_handle) < 0) {
+  const FieldDescriptor* existing_extension =
+      message_class->py_message_factory->pool->pool->FindExtensionByNumber(
+          descriptor->containing_type(), descriptor->number());
+  if (existing_extension != NULL && existing_extension != descriptor) {
+    PyErr_SetString(PyExc_ValueError, "Double registration of Extensions");
     return NULL;
   }
-
-  // Also store a mapping from extension number to implementing class.
-  ScopedPyObjectPtr extensions_by_number(
-      PyObject_GetAttr(cls, k_extensions_by_number));
-  if (extensions_by_number == NULL) {
-    PyErr_SetString(PyExc_TypeError, "no extensions_by_number on class");
-    return NULL;
-  }
-  ScopedPyObjectPtr number(PyObject_GetAttrString(extension_handle, "number"));
-  if (number == NULL) {
-    return NULL;
-  }
-  if (PyDict_SetItem(extensions_by_number.get(), number.get(),
-                     extension_handle) < 0) {
-    return NULL;
-  }
-
-  // Check if it's a message set
-  if (descriptor->is_extension() &&
-      descriptor->containing_type()->options().message_set_wire_format() &&
-      descriptor->type() == FieldDescriptor::TYPE_MESSAGE &&
-      descriptor->label() == FieldDescriptor::LABEL_OPTIONAL) {
-    ScopedPyObjectPtr message_name(PyString_FromStringAndSize(
-        descriptor->message_type()->full_name().c_str(),
-        descriptor->message_type()->full_name().size()));
-    if (message_name == NULL) {
-      return NULL;
-    }
-    PyDict_SetItem(extensions_by_name.get(), message_name.get(),
-                   extension_handle);
-  }
-
   Py_RETURN_NONE;
 }
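The slimmed-down `RegisterExtension` above replaces the per-class `_extensions_by_name`/`_extensions_by_number` bookkeeping with a single pool lookup keyed by containing type and field number. The duplicate check it keeps, in miniature (names are illustrative):

```python
_extensions = {}

def register_extension(containing_type, number, descriptor):
    # Re-registering the identical descriptor is a no-op; registering a
    # *different* descriptor for the same (message, field number) slot
    # raises, mirroring the "Double registration of Extensions" error.
    key = (containing_type, number)
    existing = _extensions.get(key)
    if existing is not None and existing is not descriptor:
        raise ValueError("Double registration of Extensions")
    _extensions[key] = descriptor
```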
 
@@ -2048,7 +2141,7 @@
 static PyObject* GetExtensionDict(CMessage* self, void *closure);
 
 static PyObject* ListFields(CMessage* self) {
-  vector<const FieldDescriptor*> fields;
+  std::vector<const FieldDescriptor*> fields;
   self->message->GetReflection()->ListFields(*self->message, &fields);
 
   // Normally, the list will be exactly the size of the fields.
@@ -2078,8 +2171,8 @@
       // is no message class and we cannot retrieve the value.
       // TODO(amauryfa): consider building the class on the fly!
       if (fields[i]->message_type() != NULL &&
-          cdescriptor_pool::GetMessageClass(
-              GetDescriptorPoolForMessage(self),
+          message_factory::GetMessageClass(
+              GetFactoryForMessage(self),
               fields[i]->message_type()) == NULL) {
         PyErr_Clear();
         continue;
@@ -2112,7 +2205,8 @@
         return NULL;
       }
 
-      PyObject* field_value = GetAttr(self, py_field_name.get());
+      PyObject* field_value =
+          GetAttr(reinterpret_cast<PyObject*>(self), py_field_name.get());
       if (field_value == NULL) {
         PyErr_SetObject(PyExc_ValueError, py_field_name.get());
         return NULL;
@@ -2123,13 +2217,23 @@
     PyList_SET_ITEM(all_fields.get(), actual_size, t.release());
     ++actual_size;
   }
-  Py_SIZE(all_fields.get()) = actual_size;
+  if (static_cast<size_t>(actual_size) != fields.size() &&
+      (PyList_SetSlice(all_fields.get(), actual_size, fields.size(), NULL) <
+       0)) {
+    return NULL;
+  }
   return all_fields.release();
 }
 
+static PyObject* DiscardUnknownFields(CMessage* self) {
+  AssureWritable(self);
+  self->message->DiscardUnknownFields();
+  Py_RETURN_NONE;
+}
+
 PyObject* FindInitializationErrors(CMessage* self) {
   Message* message = self->message;
-  vector<string> errors;
+  std::vector<string> errors;
   message->FindInitializationErrors(&errors);
 
   PyObject* error_list = PyList_New(errors.size());
@@ -2226,32 +2330,16 @@
       break;
     }
     case FieldDescriptor::CPPTYPE_STRING: {
-      string value = reflection->GetString(*message, field_descriptor);
+      string scratch;
+      const string& value =
+          reflection->GetStringReference(*message, field_descriptor, &scratch);
       result = ToStringObject(field_descriptor, value);
       break;
     }
     case FieldDescriptor::CPPTYPE_ENUM: {
-      if (!message->GetReflection()->SupportsUnknownEnumValues() &&
-          !message->GetReflection()->HasField(*message, field_descriptor)) {
-        // Look for the value in the unknown fields.
-        const UnknownFieldSet& unknown_field_set =
-            message->GetReflection()->GetUnknownFields(*message);
-        for (int i = 0; i < unknown_field_set.field_count(); ++i) {
-          if (unknown_field_set.field(i).number() ==
-              field_descriptor->number() &&
-              unknown_field_set.field(i).type() ==
-              google::protobuf::UnknownField::TYPE_VARINT) {
-            result = PyInt_FromLong(unknown_field_set.field(i).varint());
-            break;
-          }
-        }
-      }
-
-      if (result == NULL) {
-        const EnumValueDescriptor* enum_value =
-            message->GetReflection()->GetEnum(*message, field_descriptor);
-        result = PyInt_FromLong(enum_value->number());
-      }
+      const EnumValueDescriptor* enum_value =
+          message->GetReflection()->GetEnum(*message, field_descriptor);
+      result = PyInt_FromLong(enum_value->number());
       break;
     }
     default:
@@ -2266,18 +2354,19 @@
 PyObject* InternalGetSubMessage(
     CMessage* self, const FieldDescriptor* field_descriptor) {
   const Reflection* reflection = self->message->GetReflection();
-  PyDescriptorPool* pool = GetDescriptorPoolForMessage(self);
+  PyMessageFactory* factory = GetFactoryForMessage(self);
   const Message& sub_message = reflection->GetMessage(
-      *self->message, field_descriptor, pool->message_factory);
+      *self->message, field_descriptor, factory->message_factory);
 
-  PyObject *message_class = cdescriptor_pool::GetMessageClass(
-      pool, field_descriptor->message_type());
+  CMessageClass* message_class = message_factory::GetOrCreateMessageClass(
+      factory, field_descriptor->message_type());
+  ScopedPyObjectPtr message_class_handler(
+      reinterpret_cast<PyObject*>(message_class));
   if (message_class == NULL) {
     return NULL;
   }
 
-  CMessage* cmsg = cmessage::NewEmptyMessage(message_class,
-                                             sub_message.GetDescriptor());
+  CMessage* cmsg = cmessage::NewEmptyMessage(message_class);
   if (cmsg == NULL) {
     return NULL;
   }
@@ -2462,7 +2551,10 @@
   if (state == NULL) {
     return  NULL;
   }
-  ScopedPyObjectPtr serialized(SerializePartialToString(self));
+  string contents;
+  self->message->SerializePartialToString(&contents);
+  ScopedPyObjectPtr serialized(
+      PyBytes_FromStringAndSize(contents.c_str(), contents.size()));
   if (serialized == NULL) {
     return NULL;
   }
@@ -2522,11 +2614,24 @@
   return NULL;
 }
 
+static PyObject* GetExtensionsByName(CMessage *self, void *closure) {
+  return message_meta::GetExtensionsByName(
+      reinterpret_cast<CMessageClass*>(Py_TYPE(self)), closure);
+}
+
+static PyObject* GetExtensionsByNumber(CMessage *self, void *closure) {
+  return message_meta::GetExtensionsByNumber(
+      reinterpret_cast<CMessageClass*>(Py_TYPE(self)), closure);
+}
+
 static PyGetSetDef Getters[] = {
   {"Extensions", (getter)GetExtensionDict, NULL, "Extension dict"},
+  {"_extensions_by_name", (getter)GetExtensionsByName, NULL},
+  {"_extensions_by_number", (getter)GetExtensionsByNumber, NULL},
   {NULL}
 };
 
+
 static PyMethodDef Methods[] = {
   { "__deepcopy__", (PyCFunction)DeepCopy, METH_VARARGS,
     "Makes a deep copy of the class." },
@@ -2546,6 +2651,8 @@
     "Clears a message field." },
   { "CopyFrom", (PyCFunction)CopyFrom, METH_O,
     "Copies a protocol message into the current message." },
+  { "DiscardUnknownFields", (PyCFunction)DiscardUnknownFields, METH_NOARGS,
+    "Discards the unknown fields." },
   { "FindInitializationErrors", (PyCFunction)FindInitializationErrors,
     METH_NOARGS,
     "Finds unset required fields." },
@@ -2568,9 +2675,10 @@
   { "RegisterExtension", (PyCFunction)RegisterExtension, METH_O | METH_CLASS,
     "Registers an extension with the current message." },
   { "SerializePartialToString", (PyCFunction)SerializePartialToString,
-    METH_NOARGS,
+    METH_VARARGS | METH_KEYWORDS,
     "Serializes the message to a string, even if it isn't initialized." },
-  { "SerializeToString", (PyCFunction)SerializeToString, METH_NOARGS,
+  { "SerializeToString", (PyCFunction)SerializeToString,
+    METH_VARARGS | METH_KEYWORDS,
     "Serializes the message to a string, only for initialized messages." },
   { "SetInParent", (PyCFunction)SetInParent, METH_NOARGS,
     "Sets the has bit of the given field in its parent message." },
@@ -2596,7 +2704,8 @@
   return PyDict_SetItem(self->composite_fields, name, value) == 0;
 }
 
-PyObject* GetAttr(CMessage* self, PyObject* name) {
+PyObject* GetAttr(PyObject* pself, PyObject* name) {
+  CMessage* self = reinterpret_cast<CMessage*>(pself);
   PyObject* value = self->composite_fields ?
       PyDict_GetItem(self->composite_fields, name) : NULL;
   if (value != NULL) {
@@ -2615,8 +2724,8 @@
     const Descriptor* entry_type = field_descriptor->message_type();
     const FieldDescriptor* value_type = entry_type->FindFieldByName("value");
     if (value_type->cpp_type() == FieldDescriptor::CPPTYPE_MESSAGE) {
-      PyObject* value_class = cdescriptor_pool::GetMessageClass(
-          GetDescriptorPoolForMessage(self), value_type->message_type());
+      CMessageClass* value_class = message_factory::GetMessageClass(
+          GetFactoryForMessage(self), value_type->message_type());
       if (value_class == NULL) {
         return NULL;
       }
@@ -2638,8 +2747,8 @@
   if (field_descriptor->label() == FieldDescriptor::LABEL_REPEATED) {
     PyObject* py_container = NULL;
     if (field_descriptor->cpp_type() == FieldDescriptor::CPPTYPE_MESSAGE) {
-      PyObject *message_class = cdescriptor_pool::GetMessageClass(
-          GetDescriptorPoolForMessage(self), field_descriptor->message_type());
+      CMessageClass* message_class = message_factory::GetMessageClass(
+          GetFactoryForMessage(self), field_descriptor->message_type());
       if (message_class == NULL) {
         return NULL;
       }
@@ -2674,7 +2783,8 @@
   return InternalGetScalar(self->message, field_descriptor);
 }
 
-int SetAttr(CMessage* self, PyObject* name, PyObject* value) {
+int SetAttr(PyObject* pself, PyObject* name, PyObject* value) {
+  CMessage* self = reinterpret_cast<CMessage*>(pself);
   if (self->composite_fields && PyDict_Contains(self->composite_fields, name)) {
     PyErr_SetString(PyExc_TypeError, "Can't set composite field");
     return -1;
@@ -2702,7 +2812,7 @@
 
   PyErr_Format(PyExc_AttributeError,
                "Assignment not allowed "
-               "(no field \"%s\"in protocol message object).",
+               "(no field \"%s\" in protocol message object).",
                PyString_AsString(name));
   return -1;
 }
@@ -2710,7 +2820,7 @@
 }  // namespace cmessage
 
 PyTypeObject CMessage_Type = {
-  PyVarObject_HEAD_INIT(&PyMessageMeta_Type, 0)
+  PyVarObject_HEAD_INIT(&CMessageClass_Type, 0)
   FULL_MODULE_NAME ".CMessage",        // tp_name
   sizeof(CMessage),                    // tp_basicsize
   0,                                   //  tp_itemsize
@@ -2719,22 +2829,22 @@
   0,                                   //  tp_getattr
   0,                                   //  tp_setattr
   0,                                   //  tp_compare
-  0,                                   //  tp_repr
+  (reprfunc)cmessage::ToStr,           //  tp_repr
   0,                                   //  tp_as_number
   0,                                   //  tp_as_sequence
   0,                                   //  tp_as_mapping
   PyObject_HashNotImplemented,         //  tp_hash
   0,                                   //  tp_call
   (reprfunc)cmessage::ToStr,           //  tp_str
-  (getattrofunc)cmessage::GetAttr,     //  tp_getattro
-  (setattrofunc)cmessage::SetAttr,     //  tp_setattro
+  cmessage::GetAttr,                   //  tp_getattro
+  cmessage::SetAttr,                   //  tp_setattro
   0,                                   //  tp_as_buffer
   Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE,  //  tp_flags
   "A ProtocolMessage",                 //  tp_doc
   0,                                   //  tp_traverse
   0,                                   //  tp_clear
   (richcmpfunc)cmessage::RichCompare,  //  tp_richcompare
-  0,                                   //  tp_weaklistoffset
+  offsetof(CMessage, weakreflist),     //  tp_weaklistoffset
   0,                                   //  tp_iter
   0,                                   //  tp_iternext
   cmessage::Methods,                   //  tp_methods
@@ -2781,30 +2891,11 @@
   return cmsg->message;
 }
 
-static const char module_docstring[] =
-"python-proto2 is a module that can be used to enhance proto2 Python API\n"
-"performance.\n"
-"\n"
-"It provides access to the protocol buffers C++ reflection API that\n"
-"implements the basic protocol buffer functions.";
-
 void InitGlobals() {
   // TODO(gps): Check all return values in this function for NULL and propagate
   // the error (MemoryError) on up to result in an import failure.  These should
   // also be freed and reset to NULL during finalization.
-  kPythonZero = PyInt_FromLong(0);
-  kint32min_py = PyInt_FromLong(kint32min);
-  kint32max_py = PyInt_FromLong(kint32max);
-  kuint32max_py = PyLong_FromLongLong(kuint32max);
-  kint64min_py = PyLong_FromLongLong(kint64min);
-  kint64max_py = PyLong_FromLongLong(kint64max);
-  kuint64max_py = PyLong_FromUnsignedLongLong(kuint64max);
-
   kDESCRIPTOR = PyString_FromString("DESCRIPTOR");
-  k_cdescriptor = PyString_FromString("_cdescriptor");
-  kfull_name = PyString_FromString("full_name");
-  k_extensions_by_name = PyString_FromString("_extensions_by_name");
-  k_extensions_by_number = PyString_FromString("_extensions_by_number");
 
   PyObject *dummy_obj = PySet_New(NULL);
   kEmptyWeakref = PyWeakref_NewRef(dummy_obj, NULL);
@@ -2822,15 +2913,20 @@
     return false;
   }
 
+  // Initialize types and globals in message_factory.cc
+  if (!InitMessageFactory()) {
+    return false;
+  }
+
   // Initialize constants defined in this file.
   InitGlobals();
 
-  PyMessageMeta_Type.tp_base = &PyType_Type;
-  if (PyType_Ready(&PyMessageMeta_Type) < 0) {
+  CMessageClass_Type.tp_base = &PyType_Type;
+  if (PyType_Ready(&CMessageClass_Type) < 0) {
     return false;
   }
   PyModule_AddObject(m, "MessageMeta",
-                     reinterpret_cast<PyObject*>(&PyMessageMeta_Type));
+                     reinterpret_cast<PyObject*>(&CMessageClass_Type));
 
   if (PyType_Ready(&CMessage_Type) < 0) {
     return false;
@@ -2839,25 +2935,6 @@
   // DESCRIPTOR is set on each protocol buffer message class elsewhere, but set
   // it here as well to document that subclasses need to set it.
   PyDict_SetItem(CMessage_Type.tp_dict, kDESCRIPTOR, Py_None);
-  // Subclasses with message extensions will override _extensions_by_name and
-  // _extensions_by_number with fresh mutable dictionaries in AddDescriptors.
-  // All other classes can share this same immutable mapping.
-  ScopedPyObjectPtr empty_dict(PyDict_New());
-  if (empty_dict == NULL) {
-    return false;
-  }
-  ScopedPyObjectPtr immutable_dict(PyDictProxy_New(empty_dict.get()));
-  if (immutable_dict == NULL) {
-    return false;
-  }
-  if (PyDict_SetItem(CMessage_Type.tp_dict,
-                     k_extensions_by_name, immutable_dict.get()) < 0) {
-    return false;
-  }
-  if (PyDict_SetItem(CMessage_Type.tp_dict,
-                     k_extensions_by_number, immutable_dict.get()) < 0) {
-    return false;
-  }
 
   PyModule_AddObject(m, "Message", reinterpret_cast<PyObject*>(&CMessage_Type));
 
@@ -2903,69 +2980,15 @@
   }
 
   // Initialize Map container types.
-  {
-    // ScalarMapContainer_Type derives from our MutableMapping type.
-    ScopedPyObjectPtr containers(PyImport_ImportModule(
-        "google.protobuf.internal.containers"));
-    if (containers == NULL) {
-      return false;
-    }
-
-    ScopedPyObjectPtr mutable_mapping(
-        PyObject_GetAttrString(containers.get(), "MutableMapping"));
-    if (mutable_mapping == NULL) {
-      return false;
-    }
-
-    if (!PyObject_TypeCheck(mutable_mapping.get(), &PyType_Type)) {
-      return false;
-    }
-
-    Py_INCREF(mutable_mapping.get());
-#if PY_MAJOR_VERSION >= 3
-    PyObject* bases = PyTuple_New(1);
-    PyTuple_SET_ITEM(bases, 0, mutable_mapping.get());
-
-    ScalarMapContainer_Type = 
-        PyType_FromSpecWithBases(&ScalarMapContainer_Type_spec, bases);
-    PyModule_AddObject(m, "ScalarMapContainer", ScalarMapContainer_Type);
-#else
-    ScalarMapContainer_Type.tp_base =
-        reinterpret_cast<PyTypeObject*>(mutable_mapping.get());
-
-    if (PyType_Ready(&ScalarMapContainer_Type) < 0) {
-      return false;
-    }
-
-    PyModule_AddObject(m, "ScalarMapContainer",
-                       reinterpret_cast<PyObject*>(&ScalarMapContainer_Type));
-#endif
-
-    if (PyType_Ready(&MapIterator_Type) < 0) {
-      return false;
-    }
-
-    PyModule_AddObject(m, "MapIterator",
-                       reinterpret_cast<PyObject*>(&MapIterator_Type));
-
-
-#if PY_MAJOR_VERSION >= 3
-    MessageMapContainer_Type = 
-        PyType_FromSpecWithBases(&MessageMapContainer_Type_spec, bases);
-    PyModule_AddObject(m, "MessageMapContainer", MessageMapContainer_Type);
-#else
-    Py_INCREF(mutable_mapping.get());
-    MessageMapContainer_Type.tp_base =
-        reinterpret_cast<PyTypeObject*>(mutable_mapping.get());
-
-    if (PyType_Ready(&MessageMapContainer_Type) < 0) {
-      return false;
-    }
-
-    PyModule_AddObject(m, "MessageMapContainer",
-                       reinterpret_cast<PyObject*>(&MessageMapContainer_Type));
-#endif
+  if (!InitMapContainers()) {
+    return false;
   }
+  PyModule_AddObject(m, "ScalarMapContainer",
+                     reinterpret_cast<PyObject*>(ScalarMapContainer_Type));
+  PyModule_AddObject(m, "MessageMapContainer",
+                     reinterpret_cast<PyObject*>(MessageMapContainer_Type));
+  PyModule_AddObject(m, "MapIterator",
+                     reinterpret_cast<PyObject*>(&MapIterator_Type));
 
   if (PyType_Ready(&ExtensionDict_Type) < 0) {
     return false;
@@ -3000,6 +3023,10 @@
       &PyFileDescriptor_Type));
   PyModule_AddObject(m, "OneofDescriptor", reinterpret_cast<PyObject*>(
       &PyOneofDescriptor_Type));
+  PyModule_AddObject(m, "ServiceDescriptor", reinterpret_cast<PyObject*>(
+      &PyServiceDescriptor_Type));
+  PyModule_AddObject(m, "MethodDescriptor", reinterpret_cast<PyObject*>(
+      &PyMethodDescriptor_Type));
 
   PyObject* enum_type_wrapper = PyImport_ImportModule(
       "google.protobuf.internal.enum_type_wrapper");
@@ -3036,47 +3063,4 @@
 
 }  // namespace python
 }  // namespace protobuf
-
-
-#if PY_MAJOR_VERSION >= 3
-static struct PyModuleDef _module = {
-  PyModuleDef_HEAD_INIT,
-  "_message",
-  google::protobuf::python::module_docstring,
-  -1,
-  NULL,
-  NULL,
-  NULL,
-  NULL,
-  NULL
-};
-#define INITFUNC PyInit__message
-#define INITFUNC_ERRORVAL NULL
-#else  // Python 2
-#define INITFUNC init_message
-#define INITFUNC_ERRORVAL
-#endif
-
-extern "C" {
-  PyMODINIT_FUNC INITFUNC(void) {
-    PyObject* m;
-#if PY_MAJOR_VERSION >= 3
-    m = PyModule_Create(&_module);
-#else
-    m = Py_InitModule3("_message", NULL, google::protobuf::python::module_docstring);
-#endif
-    if (m == NULL) {
-      return INITFUNC_ERRORVAL;
-    }
-
-    if (!google::protobuf::python::InitProto2MessageModule(m)) {
-      Py_DECREF(m);
-      return INITFUNC_ERRORVAL;
-    }
-
-#if PY_MAJOR_VERSION >= 3
-    return m;
-#endif
-  }
-}
 }  // namespace google
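For context on the CPPTYPE_STRING hunk above: `GetStringReference` takes a caller-supplied scratch string and returns a const reference, copying only when the reflection layer cannot hand out a stable reference to internal storage. A minimal standalone sketch of the idiom (the `Field` type and its flag are invented for illustration; the real signature lives in the protobuf C++ reflection API):

```cpp
#include <cassert>
#include <string>

// Invented stand-in for a field's backing storage. When 'stable' is true,
// a reference to the internal string can be handed out directly.
struct Field {
  std::string storage;
  bool stable;
};

// Sketch of the scratch-buffer idiom: return a reference to internal
// storage on the fast path, and copy into the caller's scratch string
// only when a stable reference is not available.
const std::string& GetStringReference(const Field& field,
                                      std::string* scratch) {
  if (field.stable) {
    return field.storage;   // zero-copy path
  }
  *scratch = field.storage; // copy only when required
  return *scratch;
}
```

The caller owns the scratch buffer, so the returned reference stays valid for as long as both the field and the scratch string are alive, without forcing a copy on every access.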
diff --git a/python/google/protobuf/pyext/message.h b/python/google/protobuf/pyext/message.h
index cc0012e..72bcfa8 100644
--- a/python/google/protobuf/pyext/message.h
+++ b/python/google/protobuf/pyext/message.h
@@ -37,11 +37,11 @@
 #include <Python.h>
 
 #include <memory>
-#ifndef _SHARED_PTR_H
-#include <google/protobuf/stubs/shared_ptr.h>
-#endif
 #include <string>
 
+#include <google/protobuf/stubs/common.h>
+#include <google/protobuf/pyext/thread_unsafe_shared_ptr.h>
+
 namespace google {
 namespace protobuf {
 
@@ -52,17 +52,10 @@
 class DescriptorPool;
 class MessageFactory;
 
-#ifdef _SHARED_PTR_H
-using std::shared_ptr;
-using std::string;
-#else
-using internal::shared_ptr;
-#endif
-
 namespace python {
 
 struct ExtensionDict;
-struct PyDescriptorPool;
+struct PyMessageFactory;
 
 typedef struct CMessage {
   PyObject_HEAD;
@@ -71,7 +64,9 @@
   // proto tree.  Every Python CMessage holds a reference to it in
   // order to keep it alive as long as there's a Python object that
   // references any part of the tree.
-  shared_ptr<Message> owner;
+
+  typedef ThreadUnsafeSharedPtr<Message> OwnerRef;
+  OwnerRef owner;
 
   // Weak reference to a parent CMessage object. This is NULL for any top-level
   // message and is set for any child message (i.e. a child submessage or a
@@ -112,24 +107,48 @@
   // Similar to composite_fields, acting as a cache, but also contains the
   // required extension dict logic.
   ExtensionDict* extensions;
+
+  // Implements the "weakref" protocol for this object.
+  PyObject* weakreflist;
 } CMessage;
 
+extern PyTypeObject CMessageClass_Type;
 extern PyTypeObject CMessage_Type;
 
+
+// The (meta) type of all Message classes.
+// It allows us to cache some C++ pointers in the class object itself; they
+// are faster to extract than from the type's dictionary.
+
+struct CMessageClass {
+  // This is how CPython subclasses C structures: the base structure must be
+  // the first member of the object.
+  PyHeapTypeObject super;
+
+  // C++ descriptor of this message.
+  const Descriptor* message_descriptor;
+
+  // Owned reference, used to keep the pointer above alive.
+  PyObject* py_message_descriptor;
+
+  // The Python MessageFactory used to create the class. It is needed to resolve
+  // fields descriptors, including extensions fields; its C++ MessageFactory is
+  // used to instantiate submessages.
+  // We own the reference, because it's important to keep the factory alive.
+  PyMessageFactory* py_message_factory;
+
+  PyObject* AsPyObject() {
+    return reinterpret_cast<PyObject*>(this);
+  }
+};
+
+
 namespace cmessage {
 
 // Internal function to create a new empty Message Python object, but with empty
 // pointers to the C++ objects.
 // The caller must fill self->message, self->owner and eventually self->parent.
-CMessage* NewEmptyMessage(PyObject* type, const Descriptor* descriptor);
-
-// Release a submessage from its proto tree, making it a new top-level message.
-// A new message will be created if this is a read-only default instance.
-//
-// Corresponds to reflection api method ReleaseMessage.
-int ReleaseSubMessage(CMessage* self,
-                      const FieldDescriptor* field_descriptor,
-                      CMessage* child_cmessage);
+CMessage* NewEmptyMessage(CMessageClass* type);
 
 // Retrieves the C++ descriptor of a Python Extension descriptor.
 // On error, return NULL with an exception set.
@@ -206,37 +225,44 @@
 PyObject* HasField(CMessage* self, PyObject* arg);
 
 // Initializes values of fields on a newly constructed message.
-int InitAttributes(CMessage* self, PyObject* kwargs);
+// Note that positional arguments are disallowed: 'args' must be NULL or the
+// empty tuple.
+int InitAttributes(CMessage* self, PyObject* args, PyObject* kwargs);
 
 PyObject* MergeFrom(CMessage* self, PyObject* arg);
 
-// Retrieves an attribute named 'name' from CMessage 'self'. Returns
-// the attribute value on success, or NULL on failure.
+// This method does not do anything beyond checking that no other extension
+// has been registered with the same field number on this class.
+PyObject* RegisterExtension(PyObject* cls, PyObject* extension_handle);
+
+// Retrieves an attribute named 'name' from 'self', which is interpreted as a
+// CMessage. Returns the attribute value on success, or null on failure.
 //
 // Returns a new reference.
-PyObject* GetAttr(CMessage* self, PyObject* name);
+PyObject* GetAttr(PyObject* self, PyObject* name);
 
-// Set the value of the attribute named 'name', for CMessage 'self',
-// to the value 'value'. Returns -1 on failure.
-int SetAttr(CMessage* self, PyObject* name, PyObject* value);
+// Set the value of the attribute named 'name', for 'self', which is interpreted
+// as a CMessage, to the value 'value'. Returns -1 on failure.
+int SetAttr(PyObject* self, PyObject* name, PyObject* value);
 
 PyObject* FindInitializationErrors(CMessage* self);
 
 // Set the owner field of self and any children of self, recursively.
 // Used when self is being released and thus has a new owner (the
 // released Message.)
-int SetOwner(CMessage* self, const shared_ptr<Message>& new_owner);
+int SetOwner(CMessage* self, const CMessage::OwnerRef& new_owner);
 
 int AssureWritable(CMessage* self);
 
-// Returns the "best" DescriptorPool for the given message.
-// This is often equivalent to message.DESCRIPTOR.pool, but not always, when
-// the message class was created from a MessageFactory using a custom pool which
-// uses the generated pool as an underlay.
+// Returns the message factory for the given message.
+// This is equivalent to message.MESSAGE_FACTORY.
 //
-// The returned pool is suitable for finding fields and building submessages,
+// The returned factory is suitable for finding fields and building submessages,
 // even in the case of extensions.
-PyDescriptorPool* GetDescriptorPoolForMessage(CMessage* message);
+// Returns a *borrowed* reference, and never fails because we pass a CMessage.
+PyMessageFactory* GetFactoryForMessage(CMessage* message);
+
+PyObject* SetAllowOversizeProtos(PyObject* m, PyObject* arg);
 
 }  // namespace cmessage
 
@@ -249,25 +275,25 @@
 
 #define GOOGLE_CHECK_GET_INT32(arg, value, err)                        \
     int32 value;                                            \
-    if (!CheckAndGetInteger(arg, &value, kint32min_py, kint32max_py)) { \
+    if (!CheckAndGetInteger(arg, &value)) { \
       return err;                                          \
     }
 
 #define GOOGLE_CHECK_GET_INT64(arg, value, err)                        \
     int64 value;                                            \
-    if (!CheckAndGetInteger(arg, &value, kint64min_py, kint64max_py)) { \
+    if (!CheckAndGetInteger(arg, &value)) { \
       return err;                                          \
     }
 
 #define GOOGLE_CHECK_GET_UINT32(arg, value, err)                       \
     uint32 value;                                           \
-    if (!CheckAndGetInteger(arg, &value, kPythonZero, kuint32max_py)) { \
+    if (!CheckAndGetInteger(arg, &value)) { \
       return err;                                          \
     }
 
 #define GOOGLE_CHECK_GET_UINT64(arg, value, err)                       \
     uint64 value;                                           \
-    if (!CheckAndGetInteger(arg, &value, kPythonZero, kuint64max_py)) { \
+    if (!CheckAndGetInteger(arg, &value)) { \
       return err;                                          \
     }
 
@@ -290,20 +316,11 @@
     }
 
 
-extern PyObject* kPythonZero;
-extern PyObject* kint32min_py;
-extern PyObject* kint32max_py;
-extern PyObject* kuint32max_py;
-extern PyObject* kint64min_py;
-extern PyObject* kint64max_py;
-extern PyObject* kuint64max_py;
-
 #define FULL_MODULE_NAME "google.protobuf.pyext._message"
 
 void FormatTypeError(PyObject* arg, char* expected_types);
 template<class T>
-bool CheckAndGetInteger(
-    PyObject* arg, T* value, PyObject* min, PyObject* max);
+bool CheckAndGetInteger(PyObject* arg, T* value);
 bool CheckAndGetDouble(PyObject* arg, double* value);
 bool CheckAndGetFloat(PyObject* arg, float* value);
 bool CheckAndGetBool(PyObject* arg, bool* value);
@@ -314,7 +331,8 @@
     const Reflection* reflection,
     bool append,
     int index);
-PyObject* ToStringObject(const FieldDescriptor* descriptor, string value);
+PyObject* ToStringObject(const FieldDescriptor* descriptor,
+                         const string& value);
 
 // Check if the passed field descriptor belongs to the given message.
 // If not, return false and set a Python exception (a KeyError)
@@ -323,6 +341,17 @@
 
 extern PyObject* PickleError_class;
 
+bool InitProto2MessageModule(PyObject *m);
+
+#if LANG_CXX11
+// These are referenced by repeated_scalar_container, and must
+// be explicitly instantiated.
+extern template bool CheckAndGetInteger<int32>(PyObject*, int32*);
+extern template bool CheckAndGetInteger<int64>(PyObject*, int64*);
+extern template bool CheckAndGetInteger<uint32>(PyObject*, uint32*);
+extern template bool CheckAndGetInteger<uint64>(PyObject*, uint64*);
+#endif
+
 }  // namespace python
 }  // namespace protobuf
 
diff --git a/python/google/protobuf/pyext/message_factory.cc b/python/google/protobuf/pyext/message_factory.cc
new file mode 100644
index 0000000..bacc76a
--- /dev/null
+++ b/python/google/protobuf/pyext/message_factory.cc
@@ -0,0 +1,283 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// https://developers.google.com/protocol-buffers/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+#include <Python.h>
+
+#include <google/protobuf/dynamic_message.h>
+#include <google/protobuf/pyext/descriptor.h>
+#include <google/protobuf/pyext/message.h>
+#include <google/protobuf/pyext/message_factory.h>
+#include <google/protobuf/pyext/scoped_pyobject_ptr.h>
+
+#if PY_MAJOR_VERSION >= 3
+  #if PY_VERSION_HEX < 0x03030000
+    #error "Python 3.0 - 3.2 are not supported."
+  #endif
+  #define PyString_AsStringAndSize(ob, charpp, sizep) \
+    (PyUnicode_Check(ob)? \
+       ((*(charpp) = PyUnicode_AsUTF8AndSize(ob, (sizep))) == NULL? -1: 0): \
+       PyBytes_AsStringAndSize(ob, (charpp), (sizep)))
+#endif
+
+namespace google {
+namespace protobuf {
+namespace python {
+
+namespace message_factory {
+
+PyMessageFactory* NewMessageFactory(PyTypeObject* type, PyDescriptorPool* pool) {
+  PyMessageFactory* factory = reinterpret_cast<PyMessageFactory*>(
+      PyType_GenericAlloc(type, 0));
+  if (factory == NULL) {
+    return NULL;
+  }
+
+  DynamicMessageFactory* message_factory = new DynamicMessageFactory();
+  // This option might be the default some day.
+  message_factory->SetDelegateToGeneratedFactory(true);
+  factory->message_factory = message_factory;
+
+  factory->pool = pool;
+  // TODO(amauryfa): When the MessageFactory is not created from the
+  // DescriptorPool this reference should be owned, not borrowed.
+  // Py_INCREF(pool);
+
+  factory->classes_by_descriptor = new PyMessageFactory::ClassesByMessageMap();
+
+  return factory;
+}
+
+PyObject* New(PyTypeObject* type, PyObject* args, PyObject* kwargs) {
+  static char* kwlist[] = {"pool", 0};
+  PyObject* pool = NULL;
+  if (!PyArg_ParseTupleAndKeywords(args, kwargs, "|O", kwlist, &pool)) {
+    return NULL;
+  }
+  ScopedPyObjectPtr owned_pool;
+  if (pool == NULL || pool == Py_None) {
+    owned_pool.reset(PyObject_CallFunction(
+        reinterpret_cast<PyObject*>(&PyDescriptorPool_Type), NULL));
+    if (owned_pool == NULL) {
+      return NULL;
+    }
+    pool = owned_pool.get();
+  } else {
+    if (!PyObject_TypeCheck(pool, &PyDescriptorPool_Type)) {
+      PyErr_Format(PyExc_TypeError, "Expected a DescriptorPool, got %s",
+                   pool->ob_type->tp_name);
+      return NULL;
+    }
+  }
+
+  return reinterpret_cast<PyObject*>(
+      NewMessageFactory(type, reinterpret_cast<PyDescriptorPool*>(pool)));
+}
+
+static void Dealloc(PyObject* pself) {
+  PyMessageFactory* self = reinterpret_cast<PyMessageFactory*>(pself);
+
+  // TODO(amauryfa): When the MessageFactory is not created from the
+  // DescriptorPool this reference should be owned, not borrowed.
+  // Py_CLEAR(self->pool);
+  typedef PyMessageFactory::ClassesByMessageMap::iterator iterator;
+  for (iterator it = self->classes_by_descriptor->begin();
+       it != self->classes_by_descriptor->end(); ++it) {
+    Py_DECREF(it->second);
+  }
+  delete self->classes_by_descriptor;
+  delete self->message_factory;
+  Py_TYPE(self)->tp_free(pself);
+}
+
+// Add a message class to our database.
+int RegisterMessageClass(PyMessageFactory* self,
+                         const Descriptor* message_descriptor,
+                         CMessageClass* message_class) {
+  Py_INCREF(message_class);
+  typedef PyMessageFactory::ClassesByMessageMap::iterator iterator;
+  std::pair<iterator, bool> ret = self->classes_by_descriptor->insert(
+      std::make_pair(message_descriptor, message_class));
+  if (!ret.second) {
+    // Update case: DECREF the previous value.
+    Py_DECREF(ret.first->second);
+    ret.first->second = message_class;
+  }
+  return 0;
+}
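`RegisterMessageClass` above relies on the fact that `map::insert` does not overwrite an existing key: the returned `pair<iterator, bool>` signals whether this was an update, in which case the reference held by the old entry must be dropped before the slot is reassigned. A plain-C++ sketch of the same pattern (`FakeClass` and its manual refcount are invented stand-ins for Python objects and `Py_INCREF`/`Py_DECREF`):

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Invented stand-in for a refcounted Python class object.
struct FakeClass { int refcount = 1; };

// Insert-or-update a registry entry, transferring one reference to the
// registry. On update, the reference held by the displaced entry is
// released before the slot is reassigned (the Py_DECREF in the diff).
void Register(std::unordered_map<std::string, FakeClass*>* registry,
              const std::string& name, FakeClass* cls) {
  ++cls->refcount;  // the registry now owns one reference (Py_INCREF)
  auto ret = registry->insert({name, cls});
  if (!ret.second) {
    // Update case: drop the reference to the previous value (Py_DECREF).
    --ret.first->second->refcount;
    ret.first->second = cls;
  }
}
```

Incrementing before the insert keeps the invariant simple: the registry always holds exactly one reference per live entry, whether the call was an insert or an update.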
+
+CMessageClass* GetOrCreateMessageClass(PyMessageFactory* self,
+                                       const Descriptor* descriptor) {
+  // This is the same implementation as MessageFactory.GetPrototype().
+
+  // Do not create a MessageClass that already exists.
+  hash_map<const Descriptor*, CMessageClass*>::iterator it =
+      self->classes_by_descriptor->find(descriptor);
+  if (it != self->classes_by_descriptor->end()) {
+    Py_INCREF(it->second);
+    return it->second;
+  }
+  ScopedPyObjectPtr py_descriptor(
+      PyMessageDescriptor_FromDescriptor(descriptor));
+  if (py_descriptor == NULL) {
+    return NULL;
+  }
+  // Create a new message class.
+  ScopedPyObjectPtr args(Py_BuildValue(
+      "s(){sOsOsO}", descriptor->name().c_str(),
+      "DESCRIPTOR", py_descriptor.get(),
+      "__module__", Py_None,
+      "message_factory", self));
+  if (args == NULL) {
+    return NULL;
+  }
+  ScopedPyObjectPtr message_class(PyObject_CallObject(
+      reinterpret_cast<PyObject*>(&CMessageClass_Type), args.get()));
+  if (message_class == NULL) {
+    return NULL;
+  }
+  // Create message classes for the messages used by the fields, and register
+  // all extensions for these messages during the recursion.
+  for (int field_idx = 0; field_idx < descriptor->field_count(); field_idx++) {
+    const Descriptor* sub_descriptor =
+        descriptor->field(field_idx)->message_type();
+    // It is NULL if the field type is not a message.
+    if (sub_descriptor != NULL) {
+      CMessageClass* result = GetOrCreateMessageClass(self, sub_descriptor);
+      if (result == NULL) {
+        return NULL;
+      }
+      Py_DECREF(result);
+    }
+  }
+
+  // Register extensions defined in this message.
+  for (int ext_idx = 0; ext_idx < descriptor->extension_count(); ext_idx++) {
+    const FieldDescriptor* extension = descriptor->extension(ext_idx);
+    ScopedPyObjectPtr py_extended_class(
+        GetOrCreateMessageClass(self, extension->containing_type())
+            ->AsPyObject());
+    if (py_extended_class == NULL) {
+      return NULL;
+    }
+    ScopedPyObjectPtr py_extension(PyFieldDescriptor_FromDescriptor(extension));
+    if (py_extension == NULL) {
+      return NULL;
+    }
+    ScopedPyObjectPtr result(cmessage::RegisterExtension(
+        py_extended_class.get(), py_extension.get()));
+    if (result == NULL) {
+      return NULL;
+    }
+  }
+  return reinterpret_cast<CMessageClass*>(message_class.release());
+}
+
+// Retrieve the message class added to our database.
+CMessageClass* GetMessageClass(PyMessageFactory* self,
+                               const Descriptor* message_descriptor) {
+  typedef PyMessageFactory::ClassesByMessageMap::iterator iterator;
+  iterator ret = self->classes_by_descriptor->find(message_descriptor);
+  if (ret == self->classes_by_descriptor->end()) {
+    PyErr_Format(PyExc_TypeError, "No message class registered for '%s'",
+                 message_descriptor->full_name().c_str());
+    return NULL;
+  } else {
+    return ret->second;
+  }
+}
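The two lookup paths above (create-on-demand with recursion into sub-message fields, versus a strict lookup that raises TypeError) follow a common registry pattern. A Python-level sketch, with plain classes standing in for the C++ `Descriptor` and `CMessageClass` types (all names here are illustrative, not the real API):

```python
# Sketch of the registry pattern behind GetOrCreateMessageClass /
# GetMessageClass. Descriptor and MessageFactory are stand-ins for the
# C++ types, not the real protobuf classes.

class Descriptor:
    def __init__(self, full_name, message_fields=()):
        self.full_name = full_name
        # Sub-descriptors for fields whose type is itself a message.
        self.message_fields = list(message_fields)

class MessageFactory:
    def __init__(self):
        self.classes_by_descriptor = {}

    def get_message_class(self, descriptor):
        # Strict lookup: fail loudly, like the TypeError in GetMessageClass.
        try:
            return self.classes_by_descriptor[descriptor]
        except KeyError:
            raise TypeError(
                "No message class registered for '%s'" % descriptor.full_name)

    def get_or_create_message_class(self, descriptor):
        cls = self.classes_by_descriptor.get(descriptor)
        if cls is not None:
            return cls
        cls = type(descriptor.full_name, (), {"DESCRIPTOR": descriptor})
        # Register before recursing so cyclic message types terminate.
        self.classes_by_descriptor[descriptor] = cls
        for sub in descriptor.message_fields:
            self.get_or_create_message_class(sub)
        return cls

inner = Descriptor("pkg.Inner")
outer = Descriptor("pkg.Outer", message_fields=[inner])
factory = MessageFactory()
outer_cls = factory.get_or_create_message_class(outer)
# The recursion registered the inner class as well:
inner_cls = factory.get_message_class(inner)
```

Registering the class before recursing mirrors the C++ ordering and is what keeps mutually recursive message types from looping forever.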
+
+static PyMethodDef Methods[] = {
+    {NULL}};
+
+static PyObject* GetPool(PyMessageFactory* self, void* closure) {
+  Py_INCREF(self->pool);
+  return reinterpret_cast<PyObject*>(self->pool);
+}
+
+static PyGetSetDef Getters[] = {
+    {"pool", (getter)GetPool, NULL, "DescriptorPool"},
+    {NULL}
+};
+
+}  // namespace message_factory
+
+PyTypeObject PyMessageFactory_Type = {
+    PyVarObject_HEAD_INIT(&PyType_Type, 0) FULL_MODULE_NAME
+    ".MessageFactory",                        // tp_name
+    sizeof(PyMessageFactory),                 // tp_basicsize
+    0,                                        // tp_itemsize
+    message_factory::Dealloc,                 // tp_dealloc
+    0,                                        // tp_print
+    0,                                        // tp_getattr
+    0,                                        // tp_setattr
+    0,                                        // tp_compare
+    0,                                        // tp_repr
+    0,                                        // tp_as_number
+    0,                                        // tp_as_sequence
+    0,                                        // tp_as_mapping
+    0,                                        // tp_hash
+    0,                                        // tp_call
+    0,                                        // tp_str
+    0,                                        // tp_getattro
+    0,                                        // tp_setattro
+    0,                                        // tp_as_buffer
+    Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE,  // tp_flags
+    "A static Message Factory",               // tp_doc
+    0,                                        // tp_traverse
+    0,                                        // tp_clear
+    0,                                        // tp_richcompare
+    0,                                        // tp_weaklistoffset
+    0,                                        // tp_iter
+    0,                                        // tp_iternext
+    message_factory::Methods,                 // tp_methods
+    0,                                        // tp_members
+    message_factory::Getters,                 // tp_getset
+    0,                                        // tp_base
+    0,                                        // tp_dict
+    0,                                        // tp_descr_get
+    0,                                        // tp_descr_set
+    0,                                        // tp_dictoffset
+    0,                                        // tp_init
+    0,                                        // tp_alloc
+    message_factory::New,                     // tp_new
+    PyObject_Del,                             // tp_free
+};
+
+bool InitMessageFactory() {
+  if (PyType_Ready(&PyMessageFactory_Type) < 0) {
+    return false;
+  }
+
+  return true;
+}
+
+}  // namespace python
+}  // namespace protobuf
+}  // namespace google
diff --git a/python/google/protobuf/pyext/message_factory.h b/python/google/protobuf/pyext/message_factory.h
new file mode 100644
index 0000000..36092f7
--- /dev/null
+++ b/python/google/protobuf/pyext/message_factory.h
@@ -0,0 +1,103 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// https://developers.google.com/protocol-buffers/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+#ifndef GOOGLE_PROTOBUF_PYTHON_CPP_MESSAGE_FACTORY_H__
+#define GOOGLE_PROTOBUF_PYTHON_CPP_MESSAGE_FACTORY_H__
+
+#include <Python.h>
+
+#include <google/protobuf/stubs/hash.h>
+#include <google/protobuf/descriptor.h>
+#include <google/protobuf/pyext/descriptor_pool.h>
+
+namespace google {
+namespace protobuf {
+class MessageFactory;
+
+namespace python {
+
+// The (meta) type of all Message classes.
+struct CMessageClass;
+
+struct PyMessageFactory {
+  PyObject_HEAD
+
+  // DynamicMessageFactory used to create C++ instances of messages.
+  // This object caches the descriptors that were used, so the
+  // DescriptorPool needs to get rid of it before it can delete itself.
+  //
+  // Note: a C++ MessageFactory is different from the PyMessageFactory.
+  // The C++ one creates messages, whereas the Python one creates classes.
+  MessageFactory* message_factory;
+
+  // Borrowed reference to a Python DescriptorPool.
+  // TODO(amauryfa): invert the dependency: the MessageFactory owns the
+  // DescriptorPool, not the opposite.
+  PyDescriptorPool* pool;
+
+  // Make our own mapping to retrieve Python classes from C++ descriptors.
+  //
+  // Descriptor pointers stored here are owned by the DescriptorPool above.
+  // Python references to classes are owned by this PyDescriptorPool.
+  typedef hash_map<const Descriptor*, CMessageClass*> ClassesByMessageMap;
+  ClassesByMessageMap* classes_by_descriptor;
+};
+
+extern PyTypeObject PyMessageFactory_Type;
+
+namespace message_factory {
+
+// Creates a new MessageFactory instance.
+PyMessageFactory* NewMessageFactory(PyTypeObject* type, PyDescriptorPool* pool);
+
+// Registers a new Python class for the given message descriptor.
+// On error, returns -1 with a Python exception set.
+int RegisterMessageClass(PyMessageFactory* self,
+                         const Descriptor* message_descriptor,
+                         CMessageClass* message_class);
+// Retrieves the Python class registered with the given message descriptor, or
+// fails with a TypeError. Returns a *borrowed* reference.
+CMessageClass* GetMessageClass(PyMessageFactory* self,
+                               const Descriptor* message_descriptor);
+// Retrieves the Python class registered with the given message descriptor.
+// The class is created if it does not exist yet. Returns a *new* reference.
+CMessageClass* GetOrCreateMessageClass(PyMessageFactory* self,
+                                       const Descriptor* message_descriptor);
+}  // namespace message_factory
+
+// Initialize objects used by this module.
+// On error, returns false with a Python exception set.
+bool InitMessageFactory();
+
+}  // namespace python
+}  // namespace protobuf
+
+}  // namespace google
+#endif  // GOOGLE_PROTOBUF_PYTHON_CPP_MESSAGE_FACTORY_H__
diff --git a/python/google/protobuf/pyext/message_module.cc b/python/google/protobuf/pyext/message_module.cc
new file mode 100644
index 0000000..7c4df47
--- /dev/null
+++ b/python/google/protobuf/pyext/message_module.cc
@@ -0,0 +1,121 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// https://developers.google.com/protocol-buffers/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+#include <Python.h>
+
+#include <google/protobuf/pyext/message.h>
+
+#include <google/protobuf/message_lite.h>
+
+static PyObject* GetPythonProto3PreserveUnknownsDefault(
+    PyObject* /*m*/, PyObject* /*args*/) {
+  if (google::protobuf::internal::GetProto3PreserveUnknownsDefault()) {
+    Py_RETURN_TRUE;
+  } else {
+    Py_RETURN_FALSE;
+  }
+}
+
+static PyObject* SetPythonProto3PreserveUnknownsDefault(
+    PyObject* /*m*/, PyObject* arg) {
+  if (!arg || !PyBool_Check(arg)) {
+    PyErr_SetString(
+        PyExc_TypeError,
+        "Argument to SetPythonProto3PreserveUnknownsDefault must be boolean");
+    return NULL;
+  }
+  google::protobuf::internal::SetProto3PreserveUnknownsDefault(PyObject_IsTrue(arg));
+  Py_RETURN_NONE;
+}
+
+static const char module_docstring[] =
+"python-proto2 is a module that can be used to enhance proto2 Python API\n"
+"performance.\n"
+"\n"
+"It provides access to the protocol buffers C++ reflection API that\n"
+"implements the basic protocol buffer functions.";
+
+static PyMethodDef ModuleMethods[] = {
+  {"SetAllowOversizeProtos",
+    (PyCFunction)google::protobuf::python::cmessage::SetAllowOversizeProtos,
+    METH_O, "Enable/disable oversize proto parsing."},
+  // DO NOT USE: For migration and testing only.
+  {"GetPythonProto3PreserveUnknownsDefault",
+    (PyCFunction)GetPythonProto3PreserveUnknownsDefault,
+    METH_NOARGS, "Get Proto3 preserve unknowns default."},
+  // DO NOT USE: For migration and testing only.
+  {"SetPythonProto3PreserveUnknownsDefault",
+    (PyCFunction)SetPythonProto3PreserveUnknownsDefault,
+    METH_O, "Enable/disable proto3 unknowns preservation."},
+  { NULL, NULL}
+};
+
+#if PY_MAJOR_VERSION >= 3
+static struct PyModuleDef _module = {
+  PyModuleDef_HEAD_INIT,
+  "_message",
+  module_docstring,
+  -1,
+  ModuleMethods,  /* m_methods */
+  NULL,
+  NULL,
+  NULL,
+  NULL
+};
+#define INITFUNC PyInit__message
+#define INITFUNC_ERRORVAL NULL
+#else  // Python 2
+#define INITFUNC init_message
+#define INITFUNC_ERRORVAL
+#endif
+
+extern "C" {
+  PyMODINIT_FUNC INITFUNC(void) {
+    PyObject* m;
+#if PY_MAJOR_VERSION >= 3
+    m = PyModule_Create(&_module);
+#else
+    m = Py_InitModule3("_message", ModuleMethods,
+                       module_docstring);
+#endif
+    if (m == NULL) {
+      return INITFUNC_ERRORVAL;
+    }
+
+    if (!google::protobuf::python::InitProto2MessageModule(m)) {
+      Py_DECREF(m);
+      return INITFUNC_ERRORVAL;
+    }
+
+#if PY_MAJOR_VERSION >= 3
+    return m;
+#endif
+  }
+}
diff --git a/python/google/protobuf/pyext/python.proto b/python/google/protobuf/pyext/python.proto
index cce645d..2e50df7 100644
--- a/python/google/protobuf/pyext/python.proto
+++ b/python/google/protobuf/pyext/python.proto
@@ -58,11 +58,11 @@
   repeated int32 d = 2;
 }
 
-message TestAllExtensions {
+message TestAllExtensions {  // extension begin
   extensions 1 to max;
-}
+}  // extension end
 
-extend TestAllExtensions {
+extend TestAllExtensions {  // extension begin
   optional TestAllTypes.NestedMessage optional_nested_message_extension = 1;
   repeated TestAllTypes.NestedMessage repeated_nested_message_extension = 2;
-}
+}  // extension end
diff --git a/python/google/protobuf/pyext/repeated_composite_container.cc b/python/google/protobuf/pyext/repeated_composite_container.cc
index b01123b..5874d5d 100644
--- a/python/google/protobuf/pyext/repeated_composite_container.cc
+++ b/python/google/protobuf/pyext/repeated_composite_container.cc
@@ -34,9 +34,6 @@
 #include <google/protobuf/pyext/repeated_composite_container.h>
 
 #include <memory>
-#ifndef _SHARED_PTR_H
-#include <google/protobuf/stubs/shared_ptr.h>
-#endif
 
 #include <google/protobuf/stubs/logging.h>
 #include <google/protobuf/stubs/common.h>
@@ -46,7 +43,9 @@
 #include <google/protobuf/pyext/descriptor.h>
 #include <google/protobuf/pyext/descriptor_pool.h>
 #include <google/protobuf/pyext/message.h>
+#include <google/protobuf/pyext/message_factory.h>
 #include <google/protobuf/pyext/scoped_pyobject_ptr.h>
+#include <google/protobuf/reflection.h>
 
 #if PY_MAJOR_VERSION >= 3
   #define PyInt_Check PyLong_Check
@@ -79,7 +78,10 @@
 // ---------------------------------------------------------------------
 // len()
 
-static Py_ssize_t Length(RepeatedCompositeContainer* self) {
+static Py_ssize_t Length(PyObject* pself) {
+  RepeatedCompositeContainer* self =
+      reinterpret_cast<RepeatedCompositeContainer*>(pself);
+
   Message* message = self->message;
   if (message != NULL) {
     return message->GetReflection()->FieldSize(*message,
@@ -100,15 +102,14 @@
   // A MergeFrom on a parent message could have caused extra messages to be
   // added in the underlying protobuf so add them to our list. They can never
   // be removed in such a way so there's no need to worry about that.
-  Py_ssize_t message_length = Length(self);
+  Py_ssize_t message_length = Length(reinterpret_cast<PyObject*>(self));
   Py_ssize_t child_length = PyList_GET_SIZE(self->child_messages);
   Message* message = self->message;
   const Reflection* reflection = message->GetReflection();
   for (Py_ssize_t i = child_length; i < message_length; ++i) {
     const Message& sub_message = reflection->GetRepeatedMessage(
         *(self->message), self->parent_field_descriptor, i);
-    CMessage* cmsg = cmessage::NewEmptyMessage(self->subclass_init,
-                                               sub_message.GetDescriptor());
+    CMessage* cmsg = cmessage::NewEmptyMessage(self->child_message_class);
     ScopedPyObjectPtr py_cmsg(reinterpret_cast<PyObject*>(cmsg));
     if (cmsg == NULL) {
       return -1;
@@ -137,18 +138,20 @@
   if (cmessage::AssureWritable(self->parent) == -1)
     return NULL;
   Message* message = self->message;
+
   Message* sub_message =
-      message->GetReflection()->AddMessage(message,
-                                           self->parent_field_descriptor);
-  CMessage* cmsg = cmessage::NewEmptyMessage(self->subclass_init,
-                                             sub_message->GetDescriptor());
+      message->GetReflection()->AddMessage(
+          message,
+          self->parent_field_descriptor,
+          self->child_message_class->py_message_factory->message_factory);
+  CMessage* cmsg = cmessage::NewEmptyMessage(self->child_message_class);
   if (cmsg == NULL)
     return NULL;
 
   cmsg->owner = self->owner;
   cmsg->message = sub_message;
   cmsg->parent = self->parent;
-  if (cmessage::InitAttributes(cmsg, kwargs) < 0) {
+  if (cmessage::InitAttributes(cmsg, args, kwargs) < 0) {
     Py_DECREF(cmsg);
     return NULL;
   }
@@ -168,7 +171,7 @@
 
   // Create a new Message detached from the rest.
   PyObject* py_cmsg = PyEval_CallObjectWithKeywords(
-      self->subclass_init, NULL, kwargs);
+      self->child_message_class->AsPyObject(), args, kwargs);
   if (py_cmsg == NULL)
     return NULL;
 
@@ -188,6 +191,10 @@
     return AddToAttached(self, args, kwargs);
 }
 
+static PyObject* AddMethod(PyObject* self, PyObject* args, PyObject* kwargs) {
+  return Add(reinterpret_cast<RepeatedCompositeContainer*>(self), args, kwargs);
+}
+
 // ---------------------------------------------------------------------
 // extend()
 
@@ -223,6 +230,10 @@
   Py_RETURN_NONE;
 }
 
+static PyObject* ExtendMethod(PyObject* self, PyObject* value) {
+  return Extend(reinterpret_cast<RepeatedCompositeContainer*>(self), value);
+}
+
 PyObject* MergeFrom(RepeatedCompositeContainer* self, PyObject* other) {
   if (UpdateChildMessages(self) < 0) {
     return NULL;
@@ -230,6 +241,10 @@
   return Extend(self, other);
 }
 
+static PyObject* MergeFromMethod(PyObject* self, PyObject* other) {
+  return MergeFrom(reinterpret_cast<RepeatedCompositeContainer*>(self), other);
+}
+
 PyObject* Subscript(RepeatedCompositeContainer* self, PyObject* slice) {
   if (UpdateChildMessages(self) < 0) {
     return NULL;
@@ -239,6 +254,10 @@
   return PyObject_GetItem(self->child_messages, slice);
 }
 
+static PyObject* SubscriptMethod(PyObject* self, PyObject* slice) {
+  return Subscript(reinterpret_cast<RepeatedCompositeContainer*>(self), slice);
+}
+
 int AssignSubscript(RepeatedCompositeContainer* self,
                     PyObject* slice,
                     PyObject* value) {
@@ -262,15 +281,16 @@
     Py_ssize_t from;
     Py_ssize_t to;
     Py_ssize_t step;
-    Py_ssize_t length = Length(self);
+    Py_ssize_t length = Length(reinterpret_cast<PyObject*>(self));
     Py_ssize_t slicelength;
     if (PySlice_Check(slice)) {
 #if PY_MAJOR_VERSION >= 3
       if (PySlice_GetIndicesEx(slice,
+                               length, &from, &to, &step, &slicelength) == -1) {
 #else
       if (PySlice_GetIndicesEx(reinterpret_cast<PySliceObject*>(slice),
-#endif
                                length, &from, &to, &step, &slicelength) == -1) {
+#endif
         return -1;
       }
       return PySequence_DelSlice(self->child_messages, from, to);
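The clamping and normalization that `PySlice_GetIndicesEx` performs for the deletion path above is exposed in pure Python as `slice.indices(length)`; a quick sketch of the equivalence:

```python
# slice.indices(length) returns (start, stop, step) clamped to the
# container length, which is what PySlice_GetIndicesEx computes in C.
start, stop, step = slice(1, 100).indices(5)   # stop far past the end
assert (start, stop, step) == (1, 5, 1)

# Negative indices are normalized against the length as well.
start, stop, step = slice(-2, None).indices(5)
assert (start, stop, step) == (3, 5, 1)

# Deleting the resulting range from a list mirrors PySequence_DelSlice
# on the child_messages list.
items = [0, 1, 2, 3, 4]
del items[start:stop]
assert items == [0, 1, 2]
```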
@@ -286,7 +306,16 @@
   return 0;
 }
 
-static PyObject* Remove(RepeatedCompositeContainer* self, PyObject* value) {
+static int AssignSubscriptMethod(PyObject* self, PyObject* slice,
+                                 PyObject* value) {
+  return AssignSubscript(reinterpret_cast<RepeatedCompositeContainer*>(self),
+                         slice, value);
+}
+
+static PyObject* Remove(PyObject* pself, PyObject* value) {
+  RepeatedCompositeContainer* self =
+      reinterpret_cast<RepeatedCompositeContainer*>(pself);
+
   if (UpdateChildMessages(self) < 0) {
     return NULL;
   }
@@ -301,9 +330,10 @@
   Py_RETURN_NONE;
 }
 
-static PyObject* RichCompare(RepeatedCompositeContainer* self,
-                             PyObject* other,
-                             int opid) {
+static PyObject* RichCompare(PyObject* pself, PyObject* other, int opid) {
+  RepeatedCompositeContainer* self =
+      reinterpret_cast<RepeatedCompositeContainer*>(pself);
+
   if (UpdateChildMessages(self) < 0) {
     return NULL;
   }
@@ -336,6 +366,19 @@
   }
 }
 
+static PyObject* ToStr(PyObject* pself) {
+  ScopedPyObjectPtr full_slice(PySlice_New(NULL, NULL, NULL));
+  if (full_slice == NULL) {
+    return NULL;
+  }
+  ScopedPyObjectPtr list(Subscript(
+      reinterpret_cast<RepeatedCompositeContainer*>(pself), full_slice.get()));
+  if (list == NULL) {
+    return NULL;
+  }
+  return PyObject_Repr(list.get());
+}
+
 // ---------------------------------------------------------------------
 // sort()
 
@@ -343,7 +386,7 @@
   Message* message = self->message;
   const Reflection* reflection = message->GetReflection();
   const FieldDescriptor* descriptor = self->parent_field_descriptor;
-  const Py_ssize_t length = Length(self);
+  const Py_ssize_t length = Length(reinterpret_cast<PyObject*>(self));
 
   // Since Python protobuf objects are never arena-allocated, adding and
   // removing message pointers to the underlying array is just updating
@@ -366,7 +409,7 @@
   ScopedPyObjectPtr m(PyObject_GetAttrString(self->child_messages, "sort"));
   if (m == NULL)
     return -1;
-  if (PyObject_Call(m.get(), args, kwds) == NULL)
+  if (ScopedPyObjectPtr(PyObject_Call(m.get(), args, kwds)) == NULL)
     return -1;
   if (self->message != NULL) {
     ReorderAttached(self);
@@ -374,9 +417,10 @@
   return 0;
 }
 
-static PyObject* Sort(RepeatedCompositeContainer* self,
-                      PyObject* args,
-                      PyObject* kwds) {
+static PyObject* Sort(PyObject* pself, PyObject* args, PyObject* kwds) {
+  RepeatedCompositeContainer* self =
+      reinterpret_cast<RepeatedCompositeContainer*>(pself);
+
   // Support the old sort_function argument for backwards
   // compatibility.
   if (kwds != NULL) {
@@ -400,11 +444,14 @@
 
 // ---------------------------------------------------------------------
 
-static PyObject* Item(RepeatedCompositeContainer* self, Py_ssize_t index) {
+static PyObject* Item(PyObject* pself, Py_ssize_t index) {
+  RepeatedCompositeContainer* self =
+      reinterpret_cast<RepeatedCompositeContainer*>(pself);
+
   if (UpdateChildMessages(self) < 0) {
     return NULL;
   }
-  Py_ssize_t length = Length(self);
+  Py_ssize_t length = Length(pself);
   if (index < 0) {
     index = length + index;
   }
@@ -416,17 +463,17 @@
   return item;
 }
 
-static PyObject* Pop(RepeatedCompositeContainer* self,
-                     PyObject* args) {
+static PyObject* Pop(PyObject* pself, PyObject* args) {
+  RepeatedCompositeContainer* self =
+      reinterpret_cast<RepeatedCompositeContainer*>(pself);
+
   Py_ssize_t index = -1;
   if (!PyArg_ParseTuple(args, "|n", &index)) {
     return NULL;
   }
-  PyObject* item = Item(self, index);
+  PyObject* item = Item(pself, index);
   if (item == NULL) {
-    PyErr_Format(PyExc_IndexError,
-                 "list index (%zd) out of range",
-                 index);
+    PyErr_Format(PyExc_IndexError, "list index (%zd) out of range", index);
     return NULL;
   }
   ScopedPyObjectPtr py_index(PyLong_FromSsize_t(index));
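`Pop` above composes two existing operations: it reuses `Item` for index normalization and bounds checking, then removes the element by index. The same composition in Python terms, with a hypothetical list-backed container:

```python
# Sketch of the Pop-via-Item composition; PoppableSketch is illustrative,
# not the real container type.
class PoppableSketch:
    def __init__(self, items):
        self._items = list(items)

    def __getitem__(self, index):
        # Normalize a negative index against the length, as Item does,
        # and raise IndexError for out-of-range access.
        n = len(self._items)
        if index < 0:
            index += n
        if not 0 <= index < n:
            raise IndexError("list index (%d) out of range" % index)
        return self._items[index]

    def pop(self, index=-1):
        # Fetch first (this validates the index), then remove in place.
        item = self[index]
        if index < 0:
            index += len(self._items)
        del self._items[index]
        return item

c = PoppableSketch(["a", "b", "c"])
assert c.pop() == "c"
assert c.pop(0) == "a"
```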
@@ -444,7 +491,7 @@
   GOOGLE_CHECK_NOTNULL(field);
   GOOGLE_CHECK_NOTNULL(target);
 
-  shared_ptr<Message> released_message(
+  CMessage::OwnerRef released_message(
       parent->message->GetReflection()->ReleaseLast(parent->message, field));
   // TODO(tibell): Deal with proto1.
 
@@ -487,8 +534,37 @@
   return 0;
 }
 
+PyObject* DeepCopy(PyObject* pself, PyObject* arg) {
+  RepeatedCompositeContainer* self =
+      reinterpret_cast<RepeatedCompositeContainer*>(pself);
+
+  ScopedPyObjectPtr cloneObj(
+      PyType_GenericAlloc(&RepeatedCompositeContainer_Type, 0));
+  if (cloneObj == NULL) {
+    return NULL;
+  }
+  RepeatedCompositeContainer* clone =
+      reinterpret_cast<RepeatedCompositeContainer*>(cloneObj.get());
+
+  Message* new_message = self->message->New();
+  clone->parent = NULL;
+  clone->parent_field_descriptor = self->parent_field_descriptor;
+  clone->message = new_message;
+  clone->owner.reset(new_message);
+  Py_INCREF(self->child_message_class);
+  clone->child_message_class = self->child_message_class;
+  clone->child_messages = PyList_New(0);
+
+  new_message->GetReflection()
+      ->GetMutableRepeatedFieldRef<Message>(new_message,
+                                            self->parent_field_descriptor)
+      .MergeFrom(self->message->GetReflection()->GetRepeatedFieldRef<Message>(
+          *self->message, self->parent_field_descriptor));
+  return cloneObj.release();
+}
+
 int SetOwner(RepeatedCompositeContainer* self,
-             const shared_ptr<Message>& new_owner) {
+             const CMessage::OwnerRef& new_owner) {
   GOOGLE_CHECK_ATTACHED(self);
 
   self->owner = new_owner;
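The `DeepCopy` added above allocates a detached clone that owns a fresh message, then merges the repeated field into it, which is the shape Python's `copy.deepcopy` expects from a `__deepcopy__` hook. A Python-level sketch with a hypothetical container (not the real API):

```python
import copy

# DeepCopySketch stands in for a container of mutable child elements.
class DeepCopySketch:
    def __init__(self, items=()):
        # Each element is itself mutable, so a shallow copy would alias.
        self._items = [list(x) for x in items]

    def __deepcopy__(self, memo):
        # Build a detached clone, then copy element-by-element, mirroring
        # the MergeFrom into the fresh message in the C++ DeepCopy.
        clone = DeepCopySketch()
        clone._items = copy.deepcopy(self._items, memo)
        return clone

orig = DeepCopySketch([[1], [2]])
dup = copy.deepcopy(orig)
dup._items[0].append(99)
assert orig._items[0] == [1]   # the original is unaffected
```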
@@ -506,7 +582,7 @@
 PyObject *NewContainer(
     CMessage* parent,
     const FieldDescriptor* parent_field_descriptor,
-    PyObject *concrete_class) {
+    CMessageClass* concrete_class) {
   if (!CheckFieldBelongsToMessage(parent_field_descriptor, parent->message)) {
     return NULL;
   }
@@ -523,47 +599,52 @@
   self->parent_field_descriptor = parent_field_descriptor;
   self->owner = parent->owner;
   Py_INCREF(concrete_class);
-  self->subclass_init = concrete_class;
+  self->child_message_class = concrete_class;
   self->child_messages = PyList_New(0);
 
   return reinterpret_cast<PyObject*>(self);
 }
 
-static void Dealloc(RepeatedCompositeContainer* self) {
+static void Dealloc(PyObject* pself) {
+  RepeatedCompositeContainer* self =
+      reinterpret_cast<RepeatedCompositeContainer*>(pself);
+
   Py_CLEAR(self->child_messages);
-  Py_CLEAR(self->subclass_init);
+  Py_CLEAR(self->child_message_class);
   // TODO(tibell): Do we need to call delete on these objects to make
   // sure their destructors are called?
   self->owner.reset();
 
-  Py_TYPE(self)->tp_free(reinterpret_cast<PyObject*>(self));
+  Py_TYPE(self)->tp_free(pself);
 }
 
 static PySequenceMethods SqMethods = {
-  (lenfunc)Length,        /* sq_length */
-  0, /* sq_concat */
-  0, /* sq_repeat */
-  (ssizeargfunc)Item /* sq_item */
+  Length,                 /* sq_length */
+  0,                      /* sq_concat */
+  0,                      /* sq_repeat */
+  Item                    /* sq_item */
 };
 
 static PyMappingMethods MpMethods = {
-  (lenfunc)Length,               /* mp_length */
-  (binaryfunc)Subscript,      /* mp_subscript */
-  (objobjargproc)AssignSubscript,/* mp_ass_subscript */
+  Length,                 /* mp_length */
+  SubscriptMethod,        /* mp_subscript */
+  AssignSubscriptMethod,  /* mp_ass_subscript */
 };
 
 static PyMethodDef Methods[] = {
-  { "add", (PyCFunction) Add, METH_VARARGS | METH_KEYWORDS,
+  { "__deepcopy__", DeepCopy, METH_VARARGS,
+    "Makes a deep copy of the class." },
+  { "add", (PyCFunction)AddMethod, METH_VARARGS | METH_KEYWORDS,
     "Adds an object to the repeated container." },
-  { "extend", (PyCFunction) Extend, METH_O,
+  { "extend", ExtendMethod, METH_O,
     "Adds objects to the repeated container." },
-  { "pop", (PyCFunction)Pop, METH_VARARGS,
+  { "pop", Pop, METH_VARARGS,
     "Removes an object from the repeated container and returns it." },
-  { "remove", (PyCFunction) Remove, METH_O,
+  { "remove", Remove, METH_O,
     "Removes an object from the repeated container." },
-  { "sort", (PyCFunction) Sort, METH_VARARGS | METH_KEYWORDS,
+  { "sort", (PyCFunction)Sort, METH_VARARGS | METH_KEYWORDS,
     "Sorts the repeated container." },
-  { "MergeFrom", (PyCFunction) MergeFrom, METH_O,
+  { "MergeFrom", MergeFromMethod, METH_O,
     "Adds objects to the repeated container." },
   { NULL, NULL }
 };
@@ -575,12 +656,12 @@
   FULL_MODULE_NAME ".RepeatedCompositeContainer",  // tp_name
   sizeof(RepeatedCompositeContainer),  // tp_basicsize
   0,                                   //  tp_itemsize
-  (destructor)repeated_composite_container::Dealloc,  //  tp_dealloc
+  repeated_composite_container::Dealloc,  //  tp_dealloc
   0,                                   //  tp_print
   0,                                   //  tp_getattr
   0,                                   //  tp_setattr
   0,                                   //  tp_compare
-  0,                                   //  tp_repr
+  repeated_composite_container::ToStr,      //  tp_repr
   0,                                   //  tp_as_number
   &repeated_composite_container::SqMethods,   //  tp_as_sequence
   &repeated_composite_container::MpMethods,   //  tp_as_mapping
@@ -594,7 +675,7 @@
   "A Repeated scalar container",       //  tp_doc
   0,                                   //  tp_traverse
   0,                                   //  tp_clear
-  (richcmpfunc)repeated_composite_container::RichCompare,  //  tp_richcompare
+  repeated_composite_container::RichCompare,  //  tp_richcompare
   0,                                   //  tp_weaklistoffset
   0,                                   //  tp_iter
   0,                                   //  tp_iternext
diff --git a/python/google/protobuf/pyext/repeated_composite_container.h b/python/google/protobuf/pyext/repeated_composite_container.h
index 58d37b0..e5e946a 100644
--- a/python/google/protobuf/pyext/repeated_composite_container.h
+++ b/python/google/protobuf/pyext/repeated_composite_container.h
@@ -37,27 +37,20 @@
 #include <Python.h>
 
 #include <memory>
-#ifndef _SHARED_PTR_H
-#include <google/protobuf/stubs/shared_ptr.h>
-#endif
 #include <string>
 #include <vector>
 
+#include <google/protobuf/pyext/message.h>
+
 namespace google {
 namespace protobuf {
 
 class FieldDescriptor;
 class Message;
 
-#ifdef _SHARED_PTR_H
-using std::shared_ptr;
-#else
-using internal::shared_ptr;
-#endif
-
 namespace python {
 
-struct CMessage;
+struct CMessageClass;
 
 // A RepeatedCompositeContainer can be in one of two states: attached
 // or released.
@@ -76,7 +69,7 @@
   // proto tree.  Every Python RepeatedCompositeContainer holds a
   // reference to it in order to keep it alive as long as there's a
   // Python object that references any part of the tree.
-  shared_ptr<Message> owner;
+  CMessage::OwnerRef owner;
 
   // Weak reference to parent object. May be NULL. Used to make sure
   // the parent is writable before modifying the
@@ -94,8 +87,8 @@
   // calling Clear() or ClearField() on the parent.
   Message* message;
 
-  // A callable that is used to create new child messages.
-  PyObject* subclass_init;
+  // The type used to create new child messages.
+  CMessageClass* child_message_class;
 
   // A list of child messages.
   PyObject* child_messages;
@@ -110,7 +103,7 @@
 PyObject *NewContainer(
     CMessage* parent,
     const FieldDescriptor* parent_field_descriptor,
-    PyObject *concrete_class);
+    CMessageClass *child_message_class);
 
 // Appends a new CMessage to the container and returns it.  The
 // CMessage is initialized using the content of kwargs.
@@ -147,11 +140,6 @@
                     PyObject* slice,
                     PyObject* value);
 
-// Releases the messages in the container to the given message.
-//
-// Returns 0 on success, -1 on failure.
-int ReleaseToMessage(RepeatedCompositeContainer* self, Message* new_message);
-
 // Releases the messages in the container to a new message.
 //
 // Returns 0 on success, -1 on failure.
@@ -159,7 +147,7 @@
 
 // Returns 0 on success, -1 on failure.
 int SetOwner(RepeatedCompositeContainer* self,
-             const shared_ptr<Message>& new_owner);
+             const CMessage::OwnerRef& new_owner);
 
 // Removes the last element of the repeated message field 'field' on
 // the Message 'parent', and transfers the ownership of the released
diff --git a/python/google/protobuf/pyext/repeated_scalar_container.cc b/python/google/protobuf/pyext/repeated_scalar_container.cc
index 95da85f..de3b6e1 100644
--- a/python/google/protobuf/pyext/repeated_scalar_container.cc
+++ b/python/google/protobuf/pyext/repeated_scalar_container.cc
@@ -34,9 +34,6 @@
 #include <google/protobuf/pyext/repeated_scalar_container.h>
 
 #include <memory>
-#ifndef _SHARED_PTR_H
-#include <google/protobuf/stubs/shared_ptr.h>
-#endif
 
 #include <google/protobuf/stubs/common.h>
 #include <google/protobuf/stubs/logging.h>
@@ -77,15 +74,18 @@
   return 0;
 }
 
-static Py_ssize_t Len(RepeatedScalarContainer* self) {
+static Py_ssize_t Len(PyObject* pself) {
+  RepeatedScalarContainer* self =
+      reinterpret_cast<RepeatedScalarContainer*>(pself);
   Message* message = self->message;
   return message->GetReflection()->FieldSize(*message,
                                              self->parent_field_descriptor);
 }
 
-static int AssignItem(RepeatedScalarContainer* self,
-                      Py_ssize_t index,
-                      PyObject* arg) {
+static int AssignItem(PyObject* pself, Py_ssize_t index, PyObject* arg) {
+  RepeatedScalarContainer* self =
+      reinterpret_cast<RepeatedScalarContainer*>(pself);
+
   cmessage::AssureWritable(self->parent);
   Message* message = self->message;
   const FieldDescriptor* field_descriptor = self->parent_field_descriptor;
@@ -188,7 +188,10 @@
   return 0;
 }
 
-static PyObject* Item(RepeatedScalarContainer* self, Py_ssize_t index) {
+static PyObject* Item(PyObject* pself, Py_ssize_t index) {
+  RepeatedScalarContainer* self =
+      reinterpret_cast<RepeatedScalarContainer*>(pself);
+
   Message* message = self->message;
   const FieldDescriptor* field_descriptor = self->parent_field_descriptor;
   const Reflection* reflection = message->GetReflection();
@@ -256,27 +259,12 @@
       break;
     }
     case FieldDescriptor::CPPTYPE_STRING: {
-      string value = reflection->GetRepeatedString(
-          *message, field_descriptor, index);
+      string scratch;
+      const string& value = reflection->GetRepeatedStringReference(
+          *message, field_descriptor, index, &scratch);
       result = ToStringObject(field_descriptor, value);
       break;
     }
-    case FieldDescriptor::CPPTYPE_MESSAGE: {
-      PyObject* py_cmsg = PyObject_CallObject(reinterpret_cast<PyObject*>(
-          &CMessage_Type), NULL);
-      if (py_cmsg == NULL) {
-        return NULL;
-      }
-      CMessage* cmsg = reinterpret_cast<CMessage*>(py_cmsg);
-      const Message& msg = reflection->GetRepeatedMessage(
-          *message, field_descriptor, index);
-      cmsg->owner = self->owner;
-      cmsg->parent = self->parent;
-      cmsg->message = const_cast<Message*>(&msg);
-      cmsg->read_only = false;
-      result = reinterpret_cast<PyObject*>(py_cmsg);
-      break;
-    }
     default:
       PyErr_Format(
           PyExc_SystemError,
@@ -287,7 +275,7 @@
   return result;
 }
 
-static PyObject* Subscript(RepeatedScalarContainer* self, PyObject* slice) {
+static PyObject* Subscript(PyObject* pself, PyObject* slice) {
   Py_ssize_t from;
   Py_ssize_t to;
   Py_ssize_t step;
@@ -302,13 +290,14 @@
   if (PyLong_Check(slice)) {
     from = to = PyLong_AsLong(slice);
   } else if (PySlice_Check(slice)) {
-    length = Len(self);
+    length = Len(pself);
 #if PY_MAJOR_VERSION >= 3
     if (PySlice_GetIndicesEx(slice,
+                             length, &from, &to, &step, &slicelength) == -1) {
 #else
     if (PySlice_GetIndicesEx(reinterpret_cast<PySliceObject*>(slice),
-#endif
                              length, &from, &to, &step, &slicelength) == -1) {
+#endif
       return NULL;
     }
     return_list = true;
@@ -318,7 +307,7 @@
   }
 
   if (!return_list) {
-    return Item(self, from);
+    return Item(pself, from);
   }
 
   PyObject* list = PyList_New(0);
@@ -333,7 +322,7 @@
       if (index < 0 || index >= length) {
         break;
       }
-      ScopedPyObjectPtr s(Item(self, index));
+      ScopedPyObjectPtr s(Item(pself, index));
       PyList_Append(list, s.get());
     }
   } else {
@@ -344,7 +333,7 @@
       if (index < 0 || index >= length) {
         break;
       }
-      ScopedPyObjectPtr s(Item(self, index));
+      ScopedPyObjectPtr s(Item(pself, index));
       PyList_Append(list, s.get());
     }
   }
@@ -431,9 +420,14 @@
   Py_RETURN_NONE;
 }
 
-static int AssSubscript(RepeatedScalarContainer* self,
-                        PyObject* slice,
-                        PyObject* value) {
+static PyObject* AppendMethod(PyObject* self, PyObject* item) {
+  return Append(reinterpret_cast<RepeatedScalarContainer*>(self), item);
+}
+
+static int AssSubscript(PyObject* pself, PyObject* slice, PyObject* value) {
+  RepeatedScalarContainer* self =
+      reinterpret_cast<RepeatedScalarContainer*>(pself);
+
   Py_ssize_t from;
   Py_ssize_t to;
   Py_ssize_t step;
@@ -449,7 +443,7 @@
 #if PY_MAJOR_VERSION < 3
   if (PyInt_Check(slice)) {
     from = to = PyInt_AsLong(slice);
-  } else
+  } else  // NOLINT
 #endif
   if (PyLong_Check(slice)) {
     from = to = PyLong_AsLong(slice);
@@ -458,10 +452,11 @@
     length = reflection->FieldSize(*message, field_descriptor);
 #if PY_MAJOR_VERSION >= 3
     if (PySlice_GetIndicesEx(slice,
+                             length, &from, &to, &step, &slicelength) == -1) {
 #else
     if (PySlice_GetIndicesEx(reinterpret_cast<PySliceObject*>(slice),
-#endif
                              length, &from, &to, &step, &slicelength) == -1) {
+#endif
       return -1;
     }
     create_list = true;
@@ -476,14 +471,14 @@
   }
 
   if (!create_list) {
-    return AssignItem(self, from, value);
+    return AssignItem(pself, from, value);
   }
 
   ScopedPyObjectPtr full_slice(PySlice_New(NULL, NULL, NULL));
   if (full_slice == NULL) {
     return -1;
   }
-  ScopedPyObjectPtr new_list(Subscript(self, full_slice.get()));
+  ScopedPyObjectPtr new_list(Subscript(pself, full_slice.get()));
   if (new_list == NULL) {
     return -1;
   }
@@ -522,14 +517,17 @@
   Py_RETURN_NONE;
 }
 
-static PyObject* Insert(RepeatedScalarContainer* self, PyObject* args) {
+static PyObject* Insert(PyObject* pself, PyObject* args) {
+  RepeatedScalarContainer* self =
+      reinterpret_cast<RepeatedScalarContainer*>(pself);
+
   Py_ssize_t index;
   PyObject* value;
   if (!PyArg_ParseTuple(args, "lO", &index, &value)) {
     return NULL;
   }
   ScopedPyObjectPtr full_slice(PySlice_New(NULL, NULL, NULL));
-  ScopedPyObjectPtr new_list(Subscript(self, full_slice.get()));
+  ScopedPyObjectPtr new_list(Subscript(pself, full_slice.get()));
   if (PyList_Insert(new_list.get(), index, value) < 0) {
     return NULL;
   }
@@ -540,10 +538,10 @@
   Py_RETURN_NONE;
 }
 
-static PyObject* Remove(RepeatedScalarContainer* self, PyObject* value) {
+static PyObject* Remove(PyObject* pself, PyObject* value) {
   Py_ssize_t match_index = -1;
-  for (Py_ssize_t i = 0; i < Len(self); ++i) {
-    ScopedPyObjectPtr elem(Item(self, i));
+  for (Py_ssize_t i = 0; i < Len(pself); ++i) {
+    ScopedPyObjectPtr elem(Item(pself, i));
     if (PyObject_RichCompareBool(elem.get(), value, Py_EQ)) {
       match_index = i;
       break;
@@ -553,15 +551,17 @@
     PyErr_SetString(PyExc_ValueError, "remove(x): x not in container");
     return NULL;
   }
-  if (AssignItem(self, match_index, NULL) < 0) {
+  if (AssignItem(pself, match_index, NULL) < 0) {
     return NULL;
   }
   Py_RETURN_NONE;
 }
 
-static PyObject* RichCompare(RepeatedScalarContainer* self,
-                             PyObject* other,
-                             int opid) {
+static PyObject* ExtendMethod(PyObject* self, PyObject* value) {
+  return Extend(reinterpret_cast<RepeatedScalarContainer*>(self), value);
+}
+
+static PyObject* RichCompare(PyObject* pself, PyObject* other, int opid) {
   if (opid != Py_EQ && opid != Py_NE) {
     Py_INCREF(Py_NotImplemented);
     return Py_NotImplemented;
@@ -578,28 +578,25 @@
 
   ScopedPyObjectPtr other_list_deleter;
   if (PyObject_TypeCheck(other, &RepeatedScalarContainer_Type)) {
-    other_list_deleter.reset(Subscript(
-        reinterpret_cast<RepeatedScalarContainer*>(other), full_slice.get()));
+    other_list_deleter.reset(Subscript(other, full_slice.get()));
     other = other_list_deleter.get();
   }
 
-  ScopedPyObjectPtr list(Subscript(self, full_slice.get()));
+  ScopedPyObjectPtr list(Subscript(pself, full_slice.get()));
   if (list == NULL) {
     return NULL;
   }
   return PyObject_RichCompare(list.get(), other, opid);
 }
 
-PyObject* Reduce(RepeatedScalarContainer* unused_self) {
+PyObject* Reduce(PyObject* unused_self, PyObject* unused_other) {
   PyErr_Format(
       PickleError_class,
       "can't pickle repeated message fields, convert to list first");
   return NULL;
 }
 
-static PyObject* Sort(RepeatedScalarContainer* self,
-                      PyObject* args,
-                      PyObject* kwds) {
+static PyObject* Sort(PyObject* pself, PyObject* args, PyObject* kwds) {
   // Support the old sort_function argument for backwards
   // compatibility.
   if (kwds != NULL) {
@@ -618,7 +615,7 @@
   if (full_slice == NULL) {
     return NULL;
   }
-  ScopedPyObjectPtr list(Subscript(self, full_slice.get()));
+  ScopedPyObjectPtr list(Subscript(pself, full_slice.get()));
   if (list == NULL) {
     return NULL;
   }
@@ -630,32 +627,42 @@
   if (res == NULL) {
     return NULL;
   }
-  int ret = InternalAssignRepeatedField(self, list.get());
+  int ret = InternalAssignRepeatedField(
+      reinterpret_cast<RepeatedScalarContainer*>(pself), list.get());
   if (ret < 0) {
     return NULL;
   }
   Py_RETURN_NONE;
 }
 
-static PyObject* Pop(RepeatedScalarContainer* self,
-                     PyObject* args) {
+static PyObject* Pop(PyObject* pself, PyObject* args) {
   Py_ssize_t index = -1;
   if (!PyArg_ParseTuple(args, "|n", &index)) {
     return NULL;
   }
-  PyObject* item = Item(self, index);
+  PyObject* item = Item(pself, index);
   if (item == NULL) {
-    PyErr_Format(PyExc_IndexError,
-                 "list index (%zd) out of range",
-                 index);
+    PyErr_Format(PyExc_IndexError, "list index (%zd) out of range", index);
     return NULL;
   }
-  if (AssignItem(self, index, NULL) < 0) {
+  if (AssignItem(pself, index, NULL) < 0) {
     return NULL;
   }
   return item;
 }
 
+static PyObject* ToStr(PyObject* pself) {
+  ScopedPyObjectPtr full_slice(PySlice_New(NULL, NULL, NULL));
+  if (full_slice == NULL) {
+    return NULL;
+  }
+  ScopedPyObjectPtr list(Subscript(pself, full_slice.get()));
+  if (list == NULL) {
+    return NULL;
+  }
+  return PyObject_Repr(list.get());
+}
+
 // The private constructor of RepeatedScalarContainer objects.
 PyObject *NewContainer(
     CMessage* parent, const FieldDescriptor* parent_field_descriptor) {
@@ -688,7 +695,8 @@
   if (full_slice == NULL) {
     return -1;
   }
-  ScopedPyObjectPtr values(Subscript(from, full_slice.get()));
+  ScopedPyObjectPtr values(
+      Subscript(reinterpret_cast<PyObject*>(from), full_slice.get()));
   if (values == NULL) {
     return -1;
   }
@@ -707,7 +715,10 @@
   return InitializeAndCopyToParentContainer(self, self);
 }
 
-PyObject* DeepCopy(RepeatedScalarContainer* self, PyObject* arg) {
+PyObject* DeepCopy(PyObject* pself, PyObject* arg) {
+  RepeatedScalarContainer* self =
+      reinterpret_cast<RepeatedScalarContainer*>(pself);
+
   RepeatedScalarContainer* clone = reinterpret_cast<RepeatedScalarContainer*>(
       PyType_GenericAlloc(&RepeatedScalarContainer_Type, 0));
   if (clone == NULL) {
@@ -721,45 +732,47 @@
   return reinterpret_cast<PyObject*>(clone);
 }
 
-static void Dealloc(RepeatedScalarContainer* self) {
+static void Dealloc(PyObject* pself) {
+  RepeatedScalarContainer* self =
+      reinterpret_cast<RepeatedScalarContainer*>(pself);
   self->owner.reset();
-  Py_TYPE(self)->tp_free(reinterpret_cast<PyObject*>(self));
+  Py_TYPE(self)->tp_free(pself);
 }
 
 void SetOwner(RepeatedScalarContainer* self,
-              const shared_ptr<Message>& new_owner) {
+              const CMessage::OwnerRef& new_owner) {
   self->owner = new_owner;
 }
 
 static PySequenceMethods SqMethods = {
-  (lenfunc)Len,           /* sq_length */
-  0, /* sq_concat */
-  0, /* sq_repeat */
-  (ssizeargfunc)Item, /* sq_item */
-  0, /* sq_slice */
-  (ssizeobjargproc)AssignItem /* sq_ass_item */
+  Len,        /* sq_length */
+  0,          /* sq_concat */
+  0,          /* sq_repeat */
+  Item,       /* sq_item */
+  0,          /* sq_slice */
+  AssignItem  /* sq_ass_item */
 };
 
 static PyMappingMethods MpMethods = {
-  (lenfunc)Len,               /* mp_length */
-  (binaryfunc)Subscript,      /* mp_subscript */
-  (objobjargproc)AssSubscript, /* mp_ass_subscript */
+  Len,           /* mp_length */
+  Subscript,     /* mp_subscript */
+  AssSubscript,  /* mp_ass_subscript */
 };
 
 static PyMethodDef Methods[] = {
-  { "__deepcopy__", (PyCFunction)DeepCopy, METH_VARARGS,
+  { "__deepcopy__", DeepCopy, METH_VARARGS,
     "Makes a deep copy of the class." },
-  { "__reduce__", (PyCFunction)Reduce, METH_NOARGS,
+  { "__reduce__", Reduce, METH_NOARGS,
     "Outputs picklable representation of the repeated field." },
-  { "append", (PyCFunction)Append, METH_O,
+  { "append", AppendMethod, METH_O,
     "Appends an object to the repeated container." },
-  { "extend", (PyCFunction)Extend, METH_O,
+  { "extend", ExtendMethod, METH_O,
     "Appends objects to the repeated container." },
-  { "insert", (PyCFunction)Insert, METH_VARARGS,
-    "Appends objects to the repeated container." },
-  { "pop", (PyCFunction)Pop, METH_VARARGS,
+  { "insert", Insert, METH_VARARGS,
+    "Inserts an object at the specified position in the container." },
+  { "pop", Pop, METH_VARARGS,
     "Removes an object from the repeated container and returns it." },
-  { "remove", (PyCFunction)Remove, METH_O,
+  { "remove", Remove, METH_O,
     "Removes an object from the repeated container." },
   { "sort", (PyCFunction)Sort, METH_VARARGS | METH_KEYWORDS,
     "Sorts the repeated container."},
@@ -773,12 +786,12 @@
   FULL_MODULE_NAME ".RepeatedScalarContainer",  // tp_name
   sizeof(RepeatedScalarContainer),     // tp_basicsize
   0,                                   //  tp_itemsize
-  (destructor)repeated_scalar_container::Dealloc,  //  tp_dealloc
+  repeated_scalar_container::Dealloc,  //  tp_dealloc
   0,                                   //  tp_print
   0,                                   //  tp_getattr
   0,                                   //  tp_setattr
   0,                                   //  tp_compare
-  0,                                   //  tp_repr
+  repeated_scalar_container::ToStr,    //  tp_repr
   0,                                   //  tp_as_number
   &repeated_scalar_container::SqMethods,   //  tp_as_sequence
   &repeated_scalar_container::MpMethods,   //  tp_as_mapping
@@ -792,7 +805,7 @@
   "A Repeated scalar container",       //  tp_doc
   0,                                   //  tp_traverse
   0,                                   //  tp_clear
-  (richcmpfunc)repeated_scalar_container::RichCompare,  //  tp_richcompare
+  repeated_scalar_container::RichCompare,  //  tp_richcompare
   0,                                   //  tp_weaklistoffset
   0,                                   //  tp_iter
   0,                                   //  tp_iternext
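The integer-versus-slice dispatch that `Subscript` and `AssSubscript` implement above can be restated in plain Python; `subscript` below is an illustrative stand-in, not protobuf API:

```python
def subscript(seq, key):
    # Integer keys return a single element; slice keys build a new list by
    # repeated single-item lookups, mirroring the Item() loop in Subscript().
    if isinstance(key, int):
        return seq[key]
    if isinstance(key, slice):
        start, stop, step = key.indices(len(seq))
        return [seq[i] for i in range(start, stop, step)]
    raise TypeError("indices must be integers or slices")
```

A negative-step slice such as `[::-1]` walks the container backwards, which is why the C++ code keeps separate ascending and descending loops.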
diff --git a/python/google/protobuf/pyext/repeated_scalar_container.h b/python/google/protobuf/pyext/repeated_scalar_container.h
index 555e621..559dec9 100644
--- a/python/google/protobuf/pyext/repeated_scalar_container.h
+++ b/python/google/protobuf/pyext/repeated_scalar_container.h
@@ -37,27 +37,14 @@
 #include <Python.h>
 
 #include <memory>
-#ifndef _SHARED_PTR_H
-#include <google/protobuf/stubs/shared_ptr.h>
-#endif
 
 #include <google/protobuf/descriptor.h>
+#include <google/protobuf/pyext/message.h>
 
 namespace google {
 namespace protobuf {
-
-class Message;
-
-#ifdef _SHARED_PTR_H
-using std::shared_ptr;
-#else
-using internal::shared_ptr;
-#endif
-
 namespace python {
 
-struct CMessage;
-
 typedef struct RepeatedScalarContainer {
   PyObject_HEAD;
 
@@ -65,7 +52,7 @@
   // proto tree.  Every Python RepeatedScalarContainer holds a
   // reference to it in order to keep it alive as long as there's a
   // Python object that references any part of the tree.
-  shared_ptr<Message> owner;
+  CMessage::OwnerRef owner;
 
   // Pointer to the C++ Message that contains this container.  The
   // RepeatedScalarContainer does not own this pointer.
@@ -112,7 +99,7 @@
 
 // Set the owner field of self and any children of self.
 void SetOwner(RepeatedScalarContainer* self,
-              const shared_ptr<Message>& new_owner);
+              const CMessage::OwnerRef& new_owner);
 
 }  // namespace repeated_scalar_container
 }  // namespace python
diff --git a/python/google/protobuf/pyext/safe_numerics.h b/python/google/protobuf/pyext/safe_numerics.h
new file mode 100644
index 0000000..639ba2c
--- /dev/null
+++ b/python/google/protobuf/pyext/safe_numerics.h
@@ -0,0 +1,164 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// https://developers.google.com/protocol-buffers/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+#ifndef GOOGLE_PROTOBUF_PYTHON_CPP_SAFE_NUMERICS_H__
+#define GOOGLE_PROTOBUF_PYTHON_CPP_SAFE_NUMERICS_H__
+// Copied from chromium with only changes to the namespace.
+
+#include <limits>
+
+#include <google/protobuf/stubs/logging.h>
+#include <google/protobuf/stubs/common.h>
+
+namespace google {
+namespace protobuf {
+namespace python {
+
+template <bool SameSize, bool DestLarger,
+          bool DestIsSigned, bool SourceIsSigned>
+struct IsValidNumericCastImpl;
+
+#define BASE_NUMERIC_CAST_CASE_SPECIALIZATION(A, B, C, D, Code) \
+template <> struct IsValidNumericCastImpl<A, B, C, D> { \
+  template <class Source, class DestBounds> static inline bool Test( \
+      Source source, DestBounds min, DestBounds max) { \
+    return Code; \
+  } \
+}
+
+#define BASE_NUMERIC_CAST_CASE_SAME_SIZE(DestSigned, SourceSigned, Code) \
+  BASE_NUMERIC_CAST_CASE_SPECIALIZATION( \
+      true, true, DestSigned, SourceSigned, Code); \
+  BASE_NUMERIC_CAST_CASE_SPECIALIZATION( \
+      true, false, DestSigned, SourceSigned, Code)
+
+#define BASE_NUMERIC_CAST_CASE_SOURCE_LARGER(DestSigned, SourceSigned, Code) \
+  BASE_NUMERIC_CAST_CASE_SPECIALIZATION( \
+      false, false, DestSigned, SourceSigned, Code); \
+
+#define BASE_NUMERIC_CAST_CASE_DEST_LARGER(DestSigned, SourceSigned, Code) \
+  BASE_NUMERIC_CAST_CASE_SPECIALIZATION( \
+      false, true, DestSigned, SourceSigned, Code); \
+
+// The three top level cases are:
+// - Same size
+// - Source larger
+// - Dest larger
+// And for each of those three cases, we handle the 4 different possibilities
+// of signed and unsigned. This gives 12 cases to handle, which we enumerate
+// below.
+//
+// The last argument in each of the macros is the actual comparison code. It
+// has three arguments available: source (the value being cast), and min/max,
+// which are the bounds of the destination type.
+
+
+// These are the cases where both types have the same size.
+
+// Both signed.
+BASE_NUMERIC_CAST_CASE_SAME_SIZE(true, true, true);
+// Both unsigned.
+BASE_NUMERIC_CAST_CASE_SAME_SIZE(false, false, true);
+// Dest unsigned, Source signed.
+BASE_NUMERIC_CAST_CASE_SAME_SIZE(false, true, source >= 0);
+// Dest signed, Source unsigned.
+// This cast is OK because Dest's max must be less than Source's.
+BASE_NUMERIC_CAST_CASE_SAME_SIZE(true, false,
+                                 source <= static_cast<Source>(max));
+
+
+// These are the cases where Source is larger.
+
+// Both unsigned.
+BASE_NUMERIC_CAST_CASE_SOURCE_LARGER(false, false, source <= max);
+// Both signed.
+BASE_NUMERIC_CAST_CASE_SOURCE_LARGER(true, true,
+                                     source >= min && source <= max);
+// Dest is unsigned, Source is signed.
+BASE_NUMERIC_CAST_CASE_SOURCE_LARGER(false, true,
+                                     source >= 0 && source <= max);
+// Dest is signed, Source is unsigned.
+// This cast is OK because Dest's max must be less than Source's.
+BASE_NUMERIC_CAST_CASE_SOURCE_LARGER(true, false,
+                                     source <= static_cast<Source>(max));
+
+
+// These are the cases where Dest is larger.
+
+// Both unsigned.
+BASE_NUMERIC_CAST_CASE_DEST_LARGER(false, false, true);
+// Both signed.
+BASE_NUMERIC_CAST_CASE_DEST_LARGER(true, true, true);
+// Dest is unsigned, Source is signed.
+BASE_NUMERIC_CAST_CASE_DEST_LARGER(false, true, source >= 0);
+// Dest is signed, Source is unsigned.
+BASE_NUMERIC_CAST_CASE_DEST_LARGER(true, false, true);
+
+#undef BASE_NUMERIC_CAST_CASE_SPECIALIZATION
+#undef BASE_NUMERIC_CAST_CASE_SAME_SIZE
+#undef BASE_NUMERIC_CAST_CASE_SOURCE_LARGER
+#undef BASE_NUMERIC_CAST_CASE_DEST_LARGER
+
+
+// The main test for whether the conversion will under or overflow.
+template <class Dest, class Source>
+inline bool IsValidNumericCast(Source source) {
+  typedef std::numeric_limits<Source> SourceLimits;
+  typedef std::numeric_limits<Dest> DestLimits;
+  GOOGLE_COMPILE_ASSERT(SourceLimits::is_specialized, argument_must_be_numeric);
+  GOOGLE_COMPILE_ASSERT(SourceLimits::is_integer, argument_must_be_integral);
+  GOOGLE_COMPILE_ASSERT(DestLimits::is_specialized, result_must_be_numeric);
+  GOOGLE_COMPILE_ASSERT(DestLimits::is_integer, result_must_be_integral);
+
+  return IsValidNumericCastImpl<
+      sizeof(Dest) == sizeof(Source),
+      (sizeof(Dest) > sizeof(Source)),
+      DestLimits::is_signed,
+      SourceLimits::is_signed>::Test(
+          source,
+          DestLimits::min(),
+          DestLimits::max());
+}
+
+// checked_numeric_cast<> is analogous to static_cast<> for numeric types,
+// except that it CHECKs that the specified numeric conversion will not
+// overflow or underflow. Floating point arguments are not currently allowed
+// (this is COMPILE_ASSERTd), though this could be supported if necessary.
+template <class Dest, class Source>
+inline Dest checked_numeric_cast(Source source) {
+  GOOGLE_CHECK(IsValidNumericCast<Dest>(source));
+  return static_cast<Dest>(source);
+}
+
+}  // namespace python
+}  // namespace protobuf
+
+}  // namespace google
+#endif  // GOOGLE_PROTOBUF_PYTHON_CPP_SAFE_NUMERICS_H__
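The twelve macro-generated cases above all reduce to one range check: the source value must lie within the destination type's [min, max]. A minimal Python restatement (the type table and function names are assumptions for the sketch, not protobuf API):

```python
# Destination integer ranges, keyed by an illustrative type name.
_LIMITS = {
    "int32": (-2**31, 2**31 - 1),
    "uint32": (0, 2**32 - 1),
    "int64": (-2**63, 2**63 - 1),
    "uint64": (0, 2**64 - 1),
}

def is_valid_numeric_cast(dest, value):
    # Mirrors IsValidNumericCast<Dest>(source): true iff the conversion
    # neither overflows nor underflows the destination type.
    lo, hi = _LIMITS[dest]
    return lo <= value <= hi

def checked_numeric_cast(dest, value):
    # Mirrors checked_numeric_cast<>: a cast plus a CHECK on validity.
    if not is_valid_numeric_cast(dest, value):
        raise OverflowError("%d does not fit in %s" % (value, dest))
    return value
```

The C++ version spreads this over template specializations only to avoid signed/unsigned comparison pitfalls that Python's unbounded integers do not have.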
diff --git a/python/google/protobuf/pyext/scoped_pyobject_ptr.h b/python/google/protobuf/pyext/scoped_pyobject_ptr.h
index a128cd4..a2afa7f 100644
--- a/python/google/protobuf/pyext/scoped_pyobject_ptr.h
+++ b/python/google/protobuf/pyext/scoped_pyobject_ptr.h
@@ -36,61 +36,70 @@
 #include <google/protobuf/stubs/common.h>
 
 #include <Python.h>
-
 namespace google {
-class ScopedPyObjectPtr {
+namespace protobuf {
+namespace python {
+
+// Owns a Python object and decrements its reference count on destruction.
+// This class is not threadsafe.
+template <typename PyObjectStruct>
+class ScopedPythonPtr {
  public:
-  // Constructor.  Defaults to initializing with NULL.
-  // There is no way to create an uninitialized ScopedPyObjectPtr.
-  explicit ScopedPyObjectPtr(PyObject* p = NULL) : ptr_(p) { }
+  // Takes the ownership of the specified object to ScopedPythonPtr.
+  // The reference count of the specified py_object is not incremented.
+  explicit ScopedPythonPtr(PyObjectStruct* py_object = NULL)
+      : ptr_(py_object) {}
 
-  // Destructor.  If there is a PyObject object, delete it.
-  ~ScopedPyObjectPtr() {
-    Py_XDECREF(ptr_);
-  }
+  // If a PyObject is owned, decrement its reference count.
+  ~ScopedPythonPtr() { Py_XDECREF(ptr_); }
 
-  // Reset.  Deletes the current owned object, if any.
-  // Then takes ownership of a new object, if given.
+  // Deletes the current owned object, if any.
+  // Then takes ownership of a new object without incrementing the reference
+  // count.
   // This function must be called with a reference that you own.
   //   this->reset(this->get()) is wrong!
   //   this->reset(this->release()) is OK.
-  PyObject* reset(PyObject* p = NULL) {
+  PyObjectStruct* reset(PyObjectStruct* p = NULL) {
     Py_XDECREF(ptr_);
     ptr_ = p;
     return ptr_;
   }
 
-  // Releases ownership of the object.
+  // Releases ownership of the object without decrementing the reference count.
   // The caller now owns the returned reference.
-  PyObject* release() {
+  PyObjectStruct* release() {
     PyObject* p = ptr_;
     ptr_ = NULL;
     return p;
   }
 
-  PyObject* operator->() const  {
+  PyObjectStruct* operator->() const {
     assert(ptr_ != NULL);
     return ptr_;
   }
 
-  PyObject* get() const { return ptr_; }
+  PyObjectStruct* get() const { return ptr_; }
 
-  Py_ssize_t refcnt() const { return Py_REFCNT(ptr_); }
+  PyObject* as_pyobject() const { return reinterpret_cast<PyObject*>(ptr_); }
 
+  // Increments the reference count of the current object.
+  // Should not be called when no object is held.
   void inc() const { Py_INCREF(ptr_); }
 
-  // Comparison operators.
-  // These return whether a ScopedPyObjectPtr and a raw pointer
-  // refer to the same object, not just to two different but equal
-  // objects.
-  bool operator==(const PyObject* p) const { return ptr_ == p; }
-  bool operator!=(const PyObject* p) const { return ptr_ != p; }
+  // True when a ScopedPyObjectPtr and a raw pointer refer to the same object.
+  // Comparison operators are non-reflexive.
+  bool operator==(const PyObjectStruct* p) const { return ptr_ == p; }
+  bool operator!=(const PyObjectStruct* p) const { return ptr_ != p; }
 
  private:
-  PyObject* ptr_;
+  PyObjectStruct* ptr_;
 
-  GOOGLE_DISALLOW_EVIL_CONSTRUCTORS(ScopedPyObjectPtr);
+  GOOGLE_DISALLOW_EVIL_CONSTRUCTORS(ScopedPythonPtr);
 };
 
+typedef ScopedPythonPtr<PyObject> ScopedPyObjectPtr;
+
+}  // namespace python
+}  // namespace protobuf
 }  // namespace google
 #endif  // GOOGLE_PROTOBUF_PYTHON_CPP_SCOPED_PYOBJECT_PTR_H__
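The ownership rules in the comments above ("this->reset(this->get()) is wrong! this->reset(this->release()) is OK.") can be exercised with a toy refcounted object; `ScopedRef` and `FakeObj` are hypothetical stand-ins for ScopedPythonPtr and a CPython object, not protobuf API:

```python
class FakeObj:
    # Stand-in for a PyObject with a manual reference count.
    def __init__(self):
        self.refs = 1

    def decref(self):
        self.refs -= 1


class ScopedRef:
    def __init__(self, obj=None):
        self._obj = obj  # Takes ownership; does NOT increment the refcount.

    def reset(self, obj=None):
        # Drops the old reference (like Py_XDECREF), then owns the new one.
        if self._obj is not None:
            self._obj.decref()
        self._obj = obj
        return self._obj

    def release(self):
        # Hands the reference back to the caller without decrementing.
        obj, self._obj = self._obj, None
        return obj

    def get(self):
        return self._obj
```

`p.reset(p.release())` leaves the count unchanged, while `p.reset(p.get())` would decrement an object the scope still points at.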
diff --git a/python/google/protobuf/pyext/thread_unsafe_shared_ptr.h b/python/google/protobuf/pyext/thread_unsafe_shared_ptr.h
new file mode 100644
index 0000000..ad804b5
--- /dev/null
+++ b/python/google/protobuf/pyext/thread_unsafe_shared_ptr.h
@@ -0,0 +1,104 @@
+// Protocol Buffers - Google's data interchange format
+// Copyright 2008 Google Inc.  All rights reserved.
+// https://developers.google.com/protocol-buffers/
+//
+// Redistribution and use in source and binary forms, with or without
+// modification, are permitted provided that the following conditions are
+// met:
+//
+//     * Redistributions of source code must retain the above copyright
+// notice, this list of conditions and the following disclaimer.
+//     * Redistributions in binary form must reproduce the above
+// copyright notice, this list of conditions and the following disclaimer
+// in the documentation and/or other materials provided with the
+// distribution.
+//     * Neither the name of Google Inc. nor the names of its
+// contributors may be used to endorse or promote products derived from
+// this software without specific prior written permission.
+//
+// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+// ThreadUnsafeSharedPtr<T> is the same as shared_ptr<T> without the locking
+// overhead (and thread-safety).
+#ifndef GOOGLE_PROTOBUF_PYTHON_CPP_THREAD_UNSAFE_SHARED_PTR_H__
+#define GOOGLE_PROTOBUF_PYTHON_CPP_THREAD_UNSAFE_SHARED_PTR_H__
+
+#include <algorithm>
+#include <utility>
+
+#include <google/protobuf/stubs/logging.h>
+#include <google/protobuf/stubs/common.h>
+
+namespace google {
+namespace protobuf {
+namespace python {
+
+template <typename T>
+class ThreadUnsafeSharedPtr {
+ public:
+  // Takes ownership.
+  explicit ThreadUnsafeSharedPtr(T* ptr)
+      : ptr_(ptr), refcount_(ptr ? new RefcountT(1) : nullptr) {
+  }
+
+  ThreadUnsafeSharedPtr(const ThreadUnsafeSharedPtr& other)
+      : ThreadUnsafeSharedPtr(nullptr) {
+    *this = other;
+  }
+
+  ThreadUnsafeSharedPtr& operator=(const ThreadUnsafeSharedPtr& other) {
+    if (other.refcount_ == refcount_) {
+      return *this;
+    }
+    this->~ThreadUnsafeSharedPtr();
+    ptr_ = other.ptr_;
+    refcount_ = other.refcount_;
+    if (refcount_) {
+      ++*refcount_;
+    }
+    return *this;
+  }
+
+  ~ThreadUnsafeSharedPtr() {
+    if (refcount_ == nullptr) {
+      GOOGLE_DCHECK(ptr_ == nullptr);
+      return;
+    }
+    if (--*refcount_ == 0) {
+      delete refcount_;
+      delete ptr_;
+    }
+  }
+
+  void reset(T* ptr = nullptr) { *this = ThreadUnsafeSharedPtr(ptr); }
+
+  T* get() { return ptr_; }
+  const T* get() const { return ptr_; }
+
+  void swap(ThreadUnsafeSharedPtr& other) {
+    using std::swap;
+    swap(ptr_, other.ptr_);
+    swap(refcount_, other.refcount_);
+  }
+
+ private:
+  typedef int RefcountT;
+  T* ptr_;
+  RefcountT* refcount_;
+};
+
+}  // namespace python
+}  // namespace protobuf
+
+}  // namespace google
+#endif  // GOOGLE_PROTOBUF_PYTHON_CPP_THREAD_UNSAFE_SHARED_PTR_H__
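A behavioral sketch of the class above in Python, with an explicit `on_delete` hook standing in for the C++ destructor (names and the hook are illustrative assumptions, not protobuf API):

```python
class ThreadUnsafeSharedPtr:
    # Shared ownership with a plain, non-atomic reference count, mirroring
    # the C++ template above: copies bump a shared counter, and the payload
    # is destroyed when the last owner resets.
    def __init__(self, value=None, on_delete=None):
        self._value = value
        self._on_delete = on_delete
        self._refcount = [1] if value is not None else None

    def copy(self):
        # Equivalent to the C++ copy constructor: share, don't clone.
        other = ThreadUnsafeSharedPtr()
        other._value, other._on_delete = self._value, self._on_delete
        other._refcount = self._refcount
        if self._refcount is not None:
            self._refcount[0] += 1
        return other

    def reset(self):
        # Equivalent to the destructor/reset: drop one reference and
        # destroy the payload when the count reaches zero.
        if self._refcount is not None:
            self._refcount[0] -= 1
            if self._refcount[0] == 0 and self._on_delete is not None:
                self._on_delete(self._value)
        self._value = None
        self._refcount = None

    def get(self):
        return self._value
```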
diff --git a/python/google/protobuf/pyext/python_protobuf.h b/python/google/protobuf/python_protobuf.h
similarity index 100%
rename from python/google/protobuf/pyext/python_protobuf.h
rename to python/google/protobuf/python_protobuf.h
diff --git a/python/google/protobuf/reflection.py b/python/google/protobuf/reflection.py
index 0c75726..f4ce8ca 100755
--- a/python/google/protobuf/reflection.py
+++ b/python/google/protobuf/reflection.py
@@ -58,15 +58,11 @@
   from google.protobuf.internal import python_message as message_impl
 
 # The type of all Message classes.
-# Part of the public interface.
-#
-# Used by generated files, but clients can also use it at runtime:
-#   mydescriptor = pool.FindDescriptor(.....)
-#   class MyProtoClass(Message):
-#     __metaclass__ = GeneratedProtocolMessageType
-#     DESCRIPTOR = mydescriptor
+# Part of the public interface, but normally only used by message factories.
 GeneratedProtocolMessageType = message_impl.GeneratedProtocolMessageType
 
+MESSAGE_CLASS_CACHE = {}
+
 
 def ParseMessage(descriptor, byte_str):
   """Generate a new Message instance from this Descriptor and a byte string.
@@ -110,11 +106,16 @@
   Returns:
     The Message class object described by the descriptor.
   """
+  if descriptor in MESSAGE_CLASS_CACHE:
+    return MESSAGE_CLASS_CACHE[descriptor]
+
   attributes = {}
   for name, nested_type in descriptor.nested_types_by_name.items():
     attributes[name] = MakeClass(nested_type)
 
   attributes[GeneratedProtocolMessageType._DESCRIPTOR_KEY] = descriptor
 
-  return GeneratedProtocolMessageType(str(descriptor.name), (message.Message,),
-                                      attributes)
+  result = GeneratedProtocolMessageType(
+      str(descriptor.name), (message.Message,), attributes)
+  MESSAGE_CLASS_CACHE[descriptor] = result
+  return result
diff --git a/python/google/protobuf/symbol_database.py b/python/google/protobuf/symbol_database.py
index 87760f2..5ad869f 100644
--- a/python/google/protobuf/symbol_database.py
+++ b/python/google/protobuf/symbol_database.py
@@ -30,11 +30,9 @@
 
 """A database of Python protocol buffer generated symbols.
 
-SymbolDatabase makes it easy to create new instances of a registered type, given
-only the type's protocol buffer symbol name. Once all symbols are registered,
-they can be accessed using either the MessageFactory interface which
-SymbolDatabase exposes, or the DescriptorPool interface of the underlying
-pool.
+SymbolDatabase is the MessageFactory for messages generated at compile time,
+and makes it easy to create new instances of a registered type, given only the
+type's protocol buffer symbol name.
 
 Example usage:
 
@@ -61,27 +59,17 @@
 
 
 from google.protobuf import descriptor_pool
+from google.protobuf import message_factory
 
 
-class SymbolDatabase(object):
-  """A database of Python generated symbols.
-
-  SymbolDatabase also models message_factory.MessageFactory.
-
-  The symbol database can be used to keep a global registry of all protocol
-  buffer types used within a program.
-  """
-
-  def __init__(self, pool=None):
-    """Constructor."""
-
-    self._symbols = {}
-    self._symbols_by_file = {}
-    self.pool = pool or descriptor_pool.Default()
+class SymbolDatabase(message_factory.MessageFactory):
+  """A database of Python generated symbols."""
 
   def RegisterMessage(self, message):
     """Registers the given message type in the local database.
 
+    Calls to GetSymbol() and GetMessages() will return messages registered here.
+
     Args:
       message: a message.Message, to be registered.
 
@@ -90,13 +78,18 @@
     """
 
     desc = message.DESCRIPTOR
-    self._symbols[desc.full_name] = message
-    if desc.file.name not in self._symbols_by_file:
-      self._symbols_by_file[desc.file.name] = {}
-    self._symbols_by_file[desc.file.name][desc.full_name] = message
-    self.pool.AddDescriptor(desc)
+    self._classes[desc] = message
+    self.RegisterMessageDescriptor(desc)
     return message
 
+  def RegisterMessageDescriptor(self, message_descriptor):
+    """Registers the given message descriptor in the local database.
+
+    Args:
+      message_descriptor: a descriptor.MessageDescriptor.
+    """
+    self.pool.AddDescriptor(message_descriptor)
+
   def RegisterEnumDescriptor(self, enum_descriptor):
     """Registers the given enum descriptor in the local database.
 
@@ -109,6 +102,17 @@
     self.pool.AddEnumDescriptor(enum_descriptor)
     return enum_descriptor
 
+  def RegisterServiceDescriptor(self, service_descriptor):
+    """Registers the given service descriptor in the local database.
+
+    Args:
+      service_descriptor: a descriptor.ServiceDescriptor.
+
+    Returns:
+      The provided descriptor.
+    """
+    self.pool.AddServiceDescriptor(service_descriptor)
+    return service_descriptor
+
   def RegisterFileDescriptor(self, file_descriptor):
     """Registers the given file descriptor in the local database.
 
@@ -136,47 +140,47 @@
       KeyError: if the symbol could not be found.
     """
 
-    return self._symbols[symbol]
-
-  def GetPrototype(self, descriptor):
-    """Builds a proto2 message class based on the passed in descriptor.
-
-    Passing a descriptor with a fully qualified name matching a previous
-    invocation will cause the same class to be returned.
-
-    Args:
-      descriptor: The descriptor to build from.
-
-    Returns:
-      A class describing the passed in descriptor.
-    """
-
-    return self.GetSymbol(descriptor.full_name)
+    return self._classes[self.pool.FindMessageTypeByName(symbol)]
 
   def GetMessages(self, files):
-    """Gets all the messages from a specified file.
+    # TODO(amauryfa): Fix the differences with MessageFactory.
+    """Gets all registered messages from a specified file.
 
-    This will find and resolve dependencies, failing if they are not registered
-    in the symbol database.
-
+    Only messages already created and registered will be returned (this is the
+    case for imported _pb2 modules). Unlike MessageFactory, this version also
+    returns already-defined nested messages, but does not register any message
+    extensions.
 
     Args:
       files: The file names to extract messages from.
 
     Returns:
-      A dictionary mapping proto names to the message classes. This will include
-      any dependent messages as well as any messages defined in the same file as
-      a specified message.
+      A dictionary mapping proto names to the message classes.
 
     Raises:
       KeyError: if a file could not be found.
     """
 
+    def _GetAllMessages(desc):
+      """Walks a message Descriptor, recursively yielding all nested message
+      descriptors."""
+      yield desc
+      for msg_desc in desc.nested_types:
+        for nested_desc in _GetAllMessages(msg_desc):
+          yield nested_desc
+
     result = {}
-    for f in files:
-      result.update(self._symbols_by_file[f])
+    for file_name in files:
+      file_desc = self.pool.FindFileByName(file_name)
+      for msg_desc in file_desc.message_types_by_name.values():
+        for desc in _GetAllMessages(msg_desc):
+          try:
+            result[desc.full_name] = self._classes[desc]
+          except KeyError:
+            # This descriptor has no registered class, skip it.
+            pass
     return result
 
+
 _DEFAULT = SymbolDatabase(pool=descriptor_pool.Default())
 
 
diff --git a/python/google/protobuf/text_format.py b/python/google/protobuf/text_format.py
index 8d25607..2cbd21b 100755
--- a/python/google/protobuf/text_format.py
+++ b/python/google/protobuf/text_format.py
@@ -48,15 +48,15 @@
 import six
 
 if six.PY3:
-  long = int
+  long = int  # pylint: disable=redefined-builtin,invalid-name
 
+# pylint: disable=g-import-not-at-top
 from google.protobuf.internal import type_checkers
 from google.protobuf import descriptor
 from google.protobuf import text_encoding
 
-__all__ = ['MessageToString', 'PrintMessage', 'PrintField',
-           'PrintFieldValue', 'Merge']
-
+__all__ = ['MessageToString', 'PrintMessage', 'PrintField', 'PrintFieldValue',
+           'Merge']
 
 _INTEGER_CHECKERS = (type_checkers.Uint32ValueChecker(),
                      type_checkers.Int32ValueChecker(),
@@ -67,6 +67,7 @@
 _FLOAT_TYPES = frozenset([descriptor.FieldDescriptor.CPPTYPE_FLOAT,
                           descriptor.FieldDescriptor.CPPTYPE_DOUBLE])
 _QUOTES = frozenset(("'", '"'))
+_ANY_FULL_TYPE_NAME = 'google.protobuf.Any'
 
 
 class Error(Exception):
@@ -74,10 +75,30 @@
 
 
 class ParseError(Error):
-  """Thrown in case of text parsing error."""
+  """Thrown in case of text parsing or tokenizing error."""
+
+  def __init__(self, message=None, line=None, column=None):
+    if message is not None and line is not None:
+      loc = str(line)
+      if column is not None:
+        loc += ':{0}'.format(column)
+      message = '{0} : {1}'.format(loc, message)
+    if message is not None:
+      super(ParseError, self).__init__(message)
+    else:
+      super(ParseError, self).__init__()
+    self._line = line
+    self._column = column
+
+  def GetLine(self):
+    return self._line
+
+  def GetColumn(self):
+    return self._column
 
 
 class TextWriter(object):
+
   def __init__(self, as_utf8):
     if six.PY2:
       self._writer = io.BytesIO()
@@ -97,9 +118,16 @@
     return self._writer.getvalue()
 
 
-def MessageToString(message, as_utf8=False, as_one_line=False,
-                    pointy_brackets=False, use_index_order=False,
-                    float_format=None):
+def MessageToString(message,
+                    as_utf8=False,
+                    as_one_line=False,
+                    pointy_brackets=False,
+                    use_index_order=False,
+                    float_format=None,
+                    use_field_number=False,
+                    descriptor_pool=None,
+                    indent=0,
+                    message_formatter=None):
   """Convert protobuf message to text format.
 
   Floating point values can be formatted compactly with 15 digits of
@@ -113,20 +141,28 @@
     as_one_line: Don't introduce newlines between fields.
     pointy_brackets: If True, use angle brackets instead of curly braces for
       nesting.
-    use_index_order: If True, print fields of a proto message using the order
-      defined in source code instead of the field number. By default, use the
-      field number order.
+    use_index_order: If True, fields of a proto message will be printed using
+      the order defined in source code instead of the field number, extensions
+      will be printed at the end of the message and their relative order is
+      determined by the extension number. By default, use the field number
+      order.
     float_format: If set, use this to specify floating point number formatting
       (per the "Format Specification Mini-Language"); otherwise, str() is used.
+    use_field_number: If True, print field numbers instead of names.
+    descriptor_pool: A DescriptorPool used to resolve Any types.
+    indent: The indent level, in terms of spaces, for pretty print.
+    message_formatter: A function(message, indent, as_one_line): unicode|None
+      to custom format selected sub-messages (usually based on message type).
+      Use to pretty print parts of the protobuf for easier diffing.
 
   Returns:
     A string of the text formatted protocol buffer message.
   """
   out = TextWriter(as_utf8)
-  PrintMessage(message, out, as_utf8=as_utf8, as_one_line=as_one_line,
-               pointy_brackets=pointy_brackets,
-               use_index_order=use_index_order,
-               float_format=float_format)
+  printer = _Printer(out, indent, as_utf8, as_one_line, pointy_brackets,
+                     use_index_order, float_format, use_field_number,
+                     descriptor_pool, message_formatter)
+  printer.PrintMessage(message)
   result = out.getvalue()
   out.close()
   if as_one_line:
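The docstring's advice on `float_format` uses Python's Format Specification Mini-Language directly, so the difference between `'.15g'` and `'.17g'` can be checked in plain Python without protobuf at all (a sketch; `x` is just an arbitrary double that does not round-trip at 15 digits):

```python
# Round-trip behavior of the float_format values recommended above.
x = 1.0 / 3.0

compact = '{:.15g}'.format(x)  # compact form, may lose the last bits
exact = '{:.17g}'.format(x)    # enough digits to round-trip any double

assert float(exact) == x       # '.17g' always parses back to the same double
assert float(compact) != x     # for this value, '.15g' does not
```

Hence the guidance: use `'.15g'` for compact human-readable output, `'.17g'` when text output must reparse to the identical value.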
@@ -140,142 +176,310 @@
           field.message_type.GetOptions().map_entry)
 
 
-def PrintMessage(message, out, indent=0, as_utf8=False, as_one_line=False,
-                 pointy_brackets=False, use_index_order=False,
-                 float_format=None):
-  fields = message.ListFields()
-  if use_index_order:
-    fields.sort(key=lambda x: x[0].index)
-  for field, value in fields:
-    if _IsMapEntry(field):
-      for key in sorted(value):
-        # This is slow for maps with submessage entires because it copies the
-        # entire tree.  Unfortunately this would take significant refactoring
-        # of this file to work around.
-        #
-        # TODO(haberman): refactor and optimize if this becomes an issue.
-        entry_submsg = field.message_type._concrete_class(
-            key=key, value=value[key])
-        PrintField(field, entry_submsg, out, indent, as_utf8, as_one_line,
-                   pointy_brackets=pointy_brackets,
-                   use_index_order=use_index_order, float_format=float_format)
-    elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
-      for element in value:
-        PrintField(field, element, out, indent, as_utf8, as_one_line,
-                   pointy_brackets=pointy_brackets,
-                   use_index_order=use_index_order,
-                   float_format=float_format)
-    else:
-      PrintField(field, value, out, indent, as_utf8, as_one_line,
-                 pointy_brackets=pointy_brackets,
-                 use_index_order=use_index_order,
-                 float_format=float_format)
+def PrintMessage(message,
+                 out,
+                 indent=0,
+                 as_utf8=False,
+                 as_one_line=False,
+                 pointy_brackets=False,
+                 use_index_order=False,
+                 float_format=None,
+                 use_field_number=False,
+                 descriptor_pool=None,
+                 message_formatter=None):
+  printer = _Printer(out, indent, as_utf8, as_one_line, pointy_brackets,
+                     use_index_order, float_format, use_field_number,
+                     descriptor_pool, message_formatter)
+  printer.PrintMessage(message)
 
 
-def PrintField(field, value, out, indent=0, as_utf8=False, as_one_line=False,
-               pointy_brackets=False, use_index_order=False, float_format=None):
-  """Print a single field name/value pair.  For repeated fields, the value
-  should be a single element.
-  """
-
-  out.write(' ' * indent)
-  if field.is_extension:
-    out.write('[')
-    if (field.containing_type.GetOptions().message_set_wire_format and
-        field.type == descriptor.FieldDescriptor.TYPE_MESSAGE and
-        field.label == descriptor.FieldDescriptor.LABEL_OPTIONAL):
-      out.write(field.message_type.full_name)
-    else:
-      out.write(field.full_name)
-    out.write(']')
-  elif field.type == descriptor.FieldDescriptor.TYPE_GROUP:
-    # For groups, use the capitalized name.
-    out.write(field.message_type.name)
-  else:
-    out.write(field.name)
-
-  if field.cpp_type != descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
-    # The colon is optional in this case, but our cross-language golden files
-    # don't include it.
-    out.write(': ')
-
-  PrintFieldValue(field, value, out, indent, as_utf8, as_one_line,
-                  pointy_brackets=pointy_brackets,
-                  use_index_order=use_index_order,
-                  float_format=float_format)
-  if as_one_line:
-    out.write(' ')
-  else:
-    out.write('\n')
+def PrintField(field,
+               value,
+               out,
+               indent=0,
+               as_utf8=False,
+               as_one_line=False,
+               pointy_brackets=False,
+               use_index_order=False,
+               float_format=None,
+               message_formatter=None):
+  """Print a single field name/value pair."""
+  printer = _Printer(out, indent, as_utf8, as_one_line, pointy_brackets,
+                     use_index_order, float_format,
+                     message_formatter=message_formatter)
+  printer.PrintField(field, value)
 
 
-def PrintFieldValue(field, value, out, indent=0, as_utf8=False,
-                    as_one_line=False, pointy_brackets=False,
+def PrintFieldValue(field,
+                    value,
+                    out,
+                    indent=0,
+                    as_utf8=False,
+                    as_one_line=False,
+                    pointy_brackets=False,
                     use_index_order=False,
-                    float_format=None):
-  """Print a single field value (not including name).  For repeated fields,
-  the value should be a single element."""
+                    float_format=None,
+                    message_formatter=None):
+  """Print a single field value (not including name)."""
+  printer = _Printer(out, indent, as_utf8, as_one_line, pointy_brackets,
+                     use_index_order, float_format,
+                     message_formatter=message_formatter)
+  printer.PrintFieldValue(field, value)
 
-  if pointy_brackets:
-    openb = '<'
-    closeb = '>'
-  else:
-    openb = '{'
-    closeb = '}'
 
-  if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
-    if as_one_line:
-      out.write(' %s ' % openb)
-      PrintMessage(value, out, indent, as_utf8, as_one_line,
-                   pointy_brackets=pointy_brackets,
-                   use_index_order=use_index_order,
-                   float_format=float_format)
-      out.write(closeb)
+def _BuildMessageFromTypeName(type_name, descriptor_pool):
+  """Returns a protobuf message instance.
+
+  Args:
+    type_name: Fully-qualified protobuf message type name string.
+    descriptor_pool: DescriptorPool instance.
+
+  Returns:
+    A Message instance of type matching type_name, or None if no Descriptor
+    matching type_name was found.
+  """
+  # pylint: disable=g-import-not-at-top
+  if descriptor_pool is None:
+    from google.protobuf import descriptor_pool as pool_mod
+    descriptor_pool = pool_mod.Default()
+  from google.protobuf import symbol_database
+  database = symbol_database.Default()
+  try:
+    message_descriptor = descriptor_pool.FindMessageTypeByName(type_name)
+  except KeyError:
+    return None
+  message_type = database.GetPrototype(message_descriptor)
+  return message_type()
+
+
+class _Printer(object):
+  """Text format printer for protocol message."""
+
+  def __init__(self,
+               out,
+               indent=0,
+               as_utf8=False,
+               as_one_line=False,
+               pointy_brackets=False,
+               use_index_order=False,
+               float_format=None,
+               use_field_number=False,
+               descriptor_pool=None,
+               message_formatter=None):
+    """Initialize the Printer.
+
+    Floating point values can be formatted compactly with 15 digits of
+    precision (which is the most that IEEE 754 "double" can guarantee)
+    using float_format='.15g'. To ensure that converting to text and back to a
+    proto will result in an identical value, float_format='.17g' should be used.
+
+    Args:
+      out: To record the text format result.
+      indent: The indent level for pretty print.
+      as_utf8: Produce text output in UTF8 format.
+      as_one_line: Don't introduce newlines between fields.
+      pointy_brackets: If True, use angle brackets instead of curly braces for
+        nesting.
+      use_index_order: If True, print fields of a proto message using the order
+        defined in source code instead of the field number. By default, use the
+        field number order.
+      float_format: If set, use this to specify floating point number formatting
+        (per the "Format Specification Mini-Language"); otherwise, str() is
+        used.
+      use_field_number: If True, print field numbers instead of names.
+      descriptor_pool: A DescriptorPool used to resolve Any types.
+      message_formatter: A function(message, indent, as_one_line): unicode|None
+        to custom format selected sub-messages (usually based on message type).
+        Use to pretty print parts of the protobuf for easier diffing.
+    """
+    self.out = out
+    self.indent = indent
+    self.as_utf8 = as_utf8
+    self.as_one_line = as_one_line
+    self.pointy_brackets = pointy_brackets
+    self.use_index_order = use_index_order
+    self.float_format = float_format
+    self.use_field_number = use_field_number
+    self.descriptor_pool = descriptor_pool
+    self.message_formatter = message_formatter
+
+  def _TryPrintAsAnyMessage(self, message):
+    """Prints the packed contents if message is a google.protobuf.Any."""
+    packed_message = _BuildMessageFromTypeName(message.TypeName(),
+                                               self.descriptor_pool)
+    if packed_message:
+      packed_message.MergeFromString(message.value)
+      self.out.write('%s[%s]' % (self.indent * ' ', message.type_url))
+      self._PrintMessageFieldValue(packed_message)
+      self.out.write(' ' if self.as_one_line else '\n')
+      return True
     else:
-      out.write(' %s\n' % openb)
-      PrintMessage(value, out, indent + 2, as_utf8, as_one_line,
-                   pointy_brackets=pointy_brackets,
-                   use_index_order=use_index_order,
-                   float_format=float_format)
-      out.write(' ' * indent + closeb)
-  elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_ENUM:
-    enum_value = field.enum_type.values_by_number.get(value, None)
-    if enum_value is not None:
-      out.write(enum_value.name)
+      return False
+
+  def _TryCustomFormatMessage(self, message):
+    formatted = self.message_formatter(message, self.indent, self.as_one_line)
+    if formatted is None:
+      return False
+
+    out = self.out
+    out.write(' ' * self.indent)
+    out.write(formatted)
+    out.write(' ' if self.as_one_line else '\n')
+    return True
+
+  def PrintMessage(self, message):
+    """Convert protobuf message to text format.
+
+    Args:
+      message: The protocol buffers message.
+    """
+    if self.message_formatter and self._TryCustomFormatMessage(message):
+      return
+    if (message.DESCRIPTOR.full_name == _ANY_FULL_TYPE_NAME and
+        self._TryPrintAsAnyMessage(message)):
+      return
+    fields = message.ListFields()
+    if self.use_index_order:
+      fields.sort(
+          key=lambda x: x[0].number if x[0].is_extension else x[0].index)
+    for field, value in fields:
+      if _IsMapEntry(field):
+        for key in sorted(value):
+          # This is slow for maps with submessage entries because it copies the
+          # entire tree.  Unfortunately this would take significant refactoring
+          # of this file to work around.
+          #
+          # TODO(haberman): refactor and optimize if this becomes an issue.
+          entry_submsg = value.GetEntryClass()(key=key, value=value[key])
+          self.PrintField(field, entry_submsg)
+      elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
+        for element in value:
+          self.PrintField(field, element)
+      else:
+        self.PrintField(field, value)
+
+  def PrintField(self, field, value):
+    """Print a single field name/value pair."""
+    out = self.out
+    out.write(' ' * self.indent)
+    if self.use_field_number:
+      out.write(str(field.number))
+    else:
+      if field.is_extension:
+        out.write('[')
+        if (field.containing_type.GetOptions().message_set_wire_format and
+            field.type == descriptor.FieldDescriptor.TYPE_MESSAGE and
+            field.label == descriptor.FieldDescriptor.LABEL_OPTIONAL):
+          out.write(field.message_type.full_name)
+        else:
+          out.write(field.full_name)
+        out.write(']')
+      elif field.type == descriptor.FieldDescriptor.TYPE_GROUP:
+        # For groups, use the capitalized name.
+        out.write(field.message_type.name)
+      else:
+        out.write(field.name)
+
+    if field.cpp_type != descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+      # The colon is optional in this case, but our cross-language golden files
+      # don't include it.
+      out.write(': ')
+
+    self.PrintFieldValue(field, value)
+    if self.as_one_line:
+      out.write(' ')
+    else:
+      out.write('\n')
+
+  def _PrintMessageFieldValue(self, value):
+    if self.pointy_brackets:
+      openb = '<'
+      closeb = '>'
+    else:
+      openb = '{'
+      closeb = '}'
+
+    if self.as_one_line:
+      self.out.write(' %s ' % openb)
+      self.PrintMessage(value)
+      self.out.write(closeb)
+    else:
+      self.out.write(' %s\n' % openb)
+      self.indent += 2
+      self.PrintMessage(value)
+      self.indent -= 2
+      self.out.write(' ' * self.indent + closeb)
+
+  def PrintFieldValue(self, field, value):
+    """Print a single field value (not including name).
+
+    For repeated fields, the value should be a single element.
+
+    Args:
+      field: The descriptor of the field to be printed.
+      value: The value of the field.
+    """
+    out = self.out
+    if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+      self._PrintMessageFieldValue(value)
+    elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_ENUM:
+      enum_value = field.enum_type.values_by_number.get(value, None)
+      if enum_value is not None:
+        out.write(enum_value.name)
+      else:
+        out.write(str(value))
+    elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_STRING:
+      out.write('\"')
+      if isinstance(value, six.text_type):
+        out_value = value.encode('utf-8')
+      else:
+        out_value = value
+      if field.type == descriptor.FieldDescriptor.TYPE_BYTES:
+        # We need to escape non-UTF8 chars in TYPE_BYTES field.
+        out_as_utf8 = False
+      else:
+        out_as_utf8 = self.as_utf8
+      out.write(text_encoding.CEscape(out_value, out_as_utf8))
+      out.write('\"')
+    elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_BOOL:
+      if value:
+        out.write('true')
+      else:
+        out.write('false')
+    elif field.cpp_type in _FLOAT_TYPES and self.float_format is not None:
+      out.write('{1:{0}}'.format(self.float_format, value))
     else:
       out.write(str(value))
-  elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_STRING:
-    out.write('\"')
-    if isinstance(value, six.text_type):
-      out_value = value.encode('utf-8')
-    else:
-      out_value = value
-    if field.type == descriptor.FieldDescriptor.TYPE_BYTES:
-      # We need to escape non-UTF8 chars in TYPE_BYTES field.
-      out_as_utf8 = False
-    else:
-      out_as_utf8 = as_utf8
-    out.write(text_encoding.CEscape(out_value, out_as_utf8))
-    out.write('\"')
-  elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_BOOL:
-    if value:
-      out.write('true')
-    else:
-      out.write('false')
-  elif field.cpp_type in _FLOAT_TYPES and float_format is not None:
-    out.write('{1:{0}}'.format(float_format, value))
-  else:
-    out.write(str(value))
 
 
-def Parse(text, message, allow_unknown_extension=False):
-  """Parses an text representation of a protocol message into a message.
+def Parse(text,
+          message,
+          allow_unknown_extension=False,
+          allow_field_number=False,
+          descriptor_pool=None):
+  """Parses a text representation of a protocol message into a message.
+
+  NOTE: for historical reasons this function does not clear the input
+  message. This is different from what the binary msg.ParseFrom(...) does.
+
+  Example
+    a = MyProto()
+    a.repeated_field.append('test')
+    b = MyProto()
+
+    text_format.Parse(repr(a), b)
+    text_format.Parse(repr(a), b) # repeated_field contains ["test", "test"]
+
+    # Binary version:
+    b.ParseFromString(a.SerializeToString()) # repeated_field is now "test"
+
+  Caller is responsible for clearing the message as needed.
 
   Args:
     text: Message text representation.
     message: A protocol buffer message to merge into.
     allow_unknown_extension: if True, skip over missing extensions and keep
       parsing
+    allow_field_number: if True, both field number and field name are allowed.
+    descriptor_pool: A DescriptorPool used to resolve Any types.
 
   Returns:
     The same message passed as argument.
@@ -284,12 +488,23 @@
     ParseError: On text parsing problems.
   """
   if not isinstance(text, str):
-    text = text.decode('utf-8')
-  return ParseLines(text.split('\n'), message, allow_unknown_extension)
+    if six.PY3:
+      text = text.decode('utf-8')
+    else:
+      text = text.encode('utf-8')
+  return ParseLines(text.split('\n'),
+                    message,
+                    allow_unknown_extension,
+                    allow_field_number,
+                    descriptor_pool=descriptor_pool)
 
 
-def Merge(text, message, allow_unknown_extension=False):
-  """Parses an text representation of a protocol message into a message.
+def Merge(text,
+          message,
+          allow_unknown_extension=False,
+          allow_field_number=False,
+          descriptor_pool=None):
+  """Parses a text representation of a protocol message into a message.
 
   Like Parse(), but allows repeated values for a non-repeated field, and uses
   the last one.
@@ -299,6 +514,8 @@
     message: A protocol buffer message to merge into.
     allow_unknown_extension: if True, skip over missing extensions and keep
       parsing
+    allow_field_number: if True, both field number and field name are allowed.
+    descriptor_pool: A DescriptorPool used to resolve Any types.
 
   Returns:
     The same message passed as argument.
@@ -306,17 +523,33 @@
   Raises:
     ParseError: On text parsing problems.
   """
-  return MergeLines(text.split('\n'), message, allow_unknown_extension)
+  if not isinstance(text, str):
+    if six.PY3:
+      text = text.decode('utf-8')
+    else:
+      text = text.encode('utf-8')
+  return MergeLines(
+      text.split('\n'),
+      message,
+      allow_unknown_extension,
+      allow_field_number,
+      descriptor_pool=descriptor_pool)
 
 
-def ParseLines(lines, message, allow_unknown_extension=False):
-  """Parses an text representation of a protocol message into a message.
+def ParseLines(lines,
+               message,
+               allow_unknown_extension=False,
+               allow_field_number=False,
+               descriptor_pool=None):
+  """Parses a text representation of a protocol message into a message.
 
   Args:
     lines: An iterable of lines of a message's text representation.
     message: A protocol buffer message to merge into.
     allow_unknown_extension: if True, skip over missing extensions and keep
       parsing
+    allow_field_number: if True, both field number and field name are allowed.
+    descriptor_pool: A DescriptorPool used to resolve Any types.
 
   Returns:
     The same message passed as argument.
@@ -324,18 +557,26 @@
   Raises:
     ParseError: On text parsing problems.
   """
-  _ParseOrMerge(lines, message, False, allow_unknown_extension)
-  return message
+  parser = _Parser(allow_unknown_extension,
+                   allow_field_number,
+                   descriptor_pool=descriptor_pool)
+  return parser.ParseLines(lines, message)
 
 
-def MergeLines(lines, message, allow_unknown_extension=False):
-  """Parses an text representation of a protocol message into a message.
+def MergeLines(lines,
+               message,
+               allow_unknown_extension=False,
+               allow_field_number=False,
+               descriptor_pool=None):
+  """Parses a text representation of a protocol message into a message.
 
   Args:
     lines: An iterable of lines of a message's text representation.
     message: A protocol buffer message to merge into.
     allow_unknown_extension: if True, skip over missing extensions and keep
       parsing
+    allow_field_number: if True, both field number and field name are allowed.
+    descriptor_pool: A DescriptorPool used to resolve Any types.
 
   Returns:
     The same message passed as argument.
@@ -343,108 +584,220 @@
   Raises:
     ParseError: On text parsing problems.
   """
-  _ParseOrMerge(lines, message, True, allow_unknown_extension)
-  return message
+  parser = _Parser(allow_unknown_extension,
+                   allow_field_number,
+                   descriptor_pool=descriptor_pool)
+  return parser.MergeLines(lines, message)
 
 
-def _ParseOrMerge(lines,
-                  message,
-                  allow_multiple_scalars,
-                  allow_unknown_extension=False):
-  """Converts an text representation of a protocol message into a message.
+class _Parser(object):
+  """Text format parser for protocol message."""
 
-  Args:
-    lines: Lines of a message's text representation.
-    message: A protocol buffer message to merge into.
-    allow_multiple_scalars: Determines if repeated values for a non-repeated
-      field are permitted, e.g., the string "foo: 1 foo: 2" for a
-      required/optional field named "foo".
-    allow_unknown_extension: if True, skip over missing extensions and keep
-      parsing
+  def __init__(self,
+               allow_unknown_extension=False,
+               allow_field_number=False,
+               descriptor_pool=None):
+    self.allow_unknown_extension = allow_unknown_extension
+    self.allow_field_number = allow_field_number
+    self.descriptor_pool = descriptor_pool
 
-  Raises:
-    ParseError: On text parsing problems.
-  """
-  tokenizer = _Tokenizer(lines)
-  while not tokenizer.AtEnd():
-    _MergeField(tokenizer, message, allow_multiple_scalars,
-                allow_unknown_extension)
+  def ParseFromString(self, text, message):
+    """Parses a text representation of a protocol message into a message."""
+    if not isinstance(text, str):
+      text = text.decode('utf-8')
+    return self.ParseLines(text.split('\n'), message)
 
+  def ParseLines(self, lines, message):
+    """Parses a text representation of a protocol message into a message."""
+    self._allow_multiple_scalars = False
+    self._ParseOrMerge(lines, message)
+    return message
 
-def _MergeField(tokenizer,
-                message,
-                allow_multiple_scalars,
-                allow_unknown_extension=False):
-  """Merges a single protocol message field into a message.
+  def MergeFromString(self, text, message):
+    """Merges a text representation of a protocol message into a message."""
+    return self._MergeLines(text.split('\n'), message)
 
-  Args:
-    tokenizer: A tokenizer to parse the field name and values.
-    message: A protocol message to record the data.
-    allow_multiple_scalars: Determines if repeated values for a non-repeated
-      field are permitted, e.g., the string "foo: 1 foo: 2" for a
-      required/optional field named "foo".
-    allow_unknown_extension: if True, skip over missing extensions and keep
-      parsing
+  def MergeLines(self, lines, message):
+    """Merges a text representation of a protocol message into a message."""
+    self._allow_multiple_scalars = True
+    self._ParseOrMerge(lines, message)
+    return message
 
-  Raises:
-    ParseError: In case of text parsing problems.
-  """
-  message_descriptor = message.DESCRIPTOR
-  if (hasattr(message_descriptor, 'syntax') and
-      message_descriptor.syntax == 'proto3'):
-    # Proto3 doesn't represent presence so we can't test if multiple
-    # scalars have occurred.  We have to allow them.
-    allow_multiple_scalars = True
-  if tokenizer.TryConsume('['):
+  def _ParseOrMerge(self, lines, message):
+    """Converts a text representation of a protocol message into a message.
+
+    Args:
+      lines: Lines of a message's text representation.
+      message: A protocol buffer message to merge into.
+
+    Raises:
+      ParseError: On text parsing problems.
+    """
+    tokenizer = Tokenizer(lines)
+    while not tokenizer.AtEnd():
+      self._MergeField(tokenizer, message)
+
+  def _MergeField(self, tokenizer, message):
+    """Merges a single protocol message field into a message.
+
+    Args:
+      tokenizer: A tokenizer to parse the field name and values.
+      message: A protocol message to record the data.
+
+    Raises:
+      ParseError: In case of text parsing problems.
+    """
+    message_descriptor = message.DESCRIPTOR
+    if (message_descriptor.full_name == _ANY_FULL_TYPE_NAME and
+        tokenizer.TryConsume('[')):
+      type_url_prefix, packed_type_name = self._ConsumeAnyTypeUrl(tokenizer)
+      tokenizer.Consume(']')
+      tokenizer.TryConsume(':')
+      if tokenizer.TryConsume('<'):
+        expanded_any_end_token = '>'
+      else:
+        tokenizer.Consume('{')
+        expanded_any_end_token = '}'
+      expanded_any_sub_message = _BuildMessageFromTypeName(packed_type_name,
+                                                           self.descriptor_pool)
+      if not expanded_any_sub_message:
+        raise ParseError('Type %s not found in descriptor pool' %
+                         packed_type_name)
+      while not tokenizer.TryConsume(expanded_any_end_token):
+        if tokenizer.AtEnd():
+          raise tokenizer.ParseErrorPreviousToken('Expected "%s".' %
+                                                  (expanded_any_end_token,))
+        self._MergeField(tokenizer, expanded_any_sub_message)
+      message.Pack(expanded_any_sub_message,
+                   type_url_prefix=type_url_prefix)
+      return
+
+    if tokenizer.TryConsume('['):
+      name = [tokenizer.ConsumeIdentifier()]
+      while tokenizer.TryConsume('.'):
+        name.append(tokenizer.ConsumeIdentifier())
+      name = '.'.join(name)
+
+      if not message_descriptor.is_extendable:
+        raise tokenizer.ParseErrorPreviousToken(
+            'Message type "%s" does not have extensions.' %
+            message_descriptor.full_name)
+      # pylint: disable=protected-access
+      field = message.Extensions._FindExtensionByName(name)
+      # pylint: enable=protected-access
+      if not field:
+        if self.allow_unknown_extension:
+          field = None
+        else:
+          raise tokenizer.ParseErrorPreviousToken(
+              'Extension "%s" not registered. '
+              'Did you import the _pb2 module which defines it? '
+              'If you are trying to place the extension in the MessageSet '
+              'field of another message that is in an Any or MessageSet field, '
+              'that message\'s _pb2 module must be imported as well' % name)
+      elif message_descriptor != field.containing_type:
+        raise tokenizer.ParseErrorPreviousToken(
+            'Extension "%s" does not extend message type "%s".' %
+            (name, message_descriptor.full_name))
+
+      tokenizer.Consume(']')
+
+    else:
+      name = tokenizer.ConsumeIdentifierOrNumber()
+      if self.allow_field_number and name.isdigit():
+        number = ParseInteger(name, True, True)
+        field = message_descriptor.fields_by_number.get(number, None)
+        if not field and message_descriptor.is_extendable:
+          field = message.Extensions._FindExtensionByNumber(number)
+      else:
+        field = message_descriptor.fields_by_name.get(name, None)
+
+        # Group names are expected to be capitalized as they appear in the
+        # .proto file, which actually matches their type names, not their field
+        # names.
+        if not field:
+          field = message_descriptor.fields_by_name.get(name.lower(), None)
+          if field and field.type != descriptor.FieldDescriptor.TYPE_GROUP:
+            field = None
+
+        if (field and field.type == descriptor.FieldDescriptor.TYPE_GROUP and
+            field.message_type.name != name):
+          field = None
+
+      if not field:
+        raise tokenizer.ParseErrorPreviousToken(
+            'Message type "%s" has no field named "%s".' %
+            (message_descriptor.full_name, name))
+
+    if field:
+      if not self._allow_multiple_scalars and field.containing_oneof:
+        # Check if there's a different field set in this oneof.
+        # Note that we ignore the case if the same field was set before, and we
+        # apply _allow_multiple_scalars to non-scalar fields as well.
+        which_oneof = message.WhichOneof(field.containing_oneof.name)
+        if which_oneof is not None and which_oneof != field.name:
+          raise tokenizer.ParseErrorPreviousToken(
+              'Field "%s" is specified along with field "%s", another member '
+              'of oneof "%s" for message type "%s".' %
+              (field.name, which_oneof, field.containing_oneof.name,
+               message_descriptor.full_name))
+
+      if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+        tokenizer.TryConsume(':')
+        merger = self._MergeMessageField
+      else:
+        tokenizer.Consume(':')
+        merger = self._MergeScalarField
+
+      if (field.label == descriptor.FieldDescriptor.LABEL_REPEATED and
+          tokenizer.TryConsume('[')):
+        # Short repeated format, e.g. "foo: [1, 2, 3]"
+        if not tokenizer.TryConsume(']'):
+          while True:
+            merger(tokenizer, message, field)
+            if tokenizer.TryConsume(']'):
+              break
+            tokenizer.Consume(',')
+
+      else:
+        merger(tokenizer, message, field)
+
+    else:  # Proto field is unknown.
+      assert self.allow_unknown_extension
+      _SkipFieldContents(tokenizer)
+
+    # For historical reasons, fields may optionally be separated by commas or
+    # semicolons.
+    if not tokenizer.TryConsume(','):
+      tokenizer.TryConsume(';')
+
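The repeated-field branch above accepts both the long form (`foo: 1 foo: 2`) and the short bracketed form (`foo: [1, 2, 3]`), handling the empty list as an immediate `]`. A minimal standalone sketch of that consume loop, operating on a hypothetical pre-tokenized list instead of the real `Tokenizer`:

```python
# Illustrative only: mirrors the short-repeated-format loop in _MergeField.
# `tokens` stands in for the tokenizer stream, e.g. ['[', '1', ',', '2', ']'].
def parse_short_repeated(tokens):
    assert tokens[0] == '['
    i, values = 1, []
    if tokens[i] == ']':      # "foo: []" -- empty short form
        return values
    while True:
        values.append(int(tokens[i]))   # merge one value
        i += 1
        if tokens[i] == ']':            # closing bracket ends the list
            return values
        assert tokens[i] == ','         # otherwise a comma must separate values
        i += 1
```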
+  def _ConsumeAnyTypeUrl(self, tokenizer):
+    """Consumes a google.protobuf.Any type URL and returns the type name."""
+    # Consume "type.googleapis.com/".
+    prefix = [tokenizer.ConsumeIdentifier()]
+    tokenizer.Consume('.')
+    prefix.append(tokenizer.ConsumeIdentifier())
+    tokenizer.Consume('.')
+    prefix.append(tokenizer.ConsumeIdentifier())
+    tokenizer.Consume('/')
+    # Consume the fully-qualified type name.
     name = [tokenizer.ConsumeIdentifier()]
     while tokenizer.TryConsume('.'):
       name.append(tokenizer.ConsumeIdentifier())
-    name = '.'.join(name)
+    return '.'.join(prefix), '.'.join(name)
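The prefix/name split that `_ConsumeAnyTypeUrl` performs token by token can be illustrated with a plain string operation when the whole URL is already in hand; `split_any_type_url` below is a hypothetical helper, not part of the module:

```python
# Hypothetical standalone helper: split an Any type URL such as
# "type.googleapis.com/google.protobuf.Duration" into (prefix, type name).
# The real parser consumes identifiers and '.'/'/' tokens one at a time.
def split_any_type_url(type_url):
    prefix, slash, type_name = type_url.rpartition('/')
    if not slash or not prefix or not type_name:
        raise ValueError('malformed type URL: %r' % (type_url,))
    return prefix, type_name
```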
 
-    if not message_descriptor.is_extendable:
-      raise tokenizer.ParseErrorPreviousToken(
-          'Message type "%s" does not have extensions.' %
-          message_descriptor.full_name)
-    # pylint: disable=protected-access
-    field = message.Extensions._FindExtensionByName(name)
-    # pylint: enable=protected-access
-    if not field:
-      if allow_unknown_extension:
-        field = None
-      else:
-        raise tokenizer.ParseErrorPreviousToken(
-            'Extension "%s" not registered.' % name)
-    elif message_descriptor != field.containing_type:
-      raise tokenizer.ParseErrorPreviousToken(
-          'Extension "%s" does not extend message type "%s".' % (
-              name, message_descriptor.full_name))
+  def _MergeMessageField(self, tokenizer, message, field):
+    """Merges a single scalar field into a message.
 
-    tokenizer.Consume(']')
+    Args:
+      tokenizer: A tokenizer to parse the field value.
+      message: The message of which the field is a member.
+      field: The descriptor of the field to be merged.
 
-  else:
-    name = tokenizer.ConsumeIdentifier()
-    field = message_descriptor.fields_by_name.get(name, None)
-
-    # Group names are expected to be capitalized as they appear in the
-    # .proto file, which actually matches their type names, not their field
-    # names.
-    if not field:
-      field = message_descriptor.fields_by_name.get(name.lower(), None)
-      if field and field.type != descriptor.FieldDescriptor.TYPE_GROUP:
-        field = None
-
-    if (field and field.type == descriptor.FieldDescriptor.TYPE_GROUP and
-        field.message_type.name != name):
-      field = None
-
-    if not field:
-      raise tokenizer.ParseErrorPreviousToken(
-          'Message type "%s" has no field named "%s".' % (
-              message_descriptor.full_name, name))
-
-  if field and field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+    Raises:
+      ParseError: In case of text parsing problems.
+    """
     is_map_entry = _IsMapEntry(field)
-    tokenizer.TryConsume(':')
 
     if tokenizer.TryConsume('<'):
       end_token = '>'
@@ -456,21 +809,32 @@
       if field.is_extension:
         sub_message = message.Extensions[field].add()
       elif is_map_entry:
-        sub_message = field.message_type._concrete_class()
+        sub_message = getattr(message, field.name).GetEntryClass()()
       else:
         sub_message = getattr(message, field.name).add()
     else:
       if field.is_extension:
+        if (not self._allow_multiple_scalars and
+            message.HasExtension(field)):
+          raise tokenizer.ParseErrorPreviousToken(
+              'Message type "%s" should not have multiple "%s" extensions.' %
+              (message.DESCRIPTOR.full_name, field.full_name))
         sub_message = message.Extensions[field]
       else:
+        # Also apply _allow_multiple_scalars to message field.
+        # TODO(jieluo): Change to _allow_singular_overwrites.
+        if (not self._allow_multiple_scalars and
+            message.HasField(field.name)):
+          raise tokenizer.ParseErrorPreviousToken(
+              'Message type "%s" should not have multiple "%s" fields.' %
+              (message.DESCRIPTOR.full_name, field.name))
         sub_message = getattr(message, field.name)
       sub_message.SetInParent()
 
     while not tokenizer.TryConsume(end_token):
       if tokenizer.AtEnd():
-        raise tokenizer.ParseErrorPreviousToken('Expected "%s".' % (end_token))
-      _MergeField(tokenizer, sub_message, allow_multiple_scalars,
-                  allow_unknown_extension)
+        raise tokenizer.ParseErrorPreviousToken('Expected "%s".' % (end_token,))
+      self._MergeField(tokenizer, sub_message)
 
     if is_map_entry:
       value_cpptype = field.message_type.fields_by_name['value'].cpp_type
@@ -479,26 +843,81 @@
         value.MergeFrom(sub_message.value)
       else:
         getattr(message, field.name)[sub_message.key] = sub_message.value
-  elif field:
-    tokenizer.Consume(':')
-    if (field.label == descriptor.FieldDescriptor.LABEL_REPEATED and
-        tokenizer.TryConsume('[')):
-      # Short repeated format, e.g. "foo: [1, 2, 3]"
-      while True:
-        _MergeScalarField(tokenizer, message, field, allow_multiple_scalars)
-        if tokenizer.TryConsume(']'):
-          break
-        tokenizer.Consume(',')
-    else:
-      _MergeScalarField(tokenizer, message, field, allow_multiple_scalars)
-  else:  # Proto field is unknown.
-    assert allow_unknown_extension
-    _SkipFieldContents(tokenizer)
 
-  # For historical reasons, fields may optionally be separated by commas or
-  # semicolons.
-  if not tokenizer.TryConsume(','):
-    tokenizer.TryConsume(';')
+  @staticmethod
+  def _IsProto3Syntax(message):
+    message_descriptor = message.DESCRIPTOR
+    return (hasattr(message_descriptor, 'syntax') and
+            message_descriptor.syntax == 'proto3')
+
+  def _MergeScalarField(self, tokenizer, message, field):
+    """Merges a single scalar field into a message.
+
+    Args:
+      tokenizer: A tokenizer to parse the field value.
+      message: A protocol message to record the data.
+      field: The descriptor of the field to be merged.
+
+    Raises:
+      ParseError: In case of text parsing problems.
+      RuntimeError: On runtime errors.
+    """
+    _ = self.allow_unknown_extension
+    value = None
+
+    if field.type in (descriptor.FieldDescriptor.TYPE_INT32,
+                      descriptor.FieldDescriptor.TYPE_SINT32,
+                      descriptor.FieldDescriptor.TYPE_SFIXED32):
+      value = _ConsumeInt32(tokenizer)
+    elif field.type in (descriptor.FieldDescriptor.TYPE_INT64,
+                        descriptor.FieldDescriptor.TYPE_SINT64,
+                        descriptor.FieldDescriptor.TYPE_SFIXED64):
+      value = _ConsumeInt64(tokenizer)
+    elif field.type in (descriptor.FieldDescriptor.TYPE_UINT32,
+                        descriptor.FieldDescriptor.TYPE_FIXED32):
+      value = _ConsumeUint32(tokenizer)
+    elif field.type in (descriptor.FieldDescriptor.TYPE_UINT64,
+                        descriptor.FieldDescriptor.TYPE_FIXED64):
+      value = _ConsumeUint64(tokenizer)
+    elif field.type in (descriptor.FieldDescriptor.TYPE_FLOAT,
+                        descriptor.FieldDescriptor.TYPE_DOUBLE):
+      value = tokenizer.ConsumeFloat()
+    elif field.type == descriptor.FieldDescriptor.TYPE_BOOL:
+      value = tokenizer.ConsumeBool()
+    elif field.type == descriptor.FieldDescriptor.TYPE_STRING:
+      value = tokenizer.ConsumeString()
+    elif field.type == descriptor.FieldDescriptor.TYPE_BYTES:
+      value = tokenizer.ConsumeByteString()
+    elif field.type == descriptor.FieldDescriptor.TYPE_ENUM:
+      value = tokenizer.ConsumeEnum(field)
+    else:
+      raise RuntimeError('Unknown field type %d' % field.type)
+
+    if field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
+      if field.is_extension:
+        message.Extensions[field].append(value)
+      else:
+        getattr(message, field.name).append(value)
+    else:
+      # Proto3 doesn't represent presence so we can't test if multiple scalars
+      # have occurred. We have to allow them.
+      can_check_presence = not self._IsProto3Syntax(message)
+      if field.is_extension:
+        if (not self._allow_multiple_scalars and can_check_presence and
+            message.HasExtension(field)):
+          raise tokenizer.ParseErrorPreviousToken(
+              'Message type "%s" should not have multiple "%s" extensions.' %
+              (message.DESCRIPTOR.full_name, field.full_name))
+        else:
+          message.Extensions[field] = value
+      else:
+        if (not self._allow_multiple_scalars and can_check_presence and
+            message.HasField(field.name)):
+          raise tokenizer.ParseErrorPreviousToken(
+              'Message type "%s" should not have multiple "%s" fields.' %
+              (message.DESCRIPTOR.full_name, field.name))
+        else:
+          setattr(message, field.name, value)
 
 
 def _SkipFieldContents(tokenizer):
@@ -533,7 +952,7 @@
       tokenizer.ConsumeIdentifier()
     tokenizer.Consume(']')
   else:
-    tokenizer.ConsumeIdentifier()
+    tokenizer.ConsumeIdentifierOrNumber()
 
   _SkipFieldContents(tokenizer)
 
@@ -571,88 +990,20 @@
   Raises:
     ParseError: In case an invalid field value is found.
   """
-  # String tokens can come in multiple adjacent string literals.
+  # String/bytes tokens can come in multiple adjacent string literals.
   # If we can consume one, consume as many as we can.
-  if tokenizer.TryConsumeString():
-    while tokenizer.TryConsumeString():
+  if tokenizer.TryConsumeByteString():
+    while tokenizer.TryConsumeByteString():
       pass
     return
 
   if (not tokenizer.TryConsumeIdentifier() and
-      not tokenizer.TryConsumeInt64() and
-      not tokenizer.TryConsumeUint64() and
+      not _TryConsumeInt64(tokenizer) and not _TryConsumeUint64(tokenizer) and
       not tokenizer.TryConsumeFloat()):
     raise ParseError('Invalid field value: ' + tokenizer.token)
 
 
-def _MergeScalarField(tokenizer, message, field, allow_multiple_scalars):
-  """Merges a single protocol message scalar field into a message.
-
-  Args:
-    tokenizer: A tokenizer to parse the field value.
-    message: A protocol message to record the data.
-    field: The descriptor of the field to be merged.
-    allow_multiple_scalars: Determines if repeated values for a non-repeated
-      field are permitted, e.g., the string "foo: 1 foo: 2" for a
-      required/optional field named "foo".
-
-  Raises:
-    ParseError: In case of text parsing problems.
-    RuntimeError: On runtime errors.
-  """
-  value = None
-
-  if field.type in (descriptor.FieldDescriptor.TYPE_INT32,
-                    descriptor.FieldDescriptor.TYPE_SINT32,
-                    descriptor.FieldDescriptor.TYPE_SFIXED32):
-    value = tokenizer.ConsumeInt32()
-  elif field.type in (descriptor.FieldDescriptor.TYPE_INT64,
-                      descriptor.FieldDescriptor.TYPE_SINT64,
-                      descriptor.FieldDescriptor.TYPE_SFIXED64):
-    value = tokenizer.ConsumeInt64()
-  elif field.type in (descriptor.FieldDescriptor.TYPE_UINT32,
-                      descriptor.FieldDescriptor.TYPE_FIXED32):
-    value = tokenizer.ConsumeUint32()
-  elif field.type in (descriptor.FieldDescriptor.TYPE_UINT64,
-                      descriptor.FieldDescriptor.TYPE_FIXED64):
-    value = tokenizer.ConsumeUint64()
-  elif field.type in (descriptor.FieldDescriptor.TYPE_FLOAT,
-                      descriptor.FieldDescriptor.TYPE_DOUBLE):
-    value = tokenizer.ConsumeFloat()
-  elif field.type == descriptor.FieldDescriptor.TYPE_BOOL:
-    value = tokenizer.ConsumeBool()
-  elif field.type == descriptor.FieldDescriptor.TYPE_STRING:
-    value = tokenizer.ConsumeString()
-  elif field.type == descriptor.FieldDescriptor.TYPE_BYTES:
-    value = tokenizer.ConsumeByteString()
-  elif field.type == descriptor.FieldDescriptor.TYPE_ENUM:
-    value = tokenizer.ConsumeEnum(field)
-  else:
-    raise RuntimeError('Unknown field type %d' % field.type)
-
-  if field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
-    if field.is_extension:
-      message.Extensions[field].append(value)
-    else:
-      getattr(message, field.name).append(value)
-  else:
-    if field.is_extension:
-      if not allow_multiple_scalars and message.HasExtension(field):
-        raise tokenizer.ParseErrorPreviousToken(
-            'Message type "%s" should not have multiple "%s" extensions.' %
-            (message.DESCRIPTOR.full_name, field.full_name))
-      else:
-        message.Extensions[field] = value
-    else:
-      if not allow_multiple_scalars and message.HasField(field.name):
-        raise tokenizer.ParseErrorPreviousToken(
-            'Message type "%s" should not have multiple "%s" fields.' %
-            (message.DESCRIPTOR.full_name, field.name))
-      else:
-        setattr(message, field.name, value)
-
-
-class _Tokenizer(object):
+class Tokenizer(object):
   """Protocol buffer text representation tokenizer.
 
   This class handles the lower level string parsing by splitting it into
@@ -661,17 +1012,20 @@
   It was directly ported from the Java protocol buffer API.
   """
 
-  _WHITESPACE = re.compile('(\\s|(#.*$))+', re.MULTILINE)
+  _WHITESPACE = re.compile(r'\s+')
+  _COMMENT = re.compile(r'(\s*#.*$)', re.MULTILINE)
+  _WHITESPACE_OR_COMMENT = re.compile(r'(\s|(#.*$))+', re.MULTILINE)
   _TOKEN = re.compile('|'.join([
-      r'[a-zA-Z_][0-9a-zA-Z_+-]*',             # an identifier
+      r'[a-zA-Z_][0-9a-zA-Z_+-]*',  # an identifier
       r'([0-9+-]|(\.[0-9]))[0-9a-zA-Z_.+-]*',  # a number
-  ] + [                                        # quoted str for each quote mark
+  ] + [  # quoted str for each quote mark
       r'{qt}([^{qt}\n\\]|\\.)*({qt}|\\?$)'.format(qt=mark) for mark in _QUOTES
   ]))
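The `_TOKEN` pattern can be exercised on its own. This sketch reproduces the same regular expression (assuming `_QUOTES` is the single- and double-quote set, as used elsewhere in the module) to show how `NextToken` lifts one token from a line at a given column:

```python
import re

_QUOTES = frozenset(("'", '"'))  # assumed to match the module-level set
_TOKEN = re.compile('|'.join([
    r'[a-zA-Z_][0-9a-zA-Z_+-]*',             # an identifier
    r'([0-9+-]|(\.[0-9]))[0-9a-zA-Z_.+-]*',  # a number
] + [                                        # quoted str for each quote mark
    r'{qt}([^{qt}\n\\]|\\.)*({qt}|\\?$)'.format(qt=mark) for mark in _QUOTES
]))

def first_token(line, column=0):
    # match() anchored at `column` mimics the tokenizer's cursor.
    match = _TOKEN.match(line, column)
    return match.group(0) if match else None
```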
 
-  _IDENTIFIER = re.compile(r'\w+')
+  _IDENTIFIER = re.compile(r'[^\d\W]\w*')
+  _IDENTIFIER_OR_NUMBER = re.compile(r'\w+')
 
-  def __init__(self, lines):
+  def __init__(self, lines, skip_comments=True):
     self._position = 0
     self._line = -1
     self._column = 0
@@ -682,6 +1036,9 @@
     self._previous_line = 0
     self._previous_column = 0
     self._more_lines = True
+    self._skip_comments = skip_comments
+    self._whitespace_pattern = (self._WHITESPACE_OR_COMMENT if skip_comments
+                                else self._WHITESPACE)
     self._SkipWhitespace()
     self.NextToken()
 
@@ -711,7 +1068,7 @@
   def _SkipWhitespace(self):
     while True:
       self._PopLine()
-      match = self._WHITESPACE.match(self._current_line, self._column)
+      match = self._whitespace_pattern.match(self._current_line, self._column)
       if not match:
         break
       length = len(match.group(0))
@@ -741,7 +1098,30 @@
       ParseError: If the text couldn't be consumed.
     """
     if not self.TryConsume(token):
-      raise self._ParseError('Expected "%s".' % token)
+      raise self.ParseError('Expected "%s".' % token)
+
+  def ConsumeComment(self):
+    result = self.token
+    if not self._COMMENT.match(result):
+      raise self.ParseError('Expected comment.')
+    self.NextToken()
+    return result
+
+  def ConsumeCommentOrTrailingComment(self):
+    """Consumes a comment, returns a 2-tuple (trailing bool, comment str)."""
+
+    # Tokenizer initializes _previous_line and _previous_column to 0. As the
+    # tokenizer starts, it looks like there is a previous token on the line.
+    just_started = self._line == 0 and self._column == 0
+
+    before_parsing = self._previous_line
+    comment = self.ConsumeComment()
+
+    # A trailing comment is a comment on the same line as the previous token.
+    trailing = (self._previous_line == before_parsing
+                and not just_started)
+
+    return trailing, comment
 
   def TryConsumeIdentifier(self):
     try:
@@ -761,85 +1141,55 @@
     """
     result = self.token
     if not self._IDENTIFIER.match(result):
-      raise self._ParseError('Expected identifier.')
+      raise self.ParseError('Expected identifier.')
     self.NextToken()
     return result
 
-  def ConsumeInt32(self):
-    """Consumes a signed 32bit integer number.
-
-    Returns:
-      The integer parsed.
-
-    Raises:
-      ParseError: If a signed 32bit integer couldn't be consumed.
-    """
+  def TryConsumeIdentifierOrNumber(self):
     try:
-      result = ParseInteger(self.token, is_signed=True, is_long=False)
-    except ValueError as e:
-      raise self._ParseError(str(e))
-    self.NextToken()
-    return result
-
-  def ConsumeUint32(self):
-    """Consumes an unsigned 32bit integer number.
-
-    Returns:
-      The integer parsed.
-
-    Raises:
-      ParseError: If an unsigned 32bit integer couldn't be consumed.
-    """
-    try:
-      result = ParseInteger(self.token, is_signed=False, is_long=False)
-    except ValueError as e:
-      raise self._ParseError(str(e))
-    self.NextToken()
-    return result
-
-  def TryConsumeInt64(self):
-    try:
-      self.ConsumeInt64()
+      self.ConsumeIdentifierOrNumber()
       return True
     except ParseError:
       return False
 
-  def ConsumeInt64(self):
-    """Consumes a signed 64bit integer number.
+  def ConsumeIdentifierOrNumber(self):
+    """Consumes protocol message field identifier.
 
     Returns:
-      The integer parsed.
+      Identifier string.
 
     Raises:
-      ParseError: If a signed 64bit integer couldn't be consumed.
+      ParseError: If an identifier couldn't be consumed.
     """
-    try:
-      result = ParseInteger(self.token, is_signed=True, is_long=True)
-    except ValueError as e:
-      raise self._ParseError(str(e))
+    result = self.token
+    if not self._IDENTIFIER_OR_NUMBER.match(result):
+      raise self.ParseError('Expected identifier or number, got %s.' % result)
     self.NextToken()
     return result
 
-  def TryConsumeUint64(self):
+  def TryConsumeInteger(self):
     try:
-      self.ConsumeUint64()
+      # Note: is_long only affects value type, not whether an error is raised.
+      self.ConsumeInteger()
       return True
     except ParseError:
       return False
 
-  def ConsumeUint64(self):
-    """Consumes an unsigned 64bit integer number.
+  def ConsumeInteger(self, is_long=False):
+    """Consumes an integer number.
 
+    Args:
+      is_long: True if the value should be returned as a long integer.
+
     Returns:
       The integer parsed.
 
     Raises:
-      ParseError: If an unsigned 64bit integer couldn't be consumed.
+      ParseError: If an integer couldn't be consumed.
     """
     try:
-      result = ParseInteger(self.token, is_signed=False, is_long=True)
+      result = _ParseAbstractInteger(self.token, is_long=is_long)
     except ValueError as e:
-      raise self._ParseError(str(e))
+      raise self.ParseError(str(e))
     self.NextToken()
     return result
 
@@ -862,7 +1212,7 @@
     try:
       result = ParseFloat(self.token)
     except ValueError as e:
-      raise self._ParseError(str(e))
+      raise self.ParseError(str(e))
     self.NextToken()
     return result
 
@@ -878,13 +1228,13 @@
     try:
       result = ParseBool(self.token)
     except ValueError as e:
-      raise self._ParseError(str(e))
+      raise self.ParseError(str(e))
     self.NextToken()
     return result
 
-  def TryConsumeString(self):
+  def TryConsumeByteString(self):
     try:
-      self.ConsumeString()
+      self.ConsumeByteString()
       return True
     except ParseError:
       return False
@@ -932,15 +1282,15 @@
     """
     text = self.token
     if len(text) < 1 or text[0] not in _QUOTES:
-      raise self._ParseError('Expected string but found: %r' % (text,))
+      raise self.ParseError('Expected string but found: %r' % (text,))
 
     if len(text) < 2 or text[-1] != text[0]:
-      raise self._ParseError('String missing ending quote: %r' % (text,))
+      raise self.ParseError('String missing ending quote: %r' % (text,))
 
     try:
       result = text_encoding.CUnescape(text[1:-1])
     except ValueError as e:
-      raise self._ParseError(str(e))
+      raise self.ParseError(str(e))
     self.NextToken()
     return result
 
@@ -948,7 +1298,7 @@
     try:
       result = ParseEnum(field, self.token)
     except ValueError as e:
-      raise self._ParseError(str(e))
+      raise self.ParseError(str(e))
     self.NextToken()
     return result
 
@@ -961,16 +1311,15 @@
     Returns:
       A ParseError instance.
     """
-    return ParseError('%d:%d : %s' % (
-        self._previous_line + 1, self._previous_column + 1, message))
+    return ParseError(message, self._previous_line + 1,
+                      self._previous_column + 1)
 
-  def _ParseError(self, message):
+  def ParseError(self, message):
     """Creates and *returns* a ParseError for the current token."""
-    return ParseError('%d:%d : %s' % (
-        self._line + 1, self._column + 1, message))
+    return ParseError(message, self._line + 1, self._column + 1)
 
   def _StringParseError(self, e):
-    return self._ParseError('Couldn\'t parse string: ' + str(e))
+    return self.ParseError('Couldn\'t parse string: ' + str(e))
 
   def NextToken(self):
     """Reads the next meaningful token."""
@@ -985,12 +1334,124 @@
       return
 
     match = self._TOKEN.match(self._current_line, self._column)
+    if not match and not self._skip_comments:
+      match = self._COMMENT.match(self._current_line, self._column)
     if match:
       token = match.group(0)
       self.token = token
     else:
       self.token = self._current_line[self._column]
 
+# Aliased so it can still be accessed by current visibility violators.
+# TODO(dbarnett): Migrate violators to textformat_tokenizer.
+_Tokenizer = Tokenizer  # pylint: disable=invalid-name
+
+
+def _ConsumeInt32(tokenizer):
+  """Consumes a signed 32bit integer number from tokenizer.
+
+  Args:
+    tokenizer: A tokenizer used to parse the number.
+
+  Returns:
+    The integer parsed.
+
+  Raises:
+    ParseError: If a signed 32bit integer couldn't be consumed.
+  """
+  return _ConsumeInteger(tokenizer, is_signed=True, is_long=False)
+
+
+def _ConsumeUint32(tokenizer):
+  """Consumes an unsigned 32bit integer number from tokenizer.
+
+  Args:
+    tokenizer: A tokenizer used to parse the number.
+
+  Returns:
+    The integer parsed.
+
+  Raises:
+    ParseError: If an unsigned 32bit integer couldn't be consumed.
+  """
+  return _ConsumeInteger(tokenizer, is_signed=False, is_long=False)
+
+
+def _TryConsumeInt64(tokenizer):
+  try:
+    _ConsumeInt64(tokenizer)
+    return True
+  except ParseError:
+    return False
+
+
+def _ConsumeInt64(tokenizer):
+  """Consumes a signed 32bit integer number from tokenizer.
+
+  Args:
+    tokenizer: A tokenizer used to parse the number.
+
+  Returns:
+    The integer parsed.
+
+  Raises:
+    ParseError: If a signed 64bit integer couldn't be consumed.
+  """
+  return _ConsumeInteger(tokenizer, is_signed=True, is_long=True)
+
+
+def _TryConsumeUint64(tokenizer):
+  try:
+    _ConsumeUint64(tokenizer)
+    return True
+  except ParseError:
+    return False
+
+
+def _ConsumeUint64(tokenizer):
+  """Consumes an unsigned 64bit integer number from tokenizer.
+
+  Args:
+    tokenizer: A tokenizer used to parse the number.
+
+  Returns:
+    The integer parsed.
+
+  Raises:
+    ParseError: If an unsigned 64bit integer couldn't be consumed.
+  """
+  return _ConsumeInteger(tokenizer, is_signed=False, is_long=True)
+
+
+def _TryConsumeInteger(tokenizer, is_signed=False, is_long=False):
+  try:
+    _ConsumeInteger(tokenizer, is_signed=is_signed, is_long=is_long)
+    return True
+  except ParseError:
+    return False
+
+
+def _ConsumeInteger(tokenizer, is_signed=False, is_long=False):
+  """Consumes an integer number from tokenizer.
+
+  Args:
+    tokenizer: A tokenizer used to parse the number.
+    is_signed: True if a signed integer must be parsed.
+    is_long: True if a long integer must be parsed.
+
+  Returns:
+    The integer parsed.
+
+  Raises:
+    ParseError: If an integer with given characteristics couldn't be consumed.
+  """
+  try:
+    result = ParseInteger(tokenizer.token, is_signed=is_signed, is_long=is_long)
+  except ValueError as e:
+    raise tokenizer.ParseError(str(e))
+  tokenizer.NextToken()
+  return result
+
 
 def ParseInteger(text, is_signed=False, is_long=False):
   """Parses an integer.
@@ -1007,16 +1468,7 @@
     ValueError: Thrown Iff the text is not a valid integer.
   """
   # Do the actual parsing. Exception handling is propagated to caller.
-  try:
-    # We force 32-bit values to int and 64-bit values to long to make
-    # alternate implementations where the distinction is more significant
-    # (e.g. the C++ implementation) simpler.
-    if is_long:
-      result = long(text, 0)
-    else:
-      result = int(text, 0)
-  except ValueError:
-    raise ValueError('Couldn\'t parse integer: %s' % text)
+  result = _ParseAbstractInteger(text, is_long=is_long)
 
   # Check if the integer is sane. Exceptions handled by callers.
   checker = _INTEGER_CHECKERS[2 * int(is_long) + int(is_signed)]
@@ -1024,6 +1476,32 @@
   return result
 
 
+def _ParseAbstractInteger(text, is_long=False):
+  """Parses an integer without checking size/signedness.
+
+  Args:
+    text: The text to parse.
+    is_long: True if the value should be returned as a long integer.
+
+  Returns:
+    The integer value.
+
+  Raises:
+    ValueError: If the text is not a valid integer.
+  """
+  # Do the actual parsing. Exception handling is propagated to caller.
+  try:
+    # We force 32-bit values to int and 64-bit values to long to make
+    # alternate implementations where the distinction is more significant
+    # (e.g. the C++ implementation) simpler.
+    if is_long:
+      return long(text, 0)
+    else:
+      return int(text, 0)
+  except ValueError:
+    raise ValueError('Couldn\'t parse integer: %s' % text)
+
+
 def ParseFloat(text):
   """Parse a floating point number.
 
@@ -1068,9 +1546,9 @@
   Raises:
     ValueError: If text is not a valid boolean.
   """
-  if text in ('true', 't', '1'):
+  if text in ('true', 't', '1', 'True'):
     return True
-  elif text in ('false', 'f', '0'):
+  elif text in ('false', 'f', '0', 'False'):
     return False
   else:
     raise ValueError('Expected "true" or "false".')
@@ -1099,14 +1577,17 @@
     # Identifier.
     enum_value = enum_descriptor.values_by_name.get(value, None)
     if enum_value is None:
-      raise ValueError(
-          'Enum type "%s" has no value named %s.' % (
-              enum_descriptor.full_name, value))
+      raise ValueError('Enum type "%s" has no value named %s.' %
+                       (enum_descriptor.full_name, value))
   else:
     # Numeric value.
+    if hasattr(field.file, 'syntax'):
+      # The syntax attribute may be absent on older descriptor
+      # implementations; check it for compatibility.
+      if field.file.syntax == 'proto3':
+        # Proto3 accepts unknown enums as numeric values.
+        return number
     enum_value = enum_descriptor.values_by_number.get(number, None)
     if enum_value is None:
-      raise ValueError(
-          'Enum type "%s" has no value with number %d.' % (
-              enum_descriptor.full_name, number))
+      raise ValueError('Enum type "%s" has no value with number %d.' %
+                       (enum_descriptor.full_name, number))
   return enum_value.number
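The ParseBool hunk above widens the accepted token set to include Python's str(bool) forms. A minimal standalone mirror of the resulting behavior (the function name here is illustrative, not the module's API):

```python
# Mirrors the updated ParseBool logic: 'True'/'False' are now accepted
# alongside the proto text-format forms 'true'/'t'/'1' and 'false'/'f'/'0'.
def parse_bool(text):
    if text in ('true', 't', '1', 'True'):
        return True
    elif text in ('false', 'f', '0', 'False'):
        return False
    else:
        raise ValueError('Expected "true" or "false".')

print(parse_bool('True'))   # → True
print(parse_bool('0'))      # → False
```

This makes round-tripping `str(True)`/`str(False)` through the text format work without lowercasing first.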
diff --git a/python/google/protobuf/util/__init__.py b/python/google/protobuf/util/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/python/google/protobuf/util/__init__.py
diff --git a/python/mox.py b/python/mox.py
index 257468e..43db021 100755
--- a/python/mox.py
+++ b/python/mox.py
@@ -778,7 +778,7 @@
       rhs: any python object
     """
 
-    raise NotImplementedError, 'method must be implemented by a subclass.'
+    raise NotImplementedError('method must be implemented by a subclass.')
 
   def __eq__(self, rhs):
     return self.equals(rhs)
diff --git a/python/release.sh b/python/release.sh
new file mode 100755
index 0000000..a71cc7f
--- /dev/null
+++ b/python/release.sh
@@ -0,0 +1,116 @@
+#!/bin/bash
+
+set -ex
+
+function get_source_version() {
+  grep "__version__ = '.*'" python/google/protobuf/__init__.py | sed -r "s/__version__ = '(.*)'/\1/"
+}
+
+function run_install_test() {
+  local VERSION=$1
+  local PYTHON=$2
+  local PYPI=$3
+
+  virtualenv --no-site-packages -p `which $PYTHON` test-venv
+
+  # Intentionally put a broken protoc in the path to make sure installation
+  # doesn't require protoc installed.
+  touch test-venv/bin/protoc
+  chmod +x test-venv/bin/protoc
+
+  source test-venv/bin/activate
+  pip install -i ${PYPI} protobuf==${VERSION} --no-cache-dir
+  deactivate
+  rm -fr test-venv
+}
+
+
+[ $# -lt 1 ] && {
+  echo "Usage: $0 VERSION [DEV]"
+  echo ""
+  echo "Examples:"
+  echo "  Test 3.3.0 release using version number 3.3.0.dev1:"
+  echo "    $0 3.0.0 dev1"
+  echo "  Actually release 3.3.0 to PyPI:"
+  echo "    $0 3.3.0"
+  exit 1
+}
+VERSION=$1
+DEV=$2
+
+# Make sure we are in a protobuf source tree.
+[ -f "python/google/protobuf/__init__.py" ] || {
+  echo "This script must be run from the root of the protobuf source tree."
+  exit 1
+}
+
+# Make sure all files are world-readable.
+find python -type d -exec chmod a+r,a+x {} +
+find python -type f -exec chmod a+r {} +
+umask 0022
+
+# Check that the supplied version number matches what's inside the source code.
+SOURCE_VERSION=`get_source_version`
+
+[ "${VERSION}" == "${SOURCE_VERSION}" -o "${VERSION}.${DEV}" == "${SOURCE_VERSION}" ] || {
+  echo "Version number specified on the command line ${VERSION} doesn't match"
+  echo "the actual version number in the source code: ${SOURCE_VERSION}"
+  exit 1
+}
+
+TESTING_ONLY=1
+TESTING_VERSION=${VERSION}.${DEV}
+if [ -z "${DEV}" ]; then
+  read -p "You are releasing ${VERSION} to PyPI. Are you sure? [y/n]" -r
+  echo
+  if [[ ! $REPLY =~ ^[Yy]$ ]]; then
+    exit 1
+  fi
+  TESTING_ONLY=0
+  TESTING_VERSION=${VERSION}
+else
+  # Use dev version number for testing.
+  sed -i -r "s/__version__ = '.*'/__version__ = '${VERSION}.${DEV}'/" python/google/protobuf/__init__.py
+fi
+
+cd python
+
+# Run tests locally.
+python setup.py build
+python setup.py test
+
+# Deploy source package to testing PyPI
+python setup.py sdist upload -r https://test.pypi.org/legacy/
+
+# Test locally with different python versions.
+run_install_test ${TESTING_VERSION} python2.7 https://test.pypi.org/simple
+run_install_test ${TESTING_VERSION} python3.4 https://test.pypi.org/simple
+
+# Deploy egg/wheel packages to testing PyPI and test again.
+python setup.py bdist_egg bdist_wheel upload -r https://test.pypi.org/legacy/
+
+run_install_test ${TESTING_VERSION} python2.7 https://test.pypi.org/simple
+run_install_test ${TESTING_VERSION} python3.4 https://test.pypi.org/simple
+
+echo "All install tests have passed using testing PyPI."
+
+if [ $TESTING_ONLY -eq 0 ]; then
+  read -p "Publish to PyPI? [y/n]" -r
+  echo
+  if [[ ! $REPLY =~ ^[Yy]$ ]]; then
+    exit 1
+  fi
+  echo "Publishing to PyPI..."
+  # Be sure to run build before sdist, because otherwise sdist will not include
+  # well-known types.
+  python setup.py clean build sdist upload
+  # Be sure to run clean before bdist_xxx, because otherwise bdist_xxx will
+  # include files you may not want in the package. E.g., if you have built
+  # and tested with --cpp_implementation, bdist_xxx will include the _message.so
+  # file even when you no longer pass the --cpp_implementation flag. See:
+  #   https://github.com/google/protobuf/issues/3042
+  python setup.py clean build bdist_egg bdist_wheel upload
+else
+  # Set the version number back (i.e., remove dev suffix).
+  sed -i -r "s/__version__ = '.*'/__version__ = '${VERSION}'/" google/protobuf/__init__.py
+fi
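The `get_source_version` helper above extracts the version string from `__init__.py` with a grep/sed pipeline. A standalone sketch of the same extraction, using a temporary stand-in file instead of `python/google/protobuf/__init__.py` (assumes GNU sed for `-r`):

```shell
# Create a stand-in for python/google/protobuf/__init__.py.
tmpfile=$(mktemp)
printf "__version__ = '3.6.1'\n" > "$tmpfile"

# Same pipeline as get_source_version in release.sh: match the version
# assignment, then keep only the quoted value.
grep "__version__ = '.*'" "$tmpfile" | sed -r "s/__version__ = '(.*)'/\1/"

rm -f "$tmpfile"
```

The script compares this extracted value against the version passed on the command line before releasing.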
diff --git a/python/release/wheel/Dockerfile b/python/release/wheel/Dockerfile
new file mode 100644
index 0000000..f38ec2f
--- /dev/null
+++ b/python/release/wheel/Dockerfile
@@ -0,0 +1,6 @@
+FROM quay.io/pypa/manylinux1_x86_64
+
+RUN yum install -y libtool
+RUN /opt/python/cp27-cp27mu/bin/pip install twine
+
+COPY protobuf_optimized_pip.sh /
diff --git a/python/release/wheel/README.md b/python/release/wheel/README.md
new file mode 100644
index 0000000..edda2cd
--- /dev/null
+++ b/python/release/wheel/README.md
@@ -0,0 +1,17 @@
+Description
+------------------------------
+This directory is used to build released wheels according to PEP 513 and upload
+them to PyPI.
+
+Usage
+------------------------------
+For example, to release 3.3.0:
+    ./build_wheel_manylinux.sh 3.3.0 PYPI_USERNAME PYPI_PASSWORD
+
+Structure
+------------------------------
+| File                      | Description                                                                        |
+|---------------------------|------------------------------------------------------------------------------------|
+| build_wheel_manylinux.sh  | Entry point. Builds the docker image and runs protobuf_optimized_pip.sh inside it. |
+| Dockerfile                | Builds the docker image according to PEP 513.                                      |
+| protobuf_optimized_pip.sh | Builds the wheel packages inside the docker container.                             |
diff --git a/python/release/wheel/build_wheel_manylinux.sh b/python/release/wheel/build_wheel_manylinux.sh
new file mode 100755
index 0000000..39fd8c1
--- /dev/null
+++ b/python/release/wheel/build_wheel_manylinux.sh
@@ -0,0 +1,27 @@
+#!/bin/bash
+
+# Print usage and fail.
+function usage() {
+  echo "Usage: build_wheel_manylinux.sh PROTOBUF_VERSION PYPI_USERNAME PYPI_PASSWORD" >&2
+  exit 1   # Causes caller to exit because we use -e.
+}
+
+# Validate arguments.
+if [ $0 != ./build_wheel_manylinux.sh ]; then
+  echo "Please run this script from the directory in which it is located." >&2
+  exit 1
+fi
+
+if [ $# -lt 3 ]; then
+  usage
+  exit 1
+fi
+
+PROTOBUF_VERSION=$1
+PYPI_USERNAME=$2
+PYPI_PASSWORD=$3
+
+docker rmi protobuf-python-wheel
+docker build . -t protobuf-python-wheel
+docker run --rm protobuf-python-wheel ./protobuf_optimized_pip.sh $PROTOBUF_VERSION $PYPI_USERNAME $PYPI_PASSWORD
+docker rmi protobuf-python-wheel
diff --git a/python/release/wheel/protobuf_optimized_pip.sh b/python/release/wheel/protobuf_optimized_pip.sh
new file mode 100755
index 0000000..98306f4
--- /dev/null
+++ b/python/release/wheel/protobuf_optimized_pip.sh
@@ -0,0 +1,66 @@
+#!/usr/bin/env bash
+
+# DO NOT use this script manually! Called by docker.
+
+set -ex
+
+# Print usage and fail.
+function usage() {
+  echo "Usage: protobuf_optimized_pip.sh PROTOBUF_VERSION" >&2
+  exit 1   # Causes caller to exit because we use -e.
+}
+
+# Build wheel
+function build_wheel() {
+  PYTHON_VERSION=$1
+  PYTHON_BIN=/opt/python/${PYTHON_VERSION}/bin/python
+
+  $PYTHON_BIN setup.py bdist_wheel --cpp_implementation --compile_static_extension
+  auditwheel repair dist/protobuf-${PROTOBUF_VERSION}-${PYTHON_VERSION}-linux_x86_64.whl
+}
+
+# Validate arguments.
+if [ $0 != ./protobuf_optimized_pip.sh ]; then
+  echo "Please run this script from the directory in which it is located." >&2
+  exit 1
+fi
+
+if [ $# -lt 1 ]; then
+  usage
+  exit 1
+fi
+
+PROTOBUF_VERSION=$1
+PYPI_USERNAME=$2
+PYPI_PASSWORD=$3
+
+DIR=${PWD}/'protobuf-python-build'
+PYTHON_VERSIONS=('cp27-cp27mu' 'cp33-cp33m' 'cp34-cp34m' 'cp35-cp35m' 'cp36-cp36m')
+
+mkdir -p ${DIR}
+cd ${DIR}
+curl -SsL -O https://github.com/google/protobuf/archive/v${PROTOBUF_VERSION}.tar.gz
+tar xzf v${PROTOBUF_VERSION}.tar.gz
+cd $DIR/protobuf-${PROTOBUF_VERSION}
+
+# Autoconf on centos 5.11 cannot recognize AC_PROG_OBJC.
+sed -i '/AC_PROG_OBJC/d' configure.ac
+sed -i 's/conformance\/Makefile//g' configure.ac
+
+# Use the /usr/bin/autoconf and related tools to pick the correct aclocal macros
+export PATH="/usr/bin:$PATH"
+
+# Build protoc
+./autogen.sh
+CXXFLAGS="-fPIC -g -O2" ./configure
+make -j8
+export PROTOC=$DIR/src/protoc
+
+cd python
+
+for PYTHON_VERSION in "${PYTHON_VERSIONS[@]}"
+do
+  build_wheel $PYTHON_VERSION
+done
+
+/opt/python/cp27-cp27mu/bin/twine upload wheelhouse/*
diff --git a/python/setup.cfg b/python/setup.cfg
new file mode 100644
index 0000000..2a9acf1
--- /dev/null
+++ b/python/setup.cfg
@@ -0,0 +1,2 @@
+[bdist_wheel]
+universal = 1
diff --git a/python/setup.py b/python/setup.py
index 6ea3bad..a9df075 100755
--- a/python/setup.py
+++ b/python/setup.py
@@ -5,6 +5,7 @@
 import os
 import subprocess
 import sys
+import platform
 
 # We must use setuptools, not distutils, because we need to use the
 # namespace_packages option for the "google" package.
@@ -76,7 +77,11 @@
       sys.exit(-1)
 
 def GenerateUnittestProtos():
+  generate_proto("../src/google/protobuf/any_test.proto", False)
+  generate_proto("../src/google/protobuf/map_proto2_unittest.proto", False)
   generate_proto("../src/google/protobuf/map_unittest.proto", False)
+  generate_proto("../src/google/protobuf/test_messages_proto3.proto", False)
+  generate_proto("../src/google/protobuf/test_messages_proto2.proto", False)
   generate_proto("../src/google/protobuf/unittest_arena.proto", False)
   generate_proto("../src/google/protobuf/unittest_no_arena.proto", False)
   generate_proto("../src/google/protobuf/unittest_no_arena_import.proto", False)
@@ -94,6 +99,7 @@
   generate_proto("google/protobuf/internal/descriptor_pool_test2.proto", False)
   generate_proto("google/protobuf/internal/factory_test1.proto", False)
   generate_proto("google/protobuf/internal/factory_test2.proto", False)
+  generate_proto("google/protobuf/internal/file_options_test.proto", False)
   generate_proto("google/protobuf/internal/import_test_package/inner.proto", False)
   generate_proto("google/protobuf/internal/import_test_package/outer.proto", False)
   generate_proto("google/protobuf/internal/missing_enum_values.proto", False)
@@ -101,6 +107,7 @@
   generate_proto("google/protobuf/internal/more_extensions.proto", False)
   generate_proto("google/protobuf/internal/more_extensions_dynamic.proto", False)
   generate_proto("google/protobuf/internal/more_messages.proto", False)
+  generate_proto("google/protobuf/internal/no_package.proto", False)
   generate_proto("google/protobuf/internal/packed_field_test.proto", False)
   generate_proto("google/protobuf/internal/test_bad_identifiers.proto", False)
   generate_proto("google/protobuf/pyext/python.proto", False)
@@ -113,9 +120,7 @@
       for filename in filenames:
         filepath = os.path.join(dirpath, filename)
         if filepath.endswith("_pb2.py") or filepath.endswith(".pyc") or \
-          filepath.endswith(".so") or filepath.endswith(".o") or \
-          filepath.endswith('google/protobuf/compiler/__init__.py') or \
-          filepath.endswith('google/protobuf/util/__init__.py'):
+          filepath.endswith(".so") or filepath.endswith(".o"):
           os.remove(filepath)
     # _clean is an old-style class, so super() doesn't work.
     _clean.run(self)
@@ -137,12 +142,6 @@
     generate_proto("../src/google/protobuf/wrappers.proto")
     GenerateUnittestProtos()
 
-    # Make sure google.protobuf/** are valid packages.
-    for path in ['', 'internal/', 'compiler/', 'pyext/', 'util/']:
-      try:
-        open('google/protobuf/%s__init__.py' % path, 'a').close()
-      except EnvironmentError:
-        pass
     # _build_py is an old-style class, so super() doesn't work.
     _build_py.run(self)
 
@@ -157,33 +156,78 @@
     status = subprocess.check_call(cmd, shell=True)
 
 
+def get_option_from_sys_argv(option_str):
+  if option_str in sys.argv:
+    sys.argv.remove(option_str)
+    return True
+  return False
+
+
 if __name__ == '__main__':
   ext_module_list = []
-  cpp_impl = '--cpp_implementation'
   warnings_as_errors = '--warnings_as_errors'
-  if cpp_impl in sys.argv:
-    sys.argv.remove(cpp_impl)
-    extra_compile_args = ['-Wno-write-strings', '-Wno-invalid-offsetof']
+  if get_option_from_sys_argv('--cpp_implementation'):
+    # Link libprotobuf.a and libprotobuf-lite.a statically with the
+    # extension. Note that those libraries have to be compiled with
+    # -fPIC for this to work.
+    compile_static_ext = get_option_from_sys_argv('--compile_static_extension')
+    libraries = ['protobuf']
+    extra_objects = None
+    if compile_static_ext:
+      libraries = None
+      extra_objects = ['../src/.libs/libprotobuf.a',
+                       '../src/.libs/libprotobuf-lite.a']
     test_conformance.target = 'test_python_cpp'
 
+    extra_compile_args = []
+
+    if sys.platform != 'win32':
+      extra_compile_args.append('-Wno-write-strings')
+      extra_compile_args.append('-Wno-invalid-offsetof')
+      extra_compile_args.append('-Wno-sign-compare')
+
+    # https://github.com/Theano/Theano/issues/4926
+    if sys.platform == 'win32':
+      extra_compile_args.append('-D_hypot=hypot')
+
+    # https://github.com/tpaviot/pythonocc-core/issues/48
+    if sys.platform == 'win32' and '64 bit' in sys.version:
+      extra_compile_args.append('-DMS_WIN64')
+
+    # MSVC defaults to the dynamic runtime; use /MT to link it statically.
+    if sys.platform == 'win32':
+      extra_compile_args.append('/MT')
+
     if "clang" in os.popen('$CC --version 2> /dev/null').read():
       extra_compile_args.append('-Wno-shorten-64-to-32')
 
+    v, _, _ = platform.mac_ver()
+    if v:
+      extra_compile_args.append('-std=c++11')
+    elif os.getenv('KOKORO_BUILD_NUMBER') or os.getenv('KOKORO_BUILD_ID'):
+      extra_compile_args.append('-std=c++11')
+
     if warnings_as_errors in sys.argv:
       extra_compile_args.append('-Werror')
       sys.argv.remove(warnings_as_errors)
 
     # C++ implementation extension
-    ext_module_list.append(
+    ext_module_list.extend([
         Extension(
             "google.protobuf.pyext._message",
             glob.glob('google/protobuf/pyext/*.cc'),
             include_dirs=[".", "../src"],
-            libraries=['protobuf'],
+            libraries=libraries,
+            extra_objects=extra_objects,
             library_dirs=['../src/.libs'],
             extra_compile_args=extra_compile_args,
-        )
-    )
+        ),
+        Extension(
+            "google.protobuf.internal._api_implementation",
+            glob.glob('google/protobuf/internal/api_implementation.cc'),
+            extra_compile_args=extra_compile_args + ['-DPYTHON_PROTO2_CPP_IMPL_V2'],
+        ),
+    ])
     os.environ['PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION'] = 'cpp'
 
   # Keep this list of dependencies in sync with tox.ini.
@@ -196,15 +240,15 @@
       name='protobuf',
       version=GetVersion(),
       description='Protocol Buffers',
+      download_url='https://github.com/google/protobuf/releases',
       long_description="Protocol Buffers are Google's data interchange format",
       url='https://developers.google.com/protocol-buffers/',
       maintainer='protobuf@googlegroups.com',
       maintainer_email='protobuf@googlegroups.com',
-      license='New BSD License',
+      license='3-Clause BSD License',
       classifiers=[
         "Programming Language :: Python",
         "Programming Language :: Python :: 2",
-        "Programming Language :: Python :: 2.6",
         "Programming Language :: Python :: 2.7",
         "Programming Language :: Python :: 3",
         "Programming Language :: Python :: 3.3",
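The `get_option_from_sys_argv` helper introduced in the setup.py hunk strips a custom flag out of `sys.argv` before setuptools parses the command line, so distutils never sees options it doesn't understand. A self-contained sketch of that pattern:

```python
import sys

def get_option_from_sys_argv(option_str):
    # Remove the flag so setuptools never sees it, and report whether
    # it was present. Mirrors the helper added to setup.py above.
    if option_str in sys.argv:
        sys.argv.remove(option_str)
        return True
    return False

# Simulated command line, as setup.py would see it.
sys.argv = ['setup.py', 'build', '--cpp_implementation']
print(get_option_from_sys_argv('--cpp_implementation'))  # → True
print(get_option_from_sys_argv('--cpp_implementation'))  # → False (already removed)
print(sys.argv)  # → ['setup.py', 'build']
```

Because the flag is removed on first sight, a second query returns False, and the remaining argv is safe to hand to `setup()`.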
diff --git a/python/tox.ini b/python/tox.ini
index cf8d540..4eb319d 100644
--- a/python/tox.ini
+++ b/python/tox.ini
@@ -1,18 +1,20 @@
 [tox]
 envlist =
-    py{26,27,33,34}-{cpp,python}
+    py{27,33,34,35,36}-{cpp,python}
 
 [testenv]
 usedevelop=true
-passenv = CC
+passenv = 
+    CC KOKORO_BUILD_ID KOKORO_BUILD_NUMBER
 setenv =
     cpp: LD_LIBRARY_PATH={toxinidir}/../src/.libs
     cpp: DYLD_LIBRARY_PATH={toxinidir}/../src/.libs
     cpp: PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp
+    python: PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python
 commands =
     python setup.py -q build_py
     python: python setup.py -q build
-    cpp: python setup.py -q build --cpp_implementation --warnings_as_errors
+    cpp: python setup.py -q build --cpp_implementation --warnings_as_errors --compile_static_extension
     python: python setup.py -q test -q
     cpp: python setup.py -q test -q --cpp_implementation
     python: python setup.py -q test_conformance