This article continues the previous one, "Developing ASP.NET vNext with Visual Studio 2014 CTP1".
About cloud optimization and Version Control:
I wanted to run a self-host test on Mac and Linux, but the official docs say the runtime requires at least Mono 3.4.1. As of this writing that version is nowhere to be found, so where exactly am I supposed to dig it up? I spent an evening with 3.4.0 instead: on the Mac it got stuck on an httpapi.dll error, and Ubuntu Server 12.04 failed to recognize several of the DLL packages (I forget which ones). I will try again once there is a stable release.
Still, I think I now roughly understand what "cloud optimization" means. Here are the overall steps for running a project in a non-Windows environment:
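In rough outline it looks like the sketch below. This is only a sketch based on the kvm/kpm/k tooling documented in the aspnet/Home repository at the time; the exact commands may differ between alpha builds, and running "k web" assumes project.json defines a "web" command.

# 1. Install the K version manager (kvm) following https://github.com/aspnet/home,
#    then install/refresh a K runtime
kvm upgrade

# 2. Copy the application (project.json plus source) to the target machine

# 3. Restore packages for this machine; each OS pulls down its own set of packages
kpm restore

# 4. Run the application (assumes a "web" command in project.json's "commands" section)
k web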
In step 3 (the package restore), different systems pull down different sets of packages; this is what "cloud optimization" refers to.
If you need to update package versions later, you run kpm upgrade; that is still a manual step. The essence of ".NET 4.5 Core" is that all the packages the runtime needs, which used to be loaded from the system, are now stored in the application's own directory and loaded and run from there. It is effectively a "portable edition" of ASP.NET. In the past some features simply would not work by dropping a DLL into bin; they had to be installed at the system level. Now you can be 100% sure that everything you need can be satisfied by deploying the DLLs alongside the application. The practical significance: AWS and Azure Linux hosts do not let us run as root, and there are always things that even sudo cannot install; this removes exactly that kind of obstacle. By now you can see what I am getting at: we can deploy different versions of the same application into different directories, bind them to different domain names, and keep their runtime contexts completely isolated, which means each application can run on its own runtime version.
Concerns about loading packages from NuGet:
Nothing is fetched from NuGet automatically on first startup, nor on any later restart; every NuGet-related action has to be triggered manually by you. During a restore, the application only runs against the new packages once 100% of them have downloaded successfully; otherwise it keeps what it had. An upgrade likewise has to be 100% successful, or it is rolled back. You can think of it as a "transaction": package loading will never leave your application in a half-updated state.
Also, "NuGet" here does not have to mean the public feed. By default packages are pulled from the official NuGet server, but kpm restore and kpm upgrade can be pointed at different source addresses; if you want, you can even point them at 127.0.0.1. So there is no need to worry about deployment.
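For example (this assumes the kpm build in hand exposes a --source switch the way later builds of the tooling did; run kpm restore --help to confirm on your version):

# Restore from the official NuGet feed (the default)
kpm restore

# Restore from a private feed instead, e.g. one hosted on the local machine
kpm restore --source http://127.0.0.1/nuget/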
About VS2014:
By Microsoft's unwritten convention, while a component is still in preview its namespace lives under Microsoft.XXX.XXX, and once the official version ships it moves to System.XXX.XXX. Right now many of the packages are still under Microsoft.
I only managed to install VS2014 CTP1 after uninstalling VS2013, 2012, and 2010 from my machine. Whether because I uninstalled something I should not have or because of VS2014 CTP1 itself, the SQL Server on my system died a heroic death; a repair/reinstall of SQL Server brought it back. Although the CTP officially cannot coexist with older versions of VS on the same machine, I then installed VS2013 Update 2 again and everything works fine. (This is risky, so do not follow my example; I take no responsibility for the consequences.)
Sometimes everything is fine while editing a project, but after you save, close VS2014, and reopen the project, its reference list gets stuck in the "loading" state and stays that way even after several restarts. Who knows why.
Alright. Now that the times have moved on to vNext, it would hardly be right not to look at EF 7.0:
Entity Framework 7.0: leaner, and the hated Migration is finally gone
(Entity Framework 7.0 does not depend on vNext; it can also run with MVC 1 through 5.x.)
I created a database for the demonstration; let's take a look.
In the past, the biggest problem with Migration was not the migration step itself, but the error that popped up from time to time: "a table or object with the same name already exists." The workarounds were all strange; some even required editing the files Migration generated automatically. You might risk that in your own projects, but would you dare do it in a real large-scale, live production environment?
Previously, a hash of each table's structure was recorded in the migration history table. If you modified a table's structure by hand, Entity Framework would still throw an error even when the schema actually matched, because the hash no longer did; it trusted only its own migrations. You could work around it by disabling the schema check at application startup, but it always felt shaky, the kind of thing you brag about and then worry will come back to bite you. Now that mandatory Migration is gone, you can modify the table structure by hand and keep it in sync with the Code First model yourself, and EF will no longer get in your way.
This brings another benefit: dropping the feature also removes a lot of development and testing work, so the EF team can focus on EF's core functionality, and third-party database vendors can release and update Entity Framework providers for their own databases much faster.
You can still use Migration if you want; it has been split out into the Microsoft.Data.Entity.Migration namespace. I have not looked into how to use it.
Previously, Entity Framework would automatically check at runtime whether the database existed and, if not, create it along with the corresponding tables and other structures. Now you have to invoke this yourself, and "create the database" and "create the tables" are two separate operations, like this:
/// <summary>
/// Database operations class
/// </summary>
public class DBOperator : DbContext
{
    public DBOperator()
    {
        if (!Database.Exists())
        {
            Database.Create();       // create the database
            Database.CreateTables(); // create the tables
        }
    }

    public DbSet<Customers> Customers { get; set; }

    protected override void OnConfiguring(DbContextOptions builder)
    {
        builder.UseSqlServer(@"Server=HP\SQLSERVER;Database=vNextTest;Trusted_Connection=True;MultipleActiveResultSets=true");
    }
}
In versions 1.x through 6.x you could leave this constructor out entirely and the databases and tables would still be created automatically; not anymore. If you delete the constructor in the example above, the database is never created. That also means the if (!Database.Exists()) check would otherwise run on every instantiation, so to keep maximum performance we can split things up with a bit of polymorphism:
public abstract class DBO : DbContext
{
    public DbSet<Customers> Customers { get; set; }

    protected override void OnConfiguring(DbContextOptions builder)
    {
        builder.UseSqlServer(@"Server=HP\SQLSERVER;Database=vNextTest;Trusted_Connection=True;MultipleActiveResultSets=true");
    }
}

// Use this class during normal operation; it skips the unnecessary existence check.
public sealed class DBOperator : DBO
{
}

// Use this class only when the system runs for the first time.
// Put it in Startup.cs and execute it once.
public sealed class FirstTime : DBO
{
    public FirstTime()
    {
        if (!Database.Exists())
        {
            Database.Create();       // create the database
            Database.CreateTables(); // create the tables
        }
    }
}
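For example, the first-run check can live in Startup.cs along these lines. This is a minimal sketch: the Configure signature and the Microsoft.AspNet.Builder namespace follow the CTP1 template as far as I recall, and the MVC/service registration the template already generates is omitted.

using Microsoft.AspNet.Builder;

public class Startup
{
    public void Configure(IBuilder app)
    {
        // Run the database/table existence check exactly once, at application startup;
        // constructing FirstTime is enough, its constructor does all the work.
        using (new FirstTime())
        {
        }

        // ... the rest of the pipeline (service registration, UseMvc, etc.)
        // stays exactly as the project template generated it.
        // Everywhere else in the application, use DBOperator, which skips the check.
    }
}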
Set project. json and enable Entity:
{"Dependencies": {"Helios": "0.1-alpha-build-0585", "Microsoft. aspNet. staticFiles ":" 0.1-alpha-build-0443 "," Microsoft. aspNet. mvc ":" 0.1-alpha-build-1268 "," Microsoft. aspNet. server. webListener ":" 0.1-alpha-build-0520 "," Microsoft. data. entity ":" 0.1-alpha-build-* ", // enable Entity" Microsoft. data. entity. sqlServer ":" 0.1-alpha-build-0863 ", // enable SQL Server" Microsoft. dataAnnotations ":" 0.1-alpha-build-0100 ", // enable DataAnnotations" Microsoft. framework. configurationModel ":" 0.1-alpha-build-0233 "// enable configuration file operations}," deployments ": {" net45 ":{}," k10 ":{}}}
Other operations such as Add and SaveChanges work the same as before.
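For instance (the Name property on Customers is assumed purely for illustration; the point is only that Add and SaveChanges are unchanged):

using (var db = new DBOperator())
{
    db.Customers.Add(new Customers { Name = "Test customer" }); // Name is a hypothetical property
    db.SaveChanges();
}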
Fellow sufferers can say goodbye to Migration from now on.
vNext running in self-host mode and IIS mode, compared with traditional MVC 5.x
Below is a simple stress test of vNext using ab (Apache Bench), mainly to get a rough feel for throughput. The controller computes 30 iterations of the Fibonacci sequence, and the view is essentially empty, outputting only simple HTML. The client simulates 1000 concurrent users for a total of 10000 requests. Before each test run, one extra request is made through IE to exclude first-startup cost.
public IActionResult Index()
{
    var x = new int[30];
    var length = x.Length;
    for (int n = 0; n < 30; n++)
    {
        switch (n)
        {
            case 0:
                x[n] = 1;
                break;
            case 1:
                x[n] = 1;
                break;
            default:
                x[n] = x[n - 1] + x[n - 2];
                break;
        }
        this.Context.Response.WriteAsync(x[n].ToString() + "<br />");
    }
    return View();
}
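For reference, the ab invocation for this scenario looks roughly like the following; the URL and port are placeholders for wherever the site happens to be listening.

ab -n 10000 -c 1000 http://localhost:5000/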
The chart says it all; I just hope the official release does better.
Unsolved Problems
I really could not get the *nix environment configured. If anyone more experienced feels like tinkering with it, see the official instructions: https://github.com/aspnet/home