A lot of people in this thread are crapping on Microsoft, but that's not to say there isn't some valid criticism here.
Microsoft is trying to unscrew the single-platform nature of dotnet. This is going to take some time. They are having to break bones to set them correctly. Let's not pretend that none of us devs have ever inherited an old project we had to fix, leaving a wake of destruction in the process.
That said, shame on them for not listening to the community more. Many, many people on GitHub were very vocal about the project formats and the build system, and it was obvious that Microsoft had made up its mind.
While I'm a huge fan of dotnet, I also have to chastise Microsoft here. You can't leap to open source and then crap on your contributors when they are telling you that you're making a blind decision. If it was the one contrarian person that always complains about everything, I could see it, but it wasn't. Very notable members in the dotnet ecosystem raised the warning flag on this.
I for one kept with the mindset of not building anything for production with their v1 release. This is an old rule. By v2, I'm betting this thing will be awesome. Hell it already is. Let's not throw the baby out with the bathwater.
> Many, many people on Github were very vocal about the project formats and the build system...
I have not been following the project. Can somebody fill me in on what the people on GitHub were very vocal about specifically? Did they want JSON instead of XML or vice versa? Or something else?
> That said, shame on them for not listening to the community more. Many, many people on GitHub were very vocal about the project formats and the build system, and it was obvious that Microsoft had made up its mind.
It seems to me they have been listening to the community, but some of what the community wants just isn't feasible, or isn't feasible at this time. This update in particular demonstrates that Microsoft is still trying to improve the experience for everyone.
Well, in my opinion they actually do need to have strong opinions, because it's not "just another node framework". They want to target the enterprise sooner rather than later, and it doesn't hurt to have a huge megacorp backing the product; of course, that also gives their word a lot more weight.
Also, I don't see the problem with the changes; they announced they were moving away from JSON back to XML about 3 months ago, so it's not like this is news.
Also, I like XML much better for project files because IMO it's much more readable.
I like the XML format better too. It's also easier* to parse if you use XPath, in case you need to quickly read or write some of the values for any reason.
* Except when a default namespace is involved, then XPath expressions get more complicated. But according to that blog, it looks like the VS team ditched the default xmlns="http://schemas.microsoft.com/developer/msbuild/2003" namespace in the <Project> root :)
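For instance, a minimal sketch of reading and writing a value with XPath, assuming a hypothetical MyApp.csproj in the new namespace-free format (the file name and the TargetFramework property are just example values, not anything from the blog):

using System;
using System.Xml;

class CsprojPeek
{
    static void Main()
    {
        // Load a hypothetical SDK-style project file (no default xmlns on <Project>).
        var doc = new XmlDocument();
        doc.Load("MyApp.csproj");

        // Plain XPath works because there is no default namespace to map.
        var node = doc.SelectSingleNode("/Project/PropertyGroup/TargetFramework");
        Console.WriteLine(node?.InnerText);

        // Writing a value back is just as direct.
        if (node != null)
        {
            node.InnerText = "netcoreapp1.1";
            doc.Save("MyApp.csproj");
        }
    }
}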
> Microsoft is trying to unscrew the single platform nature of dotnet.
This is their largest effort yet, and the longest stretch without overt attempts at screwing people into being customers. I am going to hold out for some more time; I am not yet convinced Microsoft is acting in good faith, their history is so bad.
Are they still suing companies for distributing Android?
Yes, it's always worth waiting until v2 (or at least the first service pack / patch) before using something in anger, but experimenting in non-critical areas is fine. This applies to almost all software, but particularly the core stuff like OS and DB. I believe this is why the first Oracle DB was "v2". I'm not sure many people were fooled by this trick.
Stop. Before moaning about JSON vs XML, consider that the sln file is a non-standard external DSL that's existed since the early nineties. Now that's something I'd like to see replaced.
XML is actually fine if used sensibly, and better than JSON in some respects (date format etc.). I think people have a bad taste because of some excessive uses of XML (hello SOAP!) and namespace / schema issues.
{
  "comment-version": "Tells the version of the file",
  "version": "1.0.0-*",
  "buildOptions": {
    "comment-debugType": "What kind of debug type would you like?",
    "debugType": "portable",
    "emitEntryPoint": true
  }
}
Those aren't comments. That's all live data, sent over the wire, using up bandwidth and resources. It probably wouldn't matter for this project file, but you really shouldn't do that in JSON.
Correct, but I believe what they may mean is that the data will be parsed in an attempt to deserialize it. This may have unintended side effects. For example, you now have to make sure you never add a property with that name to your POCO.
The wire isn't that relevant here. This is about a config file, not an API. Size isn't too important.
This can be solved by sanitizing it prior to parsing and removing any property whose name starts with 'comment'. I know it's not a pretty solution, but it works. It's an additional step, but some people might accept that if it means working with a data format they prefer.
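A rough sketch of that sanitizing step, assuming Newtonsoft.Json as the parser and a 'comment' prefix as the convention (both are assumptions, not anything the tooling actually does):

using System.Linq;
using Newtonsoft.Json.Linq;

static class JsonCommentStripper
{
    // Removes every property whose name starts with "comment" before real deserialization.
    public static string Strip(string json)
    {
        var root = JObject.Parse(json);

        foreach (var obj in root.DescendantsAndSelf().OfType<JObject>())
        {
            foreach (var prop in obj.Properties()
                                    .Where(p => p.Name.StartsWith("comment"))
                                    .ToList())
            {
                prop.Remove();
            }
        }

        return root.ToString();
    }
}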
Sounds like a great way to get a bug report in a few years' time (once you've forgotten about this feature) along the lines of: Property names starting with "comment" fail to deserialize. :)
I still fail to see in what way JSON is better than XML, other than that it's maybe a little prettier and doesn't let people go as crazy as they did with XML schemas. We should at least have replaced XML with something genuinely better.
That aside, csproj and sln files really suck. I want my good old makefiles back!
No, this is pretty much all new. But there are plenty of places in the tooling besides the project file where you list off the target frameworks (e.g. NuGet). I assume they're all a semicolon-separated list, making this consistent across everything.
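For reference, the multi-targeting form in the new csproj looks roughly like this (the framework monikers are just example values):

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Plural element, semicolon-separated list of target frameworks. -->
    <TargetFrameworks>netstandard1.6;net461</TargetFrameworks>
  </PropertyGroup>
</Project>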
Glad to see a more unified approach, particularly with .NET Standard and Xamarin [0]. I've been experimenting with Xamarin.Forms recently and put together a little mobile game. There are a couple of blog posts coming on that, including the whole shared projects / PCLs / .NET Standard thing.
Hope this csproj is the final breaking change on this! I am not sure any other project has gone through so many changes from the initial release. It was called kvm, kre, klr [1] -> dnx, dnu, dnvm [2] -> coreclr, corefx -> changes from project.json to the csproj system. I started writing a gitbook [3] on ASP.NET Core when this was in the RC phase, but everything I learned or wrote had to be rewritten. I did not merge my dev branch to master for whatever new material I was writing, and eventually I stopped. I am wondering what will happen to all the books currently published where most of the book refers to project.json.
I am currently working on two dotnet core projects (both ASP.NET Core) and my only complaint has been the tooling. Visual Studio talking in terms of solutions and the dotnet CLI talking in terms of projects means that I have `launch.{sh|ps1}` scripts for those not using VS, and launchOptions.json/the sln file for those working in VS. It feels like something that should have been sorted out, even if it took a new `dotnet startup` command to emulate Visual Studio's StartUp projects. Entity Framework Core tooling, especially with SQLite, has also caused some issues around migration handling (in that there basically is no migration handling for SQLite, forcing you to write a custom script to nuke the database and migrations every time you make a change).
So...when do folks think it will be safe to use this? Often 1.0 is reserved for "we'll try to make less painful interface breaking changes after this release". But if this is any indication, we're still a ways off from API stability.
This doesn't change anything with the API, and .NET Standard 2.0, coming soon, will add back several missing APIs to .NET Core that exist in the full .NET Framework.
.NET Core is safe to use right now, it's the IDE and command line utilities getting finalized for 1.0 release (of the tooling) so that it's easier to work with.
The API is stable and worth building for. The remaining upheaval seems to be in tooling: how Visual Studio works with .NET Core projects and how command line/Linux/OS X works with projects.
The existing tooling is decent enough that it is "safe" to use. This blog post is a reminder that the update to the more final tooling should be fairly seamless (VS will handle it for you automatically like any other project upgrade/migration; `dotnet migrate` will do the job for the CLI and other platforms) and is on its way.
Ok, thank you, but that doesn't seem to be the case here, because this doesn't appear to be auto-posted by some HN bot. Normally when I try to post an article that has already been posted in the last 24 hours, Hacker News redirects me to the original post, so I was surprised how this could be duplicated so quickly after the original. Also, why does my question get downvoted? Was I wrong to ask it, or do people feel offended by the question?
Those auto-reposted submissions look like the original submitter resubmitted them. You can usually see that there are conflicting timestamps: some pages show the original time, some the resubmission time.
I don't know if that was the case here. It may simply be that it was a normal submission and the URL duplicate detection didn't work. That's hard to get right; if you're too strict you probably get a lot of false positives.
Questioning downvotes never leads to any real answer, so it's useless at best. And it invites further downvotes, because it's unproductive (and can seem whiny – not in your case, though).
I am regularly astounded which of my comments get many downvotes, and also which of them get many upvotes. Also sometimes comments start with -4 and get to +8 or vice versa. It's best to shrug it off entirely.
What a disaster. The tail wagged the dog in the .NET Core team. Every step of the way it became clearer that this was a pet project of some ivory-tower programmers and that we would be left to patch together this mess. Try adding a .NET Core project on your build server. You'll hate yourself by the time you get it building.
Really? We haven't had any problems with our .NET Core projects (a few web apps running on core and a bunch of libraries targeting core). In addition, ASP.NET Core (on .NET core) has been a great stack so far (using it since beta5); I feel very productive with it as well as enjoying the development experience. Why do people need to get fired for this? It doesn't seem like it will be hard to move from project.json back to (a new and improved) csproj.
Yes, really, and I am sure there are some people who have had no problems. That doesn't invalidate the countless hours I spent making this dreadful system work. Did you put it on your build server without installing Visual Studio on it?
Our build server (TeamCity) does not have VS installed.
I can actually tell you the exact software installed on the build server:
1.) Windows Server 2016 (Base OS)
2.) SQL Server 2016 Express with LocalDB only (used for longer integration tests that need to validate migrations)
3.) DotNetCore.1.0.1-SDK.1.0.0.Preview2-003133-x64.exe (.NET core SDK 1.0.1 download from https://www.microsoft.com/net/download)
4.) BuildTools_Full.exe ("Microsoft Build Tools 2015 Update 3" https://www.visualstudio.com/downloads/#d-build-tools). If you have this, then you don't need the full VS install
5.) NodeJs (for running webpack as part of 'dotnet publish')
6.) TeamCity and Octopus Deploy agents
>Did you put it on your build server without installing Visual Studio on it?
Huh?
`docker run microsoft/dotnet`. Took about 2 seconds to get it running on my Jenkins build server. In reality, it took literally no effort. My entire full stack app requires `make` and `docker` to build and nothing else. The backend is an AspNetCore API.
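For the curious, a rough sketch of the kind of Dockerfile that gets you there (the untagged image, the paths and the MyApi.dll name are placeholders, not the setup described above):

FROM microsoft/dotnet
WORKDIR /app

# Restore first so the dependency layer gets cached between builds.
COPY MyApi.csproj ./
RUN dotnet restore

# Then copy the sources and publish a release build.
COPY . .
RUN dotnet publish -c Release -o out

ENTRYPOINT ["dotnet", "out/MyApi.dll"]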
Everyone's been having problems with them; how can you have had none?
They seem to massively change something every 10 minutes.
I also don't get the 'massively productive' comment; in essence very little has changed since, hell, even MVC 3, apart from the obsession with DI, and a load of packages now don't work. If you weren't 'massively productive' before, you were probably doing something wrong. They just moved a couple of things around and now insist you use npm/bower/whatever hotness they latch onto next week instead of NuGet. I'm sure we'll have another blog post about them ditching support for anything but yarn next week.
And how long before they change their minds about how controllers work again? It's been like 4 times in the last 5 years now. First MVC controllers, then Web API, then Web API 2, then OData, then .NET Core, all very similar but slightly different, with slightly different ways to register them at startup and slightly different things available on the context. Not to mention the utter madness that is OData: great for data tables, bloody awful for everything else.
It's getting almost as bad as javascript churn but it's one company so it doesn't make any sense.
It changed during the pre-1.0 era when the team were very clear about the potential for changes. The big changes were made with plenty of notice and short term backwards compatibility baked in. MVC 5 is still there and still being supported - deliberately choosing to use Core should have been a calculated decision based on what you needed and appetite for risk. If there's a bad influence from the world of JS it seems much more around devs deciding to go into production with "the new hotness".
Moving to npm/bower for front end stuff makes perfect sense, NuGet is a horrible way to deliver client side code, wasn't supported by the majority of library producers and was unknown by anyone who didn't have a .Net background.
I expected changes at the beta stage. I did not expect the major changes between RC1 and RC2. Try writing a book against a moving target! :)
I think all the changes now are due to that late switch of direction. RC2 went straight to 1.0 but it was a major change and should probably have had some more beta/RC releases.
It's not just change. It's difficult to make it work for a single version. It was difficult back when the build tools were a handful of powershell scripts. It was difficult when they moved to the dotnet tooling.
Microsoft shouldn't push out an RTM product and call the broken bits "Preview" to absolve themselves of the responsibility to deliver a reliable product.
I was evangelical about .NET Core. I stuck with it for years. I stuck with it when Damian admitted that he "doesn't build web apps". I stuck with it during the Release Candidates. It's only when they pushed this out the door and called it RTM that the penny dropped and I accepted the project had been mismanaged.
They announced all of this before .NET Core went RTM. Not only that, the tooling was never made RTM (still in preview). If you jumped on the bandwagon that early then you had all of this information and knew what was coming.
You should just wait until both the framework and tooling are RTM if that's the stability you need - the framework is already great but if you need the tooling then wait until it's ready.
Why complain if you understood the status and assumed the risk?
If you knew anything of the traditional release history for Microsoft you wouldn't have commented. All your post demonstrates is total ignorance of the context or history.
It was previously totally fine to use RCs for the last decade, probably way before that too, but I only have experience from then. RC in MS speak meant "pretty much finished, API won't change unless we find something desperately wrong".
Then it suddenly means basically pre-alpha, "anything can change, massively, between RC versions".
> They seem to massively change something every 10 minutes.
> in essence very little has changed since
?
The one thing Microsoft is great at is compatibility. Your existing projects can keep using the older frameworks, and they're still supported and will run just fine. What is the issue? If it's about learning the new changes, then it's probably best to wait until everything (framework + tooling) is final so you only have to learn once.
I honestly have no idea what point you're trying to make.
The problem is that it's hard to understand what does what and how it does it. Things in even the same project behave differently. They introduce massive architectural changes and then abandon them. Searching for something on SO is now a crapshoot: which slightly incorrect answer will you get, because the ASP.NET team changed their mind again?
And you can have all these things running in one project, all acting differently.
I work with clients on MVC 3, 4 and 5, with parts in Web API 1 and 2 and parts in OData.
And they all behave differently.
For example, and this is just one of many: a JSON date from an MVC controller is 'Date(173737273)', from a Web API it's '2016-12-26 23:00:00.0000', and from an OData controller it's '2016-12-26 23:00:00.0000Z'. That Z changes behaviour, btw, and makes the terrible assumption that your date is in the same TZ as your server.
All of them are parsed differently and will return different dates in JavaScript.
That's one thing I don't understand with .NET Core. You deploy the whole .NET framework along with your code, which then becomes static. And this is targeted primarily at web applications.
What about security vulnerabilities? Unless the developers redeploy the application, we will be left with an unpatched .NET stack and an unpatched web server?
How is that a good idea? In an ideal world there would be an active dev team behind each website, busy redeploying and patching every day. But people who think we live in that world haven't really followed the pretty much constant stream of news about major breaches caused by unpatched versions of software being used everywhere.
My dotnet core containers are rebuilt from scratch and redeployed every 20 minutes or so. If there's a security bug in the base OS layer of my container or in the .NET runtime, I'll have it in less than a half hour.
Not too hard, if you have some standard practices in place.
I always enjoy wondering what sort of person downvotes a comment like this. I can't help but feel it's someone with shame/guilt for not having CI/CD in place.
But yeah, if that's too much to ask for, just don't ship the framework in your project. You can still have it installed as a system package. But to be frank, it's nearly the same problem, just in a different spot.
The .net framework of the OS gets updated with Windows Update automatically. It is not the same problem at all.
I obviously didn't downvote but I do understand the downvotes: it is unrealistic to expect every website to be actively maintained forever. I am sure he deploys a new version every 20 minutes of his current project. I would be curious to know how many versions a year he deploys of the projects on which he worked 5 or 7 years ago and from which he moved on.
The world is filled with legacy applications, libraries and websites. Pretending that the code we write today will always be actively maintained and supported is just unrealistic.
Probably not a good idea to update the framework without the application knowing. This could cause bugs and unspecified behavior. It's best to test your application against new versions and then deploy a new version of the app.
The .NET Framework is pretty much always 100% backward compatible. Again, not a problem if you have an active team maintaining the code. But can you even count how many dead applications and websites you have encountered in your career? At least if the OS gets updated, the application benefits from security patches.
Same here, NuGet has been a huge pain in this. It would have been nice to have a ready-made msbuild script that produced Web Deploy packages (we release with on-premise TFS RM), but we were able to make one manually. I expect .NET Core to have better integration with TFS soon (if not already in the next release).
The .csproj msbuild file is here to stay? Really? Why are we having to fit programming a build system around an archaic declarative XML language? The .csproj is the single worst point of the Visual Studio experience: terrible format, terrible editing experience, having to reload, having to copy-paste GUIDs, manually looking up .targets files... ugh, the whole thing is horrible.
Coming from a Linux/macOS background, this article is really hard to read. So many .NET buzzword things: MSBuild, project.json, csproj, sln...
I'd really appreciate if they streamlined how to get started with C# cross platform.
Those, except maybe project.json, aren't buzzwords by the typical definition: they have existed for 10+ years and are the names of a build system and a bunch of project files readable by that build system.