BlogRush - Is It Worth It?

posted on 10/09/07 at 10:53:33 pm by Joel Ross

A few weeks ago, I found the ambition to update my blog skin - minor tweaks, really - but one of the things I added was a BlogRush widget (FYI, that link includes my referral code). If you're reading this in a feed reader and curious, click through to see it.

BlogRush is kind of like AdSense in that it displays content that should be of interest to people reading my blog. A lot of the links in the widget are things I'd be interested in, so it definitely works. The reason I question whether it's worth it is that since about the day I signed up, they've been slammed with new sign-ups, and I haven't been able to see much about what I'm contributing or what I'm getting in return. I know my web traffic has been going up by roughly 5-10% per week over the past few weeks, and subscribers are up in the past month as well, but I can't be sure whether that's BlogRush or just that I've been posting more lately - which tends to increase my traffic anyway.

For example, today it says I have 101 credits - and 813 for the past 7 days. But there aren't many details about what that actually gets me. And the reports have said there's not enough info yet, so they don't help either. They're supposed to be rolling out new features and better hardware, so I think I'll wait it out, but if that doesn't happen soon, or I don't get a good feeling about what I'm getting for my contributions, I may have to bail.

Categories: Blogging, Software


 

07-08 Week 5 NFL Pick Results

posted on 10/09/07 at 10:32:05 pm by Joel Ross

I had a pretty good week this week. I was 12-2 picking games, which ties the best I've ever done in a 14-game week - oddly enough, also week 5 last year. Maybe I'm finally starting to get a feel for which teams are going to be good this year? Or, more likely, maybe I just got lucky. I still didn't do much against the spread, and I didn't do all that well on the over/under either. So I know who'll win - I just have no clue by how much, or how much scoring will happen!

Last week for my "lock solid" picks, I made 4 safe picks - New York Giants, Indianapolis, Baltimore and Dallas. Dallas pulled it out at the last minute, so I ended up 4-0! That brings me to 15-5 for the season. Based on those games, I was up $13.75, getting me back in the black at $2.55.

  • Detroit 3, Washington 34 (-3) (46.5 O/U) [P: $5.26, S: $10.00, O/U: $9.09, T: $24.35]
  • Cleveland* 17, New England 34 (-17) (48 O/U) [P: $0.63, S: $0.00, O/U: $9.09, T: $9.72]
  • Seattle 0, Pittsburgh 21 (-5.5) (39.5 O/U) [P: $4.08, S: $10.00, O/U: ($10.00), T: $4.08]
  • Arizona 34 (-3.5), St. Louis 31 (41 O/U) [P: $5.41, S: ($10.00), O/U: ($10.00), T: ($14.59)]
  • Jacksonville 17 (-1), Kansas City 7 (36 O/U) [P: $7.41, S: $10.00, O/U: ($10.00), T: $7.41]
  • Carolina 16, New Orleans 13 (-3.5) (44.5 O/U) [P: $16.00, S: $10.00, O/U: $9.09, T: $35.09]
  • New York Jets 24, New York Giants 35 (-3) (41 O/U) [P: $5.26, S: $10.00, O/U: $9.09, T: $24.35]
  • Miami 19, Houston 22 (-6.5) (43.5 O/U) [P: $4.08, S: ($10.00), O/U: $9.09, T: $3.17]
  • Atlanta 13, Tennessee 20 (-8.5) (40.5 O/U) [P: $2.38, S: ($10.00), O/U: ($10.00), T: ($17.62)]
  • Tampa Bay 14, Indianapolis 33 (-10) (46.5 O/U) [P: $1.82, S: $10.00, O/U: $9.09, T: $20.91]
  • Baltimore 9 (-3), San Francisco 7 (35.5 O/U) [P: $5.00, S: ($10.00), O/U: ($10.00), T: ($15.00)]
  • San Diego 41, Denver 3 (0) (42.5 O/U) [P: ($10.00), S: ($10.00), O/U: ($10.00), T: ($30.00)]
  • Chicago 27, Green Bay 20 (-3.5) (41 O/U) [P: ($10.00), S: ($10.00), O/U: ($10.00), T: ($30.00)]
  • Dallas 25 (-11), Buffalo 24 (45 O/U) [P: $1.67, S: ($10.00), O/U: $9.09, T: $0.76]

Results Summary

  • Picks (this week): 12 - 2 (85.71%) - Winnings: $38.99
  • Picks (season): 47 - 29 (61.84%) - Winnings: ($57.26)
  • Spread (this week): 6 - 7 (46.15%) - Winnings: ($10.00)
  • Spread (season): 30 - 39 (43.48%) - Winnings: ($90.00)
  • Over/Under (this week): 7 - 7 (50.00%) - Winnings: ($6.36)
  • Over/Under (season): 33 - 40 (45.21%) - Winnings: ($100.00)
  • Total Weekly Winnings: $22.63
  • Total Overall Winnings: ($247.26)

See you in a few days with a whole new set of picks!


Categories: Football


 

Decreasing Developer Ramp Up Time

posted on 10/09/07 at 09:50:16 pm by Joel Ross

Dave Donaldson has a great post about ways to decrease the amount of time it takes to get a developer up to speed on a new project. The post has been up for almost two months now, but I'm just getting around to writing about it.

Anyway, he lists out a few tips that are all great:

  • Explain the problem
  • Review the architecture
  • Maintain a Wiki
  • Provide good equipment
  • Get latest and run it
  • Run tests
  • Pair up

This works great for an Agile approach to development, which isn't surprising coming from Dave. If you're doing what you should be on your project, none of these should be a big deal, and getting a developer up to speed shouldn't be that hard.

There's one other thing I'd add to the list, though it's mainly because of the way I do development: a Virtual PC disk that can be copied and handed to the new developer. It's not a substitute for documenting all of your tools and source code and keeping them readily available, but if you have a brand new developer with a brand new machine, all they need is an OS and Virtual PC, and they're ready to go. You can pre-configure all of the tools, source control access, and the initial source code. I create a virtual disk for myself anyway, so I'd just copy it off once it's ready - and then any new developer coming onto the project no longer has to install and configure their environment, which could save a day or more, depending on the requirements.

Anyway, go read the rest of the post. He's got more details for each bullet.

Categories: Consulting, Development


 

Note To Self: Don't Delete FeedStation's Settings File While It's Running!

posted on 10/09/07 at 08:48:16 pm by Joel Ross

The Wife is gone tonight, so one of the things I did after The Girls went to bed was finish off the last podcast in my queue - emptying it for the first time since before The Boy was born two months ago. Before he was born, I was pretty good about clearing my queue on a regular (almost daily) basis, but I went three weeks without listening to any podcasts, so I had a bit of a backlog.

Feeling pretty good, I went into my podcast directory and cleaned it out. There were several podcast folders in there for shows I had either unsubscribed from or that had "podfaded" in the past few months. Getting overzealous, I selected everything in the folder and deleted it. Then, to clean up, I emptied the recycle bin.

A little later, a new podcast came through (no more empty queue!), but FeedStation gave me an error. It had never done that before! It's one of those programs that "just works," and somehow I broke it! I shut it down and restarted it, and all was good. Then I checked the podcast folder and realized what I'd done. FeedStation creates a Settings folder and puts a file called FeedStationHistory.fsqueue in that directory. I had deleted that file while FeedStation was running, and it apparently didn't like that.

And now there are two new podcasts in the queue. I guess nothing lasts forever!

Categories: Podcasting, Software


 

Backing Up SQL Express Databases On The Fly

posted on 10/08/07 at 11:49:19 pm by Joel Ross

On a recent project, we created a stand-alone application that's primarily meant for one person to use. Since it's a database-driven application, we wanted to give the user an easy way to back up their database. Sounds simple, right? SQL Express databases are just .mdf files - simply copy the file to a new location, and it's backed up, right? In theory, yes. In practice, it's not quite that simple. We connect to our database in a user instance and auto-attach it, meaning that when our connection is closed, the database is automatically detached for us - nice and clean, with no extra code for us to worry about.

That is, unless you want to back that file up. SQL Server does some things for you that are, under most circumstances, very helpful. Connection pooling, for example: you close a connection, but the connection isn't really closed - it's thrown back into the pool to be grabbed later, which definitely improves performance. Luckily, there's a static method on the SqlConnection class that will clear the connections in the pool: SqlConnection.ClearAllPools().

Great! Now we can copy the file, right? Nope - it's still in use. As I said earlier, SQL Server Express is kind enough to automatically attach and detach our database for us, but what isn't obvious (though it's explained here, near the bottom) is that it can take 8-10 minutes for the database to actually be detached - meaning you can't copy the file until then.

To finish this up, then: clear the pool, detach the database, and back up the file. That's the gist of it, but the details are still gnarly at best. Here's the code we used to get the job done:

string baseConnectionString = "Data Source=.\\SQLExpress;Integrated Security=SSPI;User Instance=true;";

SqlConnection.ClearAllPools();

using (SqlHelper helper = new SqlHelper(baseConnectionString))
{
    string commandText = @"
        declare @count int;
        select @count = count([name]) from sys.databases where [name] = @DBName;

        if @count = 1
        begin
            exec sp_detach_db @DBName;
        end";

    List<SqlParameter> parameters = new List<SqlParameter>();
    parameters.Add(new SqlParameter("@DBName", databaseFile));
    helper.Execute(commandText, CommandType.Text, ref parameters);
}


A few things to note here:

  1. When a database is auto-attached to SQL Express, its name is the file name - full path and all. As an aside, since it's a user instance, you can't see much of what's going on in SQL Management Studio - the list of attached databases doesn't show auto-attached databases, since they're actually attached in a separate user instance.
  2. You can't assume the database is still attached. The article above says it's released in 8 to 10 minutes, but in my experience that's pretty random - I saw times anywhere from one minute up to 10 minutes. Bottom line: check whether your file is listed in sys.databases, and only try to detach it if it is.
  3. This is probably the most important point for getting this to work. Our connection string doesn't specify a database - just the server information. That way, you can detach the database, and your current connection can be held in the (cleared and now rebuilding) connection pool without holding onto the file reference.

Once you've done this, you can just copy the database file to a backup location:

System.IO.FileInfo file = new System.IO.FileInfo(databaseFile);
file.CopyTo(destinationFile, true);

file = new System.IO.FileInfo(databaseFile.Replace(".mdf", "_log.ldf"));
file.CopyTo(destinationFile.Replace(".mdf", "_log.ldf"), true);


We back up both the .mdf file and the log file - hence the two CopyTo calls above.

This is a nice little way for us to back up databases. We also embedded our database in our application, so it can be created on the fly. It worked out well for us, so if you ever need to back up SQL Express databases on the fly, this is how you can do it.
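
As a footnote on the embedding bit: here's a minimal sketch of what extracting an embedded .mdf on first run could look like. This isn't the exact code we used - the resource name ("MyApp.Data.mdf") and the class around it are hypothetical, so adjust them to your own project.

using System;
using System.IO;
using System.Reflection;

public static class DatabaseInstaller
{
    // Hypothetical sketch: copy an .mdf embedded as an assembly resource
    // out to disk the first time the application runs.
    public static void EnsureDatabaseFile(string databaseFile)
    {
        if (File.Exists(databaseFile))
        {
            return; // already extracted on a previous run
        }

        Assembly assembly = Assembly.GetExecutingAssembly();
        using (Stream resource = assembly.GetManifestResourceStream("MyApp.Data.mdf"))
        {
            if (resource == null)
            {
                throw new InvalidOperationException("Embedded database resource not found.");
            }

            using (FileStream target = File.Create(databaseFile))
            {
                // Manual buffered copy (Stream.CopyTo didn't exist before .NET 4).
                byte[] buffer = new byte[8192];
                int bytesRead;
                while ((bytesRead = resource.Read(buffer, 0, buffer.Length)) > 0)
                {
                    target.Write(buffer, 0, bytesRead);
                }
            }
        }
    }
}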

Categories: Development, C#


 

Announcing the Release of The NuSoft Framework 2.0

posted on 10/08/07 at 10:20:01 pm by Joel Ross

The team at NuSoft Solutions has been working hard to get ready for the release of the first public version of the NuSoft Framework, and today, we pulled the trigger - NuSoft Framework 2.0 has been released!

The documentation on the site is a little light, but should be enough to get you started. We've tested it fairly thoroughly with CodeSmith 4.1.2, but it should work with version 3.2 or later.

Some of the new features we added (since 1.0, which was an internal release to NuSoft) are:

  • Support for tables without primary keys. These are generated as read-only entities (and as part of this, we added general support for read-only entities). In the future, we could use this for lookup tables, which shouldn't really be modified, just referenced.
  • Control naming of your entities, entity collections, and properties. We figure out names automatically for you, but if you don't like the names we come up with, you can use database extended properties to control the names.
  • Exposed a few previously internal methods to return primary key property information.
  • Changed our lists to inherit from BindingList for better support for data binding. We also added Find<>, FindAll<>, ProxyAs<>, etc. methods on the list, to make working with them a little easier.
  • Added a few more events to hook into the framework a little easier.
  • Added a few more partial classes to allow you to extend the framework.

Those are the highlights. I'm really excited about this, and will be upgrading a few projects I have using the NuSoft Framework over the next few days, as well as working on the site's documentation.

If you head over there and download the release, let me know what your thoughts are.

The others involved with this release were Rick Krause, Mark Jordan, Brian Anderson and Aaron K. (no blog - yet!).

Categories: Development, C#, RCM Technologies


 

Is vs. As Performance

posted on 10/08/07 at 12:34:22 am by Joel Ross

Sometimes it has to be completely laid out for me to finally "get it". That's what happened tonight when I was reading Adel Khalil's blog post about the performance of "is" vs. "as".

When you read the post, it's obvious what's happening, but when I've used "is" in the past, I never really thought about it. I think the reason I've picked one over the other is that you're taught the proper way to develop is to check things before you use them. If you use an object, check that it's not null. If you take parameters, check them before you use them.

So it seemed natural that if you're casting from one type to another, you'd check that the object is of that type first, right? Not in this case - it's more performant (and just as safe) to do the cast with "as" and then check the result for null.
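
To make that concrete, here's a quick sketch of the two patterns - Widget, DoSomething, and obj are just stand-ins:

// Pattern 1: "is" then cast - the runtime performs the type check twice,
// once for the "is" test and again for the cast.
if (obj is Widget)
{
    Widget widget = (Widget)obj;
    widget.DoSomething();
}

// Pattern 2: "as" then null check - a single type check, same safety.
Widget w = obj as Widget;
if (w != null)
{
    w.DoSomething();
}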

I'll have to remember that in the future.

Categories: Development, C#


 

MVC For .NET

posted on 10/08/07 at 12:33:17 am by Joel Ross

At the ALT.NET conference, Scott Guthrie announced that Microsoft is going to make an MVC framework available for ASP.NET. Since I wasn't there, I'm not sure exactly what that looks like, but the blogosphere is abuzz with little tidbits. It sounds like it should be an interesting implementation, and I wonder how many ideas were borrowed from the Web Client Software Factory. There's a full list at Jeffrey Palermo's blog, but here are a few highlights:

  • Provide ASPX without viewstate or postbacks for the views.
  • Provide hooks for your own views
  • Inherent support for IoC for container creation and DI on the controller (presumably for the view).
  • URL and Navigation control

There are others - go read Jeffrey's post for the full list. I picked these because they're the most interesting to me. First, ASPX without postbacks: I guess I'll have to see what that means - maybe it just means no postback-specific methods? If you can hook up your own view, could you (theoretically - I'm not saying this is the best way, or even logical) hook up a Windows UI? What about web services, meaning your ultimate UI could be just about anything - or even unknown to the original developer? I've looked at the WCSF's implementation of IoC, and it's pretty slick (it uses the ObjectBuilder DI framework). Again, I wonder if any of that is used here. The URL processing looks different as well - you don't have pages in a particular place; instead, the way the URL comes through determines which controller and action to use.

All in all, it looks interesting, but considering it's beyond the Orcas release, it's still out there a ways. Not something we'll be playing with soon, at least.

Categories: ASP.NET, Development, C#


 

.NET 3.5 Source Code - Good or Bad?

posted on 10/07/07 at 10:08:56 pm by Joel Ross

I was sitting in a lunch meeting when I heard the announcement that the .NET Framework was going open source - which isn't quite true. It's being released under a reference license, meaning you can't contribute to it, but you can see what's going on. I was excited - not so much because I want to see the source; I can already do that (legally or not) using Reflector, a tool most developers already know about. No, I was excited because I'd be able to step into framework code when needed. That need doesn't arise often, but there have been times I've opened Reflector to see what's happening, and "stepping into" framework code manually is a painful process - being able to do it live would be nice.

Everywhere you look, people are praising Microsoft - the one big knock being that it's not really open source, since community developers can't contribute fixes or features back to the framework team. Other than that, it's all been positive.

That is, until I read Frans Bouma's take. The title says it all: "Don't look at the sourcecode of .NET licensed under the 'Reference license'". His reason: software patents. You're potentially looking at code that is patented, and if you build something similar, you could be held liable. He gives a good example involving the ReaderWriterLockSlim class that's been added to .NET 3.5. It's patented and has issues. If you write your own and have looked at the source, you could be liable for creating derivative work from patented code.

Now, you could say that Frans is being a little paranoid, but he does bring up some good points. I'm not sure how I feel about it personally, but it's definitely something to keep in mind, especially if you are a framework developer, like Frans is (and, well, so am I).

Categories: Development, C#


 

Encapsulating Fields - A Reason To Do It

posted on 10/04/07 at 09:28:00 pm by Joel Ross

Last night, I posted a question, "Why do you encapsulate fields?" asking why you should bother encapsulating private fields. I couldn't think of any solid reason to do it, but this morning, Matt Blodgett, who now works with me at NuSoft Solutions, pointed me to a Jeff Atwood post questioning the same thing. Jeff initially says you shouldn't write useless code, but then later updates it with reasons it's not useless to encapsulate fields:

  • Reflection
  • Databinding
  • Changing from fields to properties is a breaking change.

He links to an Abhinaba Basu post explaining that if you create a class with public fields and distribute it, then later change it to use properties and redistribute it, you've broken the interface - even though that's not at all obvious. All clients have to recompile against the updated DLL even though no code changes are involved.

The other two bullets are important too. If you have a mixture of public fields (because they have no "need" to be properties) and properties (because they do something else when getting or setting), reflection isn't as simple - and changing from a field to a property could break your reflection code. Also, you can only databind to properties - the same way a web service proxy doesn't propagate fields.
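
To illustrate the breaking-change point with a made-up example: the two snippets below are meant to be successive versions of the same class (named V1/V2 here only so the sketch compiles). They look interchangeable in source, but a field compiles to a direct field access while a property compiles to get_Name()/set_Name() method calls, so clients built against the first version fail against the second until they're recompiled.

// Version 1, as originally shipped: a public field. Client code
// compiles to a direct field reference.
public class CustomerV1
{
    public string Name;
}

// Version 2: the "same" member, now a property. At the IL level this
// is a pair of accessor methods, not a field, so it's a binary
// breaking change for anyone compiled against version 1.
public class CustomerV2
{
    private string _name;

    public string Name
    {
        get { return _name; }
        set { _name = value; }
    }
}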

At least I have a reason why I do what I do now!

Categories: Development, C#


 
