Why step into a martial arts ring if you don’t want to fight? My personal relationship to this question involves continuous internal cultivation. It is easy to speak of nonviolence when I am in a flower garden. The real internal challenge is to maintain that fundamental perspective when confronted by hostility, aggression and pain.

The Art of Learning, Josh Waitzkin

The amount of energy necessary to refute bullshit is an order of magnitude bigger than to produce it.

Alberto Brandolini

In your prayers, substitute “protect us from evil” with “protect us from those who ‘improve’ things for a salary”.

Nassim N. Taleb @nntaleb

Say hi to LogFlow

LogFlow has reached a point where it is ready for public use. There is now a NuGet package and getting started documentation on GitHub.

LogFlow on GitHub

	PM> Install-Package LogFlow

What is LogFlow? It is a log management service, written in C# on .NET, that reads, transforms and stores logs. It is a replacement for LogStash if you are running on Windows, and is mainly built for reading logs from files, transforming them into JSON and then storing the data in ElasticSearch, where it can later be viewed with Kibana. LogFlow is a plugin-based system that revolves around Flows, an internal DSL that looks like this:

public class MyLogFlow : Flow
{
    public MyLogFlow()
    {
        CreateProcess("InsteadOfClassName")
            // read log lines from files under C:\MyLogPath
            .FromInput(new FileInput("C:\\MyLogPath", Encoding.UTF8, true))
            // custom plugin that parses each log line into JSON
            .Then(new MyLogLineParser())
            // store the result in ElasticSearch
            .ToOutput(new ElasticSearchOutput(new ElasticSearchConfiguration()
            {
                Host = "localhost",
                Port = 9200,
                IndexNameFormat = @"\m\y\L\o\g\s\-yyyyMM" // new index each month
            }));
    }
}

You can read more on the GitHub page. Yes, it is running in production today. The near future of LogFlow will be focused on better indexing capabilities in ElasticSearch and on reporting capabilities, so you know when a flow is broken.

Restore database with name and file

We’re doing an automated restore for running integration tests in a project I’m working on. To make the script easy to use, we only wanted to have to specify the database name and the path to the backup file.

Here is the script we ended up with; just replace MyDatabase and C:\backups\MyDatabase.bak.

IF EXISTS (SELECT * FROM sys.databases WHERE NAME = 'MyDatabase')
BEGIN
    -- Kick out any open connections so the restore can take exclusive access
    ALTER DATABASE MyDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
END

DECLARE @logicalName nvarchar(128)
DECLARE @logLogicalName nvarchar(128)

-- Table shaped like the result set of RESTORE FILELISTONLY
-- (the exact column list varies by SQL Server version)
DECLARE @fileListTable TABLE
(
    LogicalName nvarchar(128),
    PhysicalName nvarchar(260),
    [Type] char(1),
    FileGroupName nvarchar(128),
    Size numeric(20,0),
    MaxSize numeric(20,0),
    FileID bigint,
    CreateLSN numeric(25,0),
    DropLSN numeric(25,0),
    UniqueID uniqueidentifier,
    ReadOnlyLSN numeric(25,0),
    ReadWriteLSN numeric(25,0),
    BackupSizeInBytes bigint,
    SourceBlockSize int,
    FileGroupID int,
    LogGroupGUID uniqueidentifier,
    DifferentialBaseLSN numeric(25,0),
    DifferentialBaseGUID uniqueidentifier,
    IsReadOnly bit,
    IsPresent bit,
    TDEThumbprint varbinary(32)
)

-- Read the logical file names from the backup
INSERT INTO @fileListTable EXEC('RESTORE FILELISTONLY FROM DISK = N''C:\backups\MyDatabase.bak''')

SELECT @logLogicalName = LogicalName FROM @fileListTable WHERE [Type] = 'L' -- log file
SELECT @logicalName = LogicalName FROM @fileListTable WHERE [Type] = 'D'    -- data file

-- Build and run the restore, moving the files to a local data folder
DECLARE @query varchar(max)
SELECT @query =
'RESTORE DATABASE [MyDatabase]
FROM DISK = N''C:\backups\MyDatabase.bak'' WITH FILE = 1,
MOVE N''' + @logicalName + ''' TO N''C:\localdata\MyDatabase.mdf'',
MOVE N''' + @logLogicalName + ''' TO N''C:\localdata\MyDatabase.ldf'',
NOUNLOAD,
REPLACE,
STATS = 10'

EXEC (@query)

-- Allow connections again once the restore is done
ALTER DATABASE MyDatabase SET MULTI_USER
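
As a small follow-up: if the script is run through sqlcmd (or SSMS in SQLCMD mode), the database name and backup path can be pulled out as scripting variables instead of doing a search-and-replace. This is just a sketch of that idea, with variable names I made up:

:setvar DatabaseName MyDatabase
:setvar BackupFile C:\backups\MyDatabase.bak

IF EXISTS (SELECT * FROM sys.databases WHERE NAME = '$(DatabaseName)')
BEGIN
    ALTER DATABASE [$(DatabaseName)] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
END
-- ...the rest of the script then uses $(DatabaseName) and $(BackupFile)
-- everywhere MyDatabase and C:\backups\MyDatabase.bak appear above.

The same variables can also be set from the command line with sqlcmd's -v switch, which fits nicely when the restore runs as part of an automated integration test setup.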