Commit ae3ea9f: Fix some asciidoc issues
ericbottard committed Jul 28, 2017
1 parent 5cffe8b commit ae3ea9f
Showing 5 changed files with 6 additions and 11 deletions.
2 changes: 0 additions & 2 deletions spring-cloud-dataflow-docs/src/main/asciidoc/.gitignore

This file was deleted.

@@ -2,8 +2,6 @@
 [[howto]]
 == '`How-to`' guides
 
-[partintro]
---
 This section provides answers to some common '`how do I do that...`' type of questions
 that often arise when using Spring Cloud Data Flow.
@@ -14,7 +12,6 @@ the `spring-cloud-dataflow` tag).
 
 We're also more than happy to extend this section; If you want to add a '`how-to`' you
 can send us a {github-code}[pull request].
---
 
=== Configure Maven Properties

@@ -357,4 +354,4 @@ publish the payload to Apache Kafka _(i.e., identified by `kafka1`)_, we are sup
 and `output` channel settings respectively.
 
 NOTE: The queue `fooRabbit` in RabbitMQ is where the stream is consuming events from and the topic
-`barKafka` in Apache Kafka is where the data is finally landing.
\ No newline at end of file
+`barKafka` in Apache Kafka is where the data is finally landing.
@@ -1,5 +1,6 @@
 [[configuration]]
 = Server Configuration
 
+[partintro]
 --
 In this section you will learn how to configure Spring Cloud Data Flow server's features such as the relational database to use and security.
3 changes: 1 addition & 2 deletions spring-cloud-dataflow-docs/src/main/asciidoc/streams.adoc
@@ -2,7 +2,6 @@
 = Streams
 
 [partintro]
-
 --
 This section goes into more detail about how you can create Streams which are a collection of
 http://cloud.spring.io/spring-cloud-stream/[Spring Cloud Stream]. It covers topics such as
@@ -1040,4 +1039,4 @@ and the stream will then funnel the data from the http source to the output log
 2016-06-01 09:50:26.810 INFO 79654 --- [ kafka-binder-] log.sink : goodbye
 ```
 
-Of course, we could also change the sink implementation. You could pipe the output to a file (`file`), to hadoop (`hdfs`) or to any of the other sink apps which are available. You can also define your own apps.
\ No newline at end of file
+Of course, we could also change the sink implementation. You could pipe the output to a file (`file`), to hadoop (`hdfs`) or to any of the other sink apps which are available. You can also define your own apps.
6 changes: 3 additions & 3 deletions spring-cloud-dataflow-docs/src/main/asciidoc/tasks.adoc
@@ -29,9 +29,9 @@ typical lifecycle for tasks in the context of Spring Cloud Data Flow:
 1. Creating a Task Application
 2. Registering a Task Application
 3. Creating a Task Definition
-3. Launching a Task
-4. Reviewing Task Executions
-5. Destroying a Task Definition
+4. Launching a Task
+5. Reviewing Task Executions
+6. Destroying a Task Definition
 
 === Creating a Task Application
 While Spring Cloud Task does provide a number of out of the box applications (via the
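Taken together, the `[partintro]` hunks above suggest one AsciiDoc rule being enforced: a block style attribute such as `[partintro]` must sit on the line immediately above the `--` open-block delimiter it applies to (no blank line in between), and the block needs a matching closing `--`. A minimal sketch of the corrected pattern, with illustrative placeholder text rather than content from the repository:

```
[[streams]]
= Streams

[partintro]
--
Introductory text for this part of the document goes here.
--
```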
